
Data Developer

Herndon, Virginia, 20170, United States
May 29, 2018



Amol 703-***-****


Around 10 years of IT experience with a strong background in database development, data model design, and staging.

Experience in gathering business requirements and participating in JAD sessions.

Expert knowledge of the Software Development Life Cycle (SDLC), with hands-on experience in all phases.

Experienced in data warehouse and data integration project methodologies.

Good experience in strategizing and implementing staging for ODS and data marts.

Excellent at data quality assurance; designed and implemented data governance, data cleansing, and data mining processes.

Experience in ETL data mapping, ETL design, and coordination; created ETL specification documents and supported data migration from relational database sources to Oracle.

Experience providing support to application teams in all environments (Dev, Test, Acceptance, Prod, and Contingency).

Assisted the change/release management team with enterprise operational changes.

Experience in a project lead role for major and minor enterprise projects.

Experience supporting 200+ applications for an enterprise data warehouse.

Good at analyzing data sources, brainstorming business intelligence, dimensional data modeling, logical/physical design, and building entity-relationship diagrams.

Strong development experience building database objects such as packages, procedures, tables, triggers, views, and materialized views.

Developed Autosys JILs to automate required reports to run at requested times.

Good in UNIX shell scripting.

Good experience tuning long-running SQL queries to improve performance.

Experience using SQL*Plus, SQL*Loader, EXPLAIN PLAN, TKPROF, and TOAD.

Good documentation skills; delivered both functional and technical documentation.

Good hands-on MS Excel experience, working with all of its functions.

Strong team member with good communication and analytical skills.





Database Specialties

Data Analysis, Enterprise Data Warehouse, Database Design and Modeling, Data Integration and Migration, ETL Architecture and Design, Data Warehouse, OLTP, OLAP, ODS, Data Marts, Report Design and Development.

Data Analysis

User Requirement Gathering, JAD Sessions, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis.

Data Modeling / ETL Tools

Erwin 7x, Toad, SQL Developer, Informatica, CloverETL, Talend, Ab Initio GDE 1.15/3.2, MS Visio 2010.

Programming Languages

Oracle PL/SQL, MS SQL, MySQL

Databases

Oracle (9i, 10g, 11g), SQL Server, MS Access, Teradata.

Web technologies


Scripting Languages

UNIX Shell Script.

Web Servers

IIS 6.0, Apache Tomcat 5.5

Operating Systems

Windows 95/98/2000/XP/NT/Vista, Windows Advanced Server 2003, Fedora 12, and VAX/VMS 5.2


College Board (Nov 2015 – Present)

Sr ETL Developer (Informatica)

State & District Bulk Registration/ Online Assessment (Digital Pilot Launch)


Coordinated with Program and Business teams to provide input on architecture around the Registration and Fulfillment jobs for Enterprise Bulk Registration system.

Created Design workflow diagrams and documentation for Design/Operations/Production maintenance.

Used Informatica PowerCenter to study the initial setup of the mappings and workflows for EBR programs. Worked with the team to modify and update the mappings and transformation logic to fit the new business requirements.

Designed and developed workflows in Informatica for the Online Assessment project for the Digital Pilot.

Used Informatica Workflow Monitor to kick off jobs and monitor their statuses.

Debugged Informatica workflows in currently running processes to resolve performance issues.

Later moved to Appian for job monitoring, and have experience executing jobs via Appian.

Work closely with CSR/Program/DBA teams to investigate and suggest solutions/workarounds for various SAT/PSAT/EBR/TOS issues. Currently doing ad hoc data analysis for the business to outline the scope for future releases and building new workflows and mappings as needed.

Involved in understanding, debugging, and making changes to existing SQL procedures to fix problems with the Online Assessment project.

Work on various ad hoc time-sensitive data requests. Assisted the Program team with early-commit requirements for generating duplicate SAT admission tickets for the Oct admin.

Collaborate with team members and other teams on various day-to-day tasks.

Get a holistic understanding of various EBRE release changes and work actively with Business team in preparation for future scope and ETL changes.

Environment: Informatica PowerCenter (Workflow Designer, Workflow Monitor), Appian, DBeaver, Toad, Oracle 11g, AWS, HP ITSM, ClearCase, ClearQuest.

Fannie Mae (BSLM) (Jun 2013 – Oct 2015)

ETL - Ab Initio Consultant/ Data & SDLC Analyst

BSLM SLA Management: Business Service Level Management (BSLM) is a Fannie Mae proprietary product to track and administer service level agreements (SLAs) between different teams. It replaced the existing manual handoff and is used to generate and maintain SLA documents. It has a built-in approval workflow; the backend data is in an Oracle DB and is governed through Ab Initio.

Interface Management & Remedy Sync: Asset- and interface-level information was synced from the CMDB Remedy Oracle database into the BSLM Oracle DB. The execution statuses/times of all BSLM-registered jobs running at Fannie Mae are tracked via this tool, and the application team and the downstream consumers of the interface are auto-notified in case of delay/failure. It has a built-in forecast engine to translate the dates and perform the actual loading of execution data.


Worked closely with Business team in the requirements phase for both Interface/SLA Management projects. Suggested upfront alternatives for non-feasible approaches.

Interacted with Business and DBA teams to build the Data Model for BSLM. Made further enhancements to the model for SLA management project.

Created Architecture and Design workflow diagrams and documentation for Design/Operations/Production maintenance.

Worked on the forecast engine logic in Ab Initio to translate custom dates/calendar information to real-time dates. This also involved building ETL logic to cater to various business requirements related to SLA measurement and to identify/notify customers in case of job delay/failure.

The Remedy Sync project involved complex business challenges while syncing interface-level data from CMDB to BSLM; legacy data had to be dealt with carefully.

In parallel with the process development efforts, was also actively involved in various ad hoc requests from the business, ranging from researching the logic of specific fields or processes to creating additional reconciliation reports for analysis using Ab Initio and shell scripts.

Worked on Tableau Metadata tables and SQL implementation.

Developed, maintained, and managed advanced reporting, analytics, dashboards, and other BI solutions using Tableau.

Worked on various backend SQL queries for various business-specific SLA management reconciliation reports.

Coordinated with upstream and downstream workflows and Test/UAT teams as part of the monthly/quarterly release efforts for each project.

Developed UNIX ksh scripts for wrappers and SQL execution scripts in data-loading DBMOD processes.

Involved in making small-scale enhancements to existing Business Objects reports.

Used ClearCase to check-in files and code. Resolved system test defect tickets in ClearQuest.
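As an illustrative sketch only (the actual DBMOD wrapper scripts are proprietary and not shown in this resume), a ksh-style SQL execution wrapper of the kind described above typically builds the SQL*Plus invocation, runs the script, and propagates the exit status. All names, paths, and the `ORA_CONN`/`DRY_RUN` variables below are hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch of a ksh-style SQL execution wrapper, not the
# actual production scripts. Builds the SQL*Plus command for a given
# .sql script; in dry-run mode it prints the command instead of
# executing it, while a real wrapper would run it and check the status.

build_sqlplus_cmd() {
    # $1 = path to the .sql script; the connect string comes from the
    # environment (ORA_CONN), never hard-coded in the script itself.
    printf 'sqlplus -S "%s" @%s\n' "${ORA_CONN:-/nolog}" "$1"
}

run_sql() {
    script="$1"
    if [ ! -f "$script" ]; then
        echo "ERROR: SQL script not found: $script" >&2
        return 1
    fi
    cmd=$(build_sqlplus_cmd "$script")
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "DRY RUN: $cmd"       # safe default for illustration
    else
        eval "$cmd" || return $?   # propagate sqlplus failures
    fi
}
```

A production wrapper would additionally redirect spool/log output and scan the log for ORA- errors before declaring success; those details are omitted here.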

Environment: Ab Initio GDE 1.15/1.14, Co-op 2.15/2.14/3.2, UNIX shell scripting, SQL, Oracle 11g/10g/9i, Sun Solaris/AIX, Autosys, Visio, Business Objects, ClearCase, ClearQuest, SQL Developer/Toad.

IBM (GSA – SAM.GOV), VA (Oct 2012 – May 2013)

Data Analyst / ETL Developer

Application/data integration for the General Services Administration (Fed Team)'s four departments: CCR, EPLS, FedReg, and ORCA. The platform is the award system for business entities having any kind of dealings with the US federal government, and maintains NPA/non-NPA records.


Understood the business processes, related application functionality to the business processes, and devised solutions to address the problems.

Translated the functional document into a technical document with the necessary mapping requirements, and estimated the conversion time.

Designed, developed, and implemented the physical design in the existing database using reverse engineering, and verified the naming standards using the Abbreviation Tool.

Designed a metadata mapping document for application to repository, and developed metadata for data migration from multiple legacy sources to the current database.

Designed, developed, and implemented the hierarchical relational model for entities, and designed and implemented a relational model for monitoring the interfaces (daily/monthly/yearly/5-yearly Clover jobs).

Created CloverETL graphs and processes that reconciled the requirements with the data model and DBA concerns.

Designed, developed, and deployed the CloverETL graphs for CCR and FedReg relations for the data integration task.

Provided an efficient interface across various platforms for the data integration and migration processes of all four departments.

Profiled, monitored, and tuned CloverETL objects; also administered and maintained the jobs as well as server-related work.

Worked with the DBA in defining the structure of objects and their optimization benefits to ensure better performance.

Worked on performance tuning of the procedures.

Provided 24/7 on-call production support for any issues related to the project.

Designed, developed, and implemented a data quality/data profiling job in PL/SQL for all the extracts.

Designed data cleansing and data mining graphs for application data across the project.

Worked on the security control process, and provided database support to the app team for security issues by creating procedures and functions to generate report analysis for hacker tracking.

Environment: Oracle 11g, Teradata, MS Office 2007, Windows 7, SQL Developer, Erwin, CloverETL 3.2/4.1, Unix, Netezza.

DHHS, STATE OF MAINE, Augusta, ME (Feb 2012 – Oct 2012)

Data Analyst / ETL Developer

Data integration for the Food Nutrition Program, run separately by two departments: the Dept. of Education (DOE) and the Office of Child and Family Services (OCFS).

Responsibilities –

Worked with the OCFS Business Analyst in requirements gathering and prepared the functional requirements.

Worked with the system analysts of OCFS and DOE to gain a better understanding of the business processes.

Analyzed the pre-project transactional database of the OCFS department and presented reports on it.

Reverse engineered the transactional database and generated the data model.

Analyzed the business requirement documentation prepared by the business analyst and incorporated it into the current data model.

Worked closely with Data Modelling team to create the logical model for the EDW with approximately 75 entities and 1000 attributes using Erwin 7x.

The logical model was fully attributed to third normal form and contains both current and history tables. The data model is divided into a number of sub-models for ease of understanding and comprehension.

Analyzed the current DOE transactional database and generated the report for the DOE data model.

Prepared the feasibility analysis report for the two models and presented to the Team.

Reviewed the logical model with application developers, DBAs and testing team to provide information about the data model and business requirements.

Worked in Informatica Developer to create source-to-target (S2T) mappings.

Created Informatica workflows as per the data migration plan.

Worked closely with the data modeling and DBA teams to provide the correct staging areas, graphs, and processes as per requirements.

Environment: Erwin 7x, Toad, Oracle 11g, Informatica Developer, MS Visio, MS Office.

VERIZON, VA (Aug 2010 – Feb 2012)

Ab Initio /ETL Developer

Order provisioning System

Responsibilities –

Worked with Business Analysts team in requirements gathering and in preparing functional specifications.

Conducted and participated in JAD sessions with the project managers, business analysis team, finance, and development teams to gather, analyze, and document the business and reporting requirements.

Analyzed the business information requirements and reverse engineered OLTP source systems to identify the measures, dimensions, and facts required for the reports.

Coordinated with the Business Analyst and designed logical and physical data models as per the requirements.

Enhanced and implemented the ODS as a solution to emerging reporting-requirement challenges.

Worked with Model Manager and multiple data marts simultaneously, involving multiple subject areas, domains, and naming standards.

Coordinated with the DBA on database builds and table normalization and de-normalization.

Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary as they evolved and changed during the project.

Designed and developed the metadata repository plan and also designed Metadata mapping from source to target.

Improved query performance by using TKPROF utilities; identified and streamlined complex queries that were causing iterations and affecting database and system performance.

Identified database inconsistencies such as orphan objects and duplicate objects, and coordinated with various teams to decide on the appropriate action to be taken.

Explained and guided the front-end development team on the database architecture, and built objects for effectively accessing the database.

Experience implementing and managing the overall master data management strategy toolset.

Performed Data Validation and Unit Testing before handing over the data for production and Quality Testing.

Environment: Erwin 7x, Oracle 11g, Teradata, Ab Initio GDE 1.5, MS Visio 2010, MS Office 2010, Netezza.

COURT OF INDIANA, IN (Aug 2008 – May 2010)

Data Developer / Analyst

Migration of data from legacy data sources to Oracle, involving a large volume of data associated with victim and defendant case information. Kept track of records and old cases (1930 to present), and devised a strategy to maintain data efficiently for new cases coming in from different crime departments.


Interacted with users, application architects, & enterprise architects for gathering the requirements.

Designed/captured the business processes & then mapped them to the conceptual data model.

Involved in making screen designs, Use Cases and ER diagrams for the project using ERWIN and Visio.

Analyzed and worked on user and technical documentation for individual processes.

Prepared Logical and Physical Data-models as per the requirements involving multiple subject areas, domains and naming standards.

Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary as they evolved and changed during the project.

Developed the data model using Erwin 4 and printed reports to support the data dictionary.

Developed triggers, stored procedures, functions, and packages in PL/SQL, using the cursor and ref cursor concepts associated with the project.

Performed database performance tuning using the EXPLAIN PLAN and TKPROF utilities, and debugged the SQL code.

Coordinated database changes needed by the front-end development team, and explained and guided them on the database architecture and objects for effectively accessing the database.

Designed the mapping document for the ETL team and guided them through the transformations.

Deployed and scheduled jobs using Autosys after discussion with the team lead.

Provided production support for tuning queries and for any issues, defects, or tickets raised against the DB team.
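For illustration only, an Autosys JIL definition of the kind used above to schedule a batch job looks like the following sketch; the job name, machine, owner, and paths are hypothetical, not taken from the project:

```
/* Hypothetical Autosys JIL sketch: a nightly command job.
   All names, machines, and paths are illustrative only. */
insert_job: court_nightly_load   job_type: c
command: /opt/app/bin/run_load.sh
machine: dbbatch01
owner: batch@dbbatch01
start_times: "02:00"
days_of_week: mo,tu,we,th,fr
std_out_file: /var/log/autosys/court_nightly_load.out
std_err_file: /var/log/autosys/court_nightly_load.err
alarm_if_fail: 1
```

Loading such a definition with the `jil` command registers the job with the scheduler; `alarm_if_fail: 1` raises an alarm on non-zero exit, which is how failed report runs surface to the support team.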

Environment: Erwin 4, Oracle 10g, Lotus Approach, Autosys, Informatica, MS Visio 2007, MS Office 2007


PL/SQL Developer


Performed physical database design, including join indexes, primary indexes, secondary indexes, collect stats, and data skew handling

Handled the tasks of monitoring system performance through real-time alerts

Established system performance metrics and defined mechanism to track performance on an ongoing basis

Responsible for communicating to users and management on developed performance metrics

Determined environment setup for applications from development through production implementation

Assigned the tasks of generating capacity plans for upcoming projects

Prepared, oversaw and coordinated database and application designs

Worked closely with system users to ensure that information system strategies met the program requirements

Environment: Oracle 10g, Toad, MS Visio 2007, MS Office 2007
