Data Developer

Location:
Germantown, MD
Posted:
April 09, 2021

Babu Pusulury

adlj92@r.postjobfree.com

703-***-****

Education and Training

•IBM Information Server Solution Developer

•MicroStrategy Certified Report Developer

•MicroStrategy Certified Report Analyst

•TDWI Business Intelligence Certification

•MapR Certified Hadoop Administrator

•Tableau Administrator

•MicroStrategy Certified Administrator

•MS Information Systems, George Mason University, VA

Synopsis of Experience

•Performed multiple roles as Senior ETL Developer, Information Server Administrator, and MicroStrategy Administrator, with tasks spanning end-user communication, development of specifications, ETL design, development, testing, installation, and administration.

•Knowledge of Informatica, IBM Infosphere Information Server, Autosys, MicroStrategy and MongoDB.

•Experience working with multi-terabyte data marts in the Healthcare, Insurance, Government, and Financial Services sectors.

•Expertise across multiple BI and ETL tools; delivered large-scale data warehousing projects.

•Expertise in providing solutions for enterprise data integration projects.

•Provided on-call support for incidents. Coordinated production incidents with multiple teams for quick and effective resolution of issues.

•Multi-year experience in the ITIL service delivery model.

Work Experience

PPS Infotech Mar 2016 – Mar 2021

Sr. ETL Developer and Operations Lead

•Designed solutions, administered environments, and coordinated development with the offshore team.

•Wholly responsible for fixing Mantis tickets and deploying fixes to production.

•Handled multiple projects including customer analytics, data fixes, and PDI.

•Responsible for delivering on time and on budget solutions to the client.

•Coordinated testing efforts with the offshore team.

•Utilized pushdown optimization for efficient loading of data into Teradata databases.

•Implemented Type 2 logic using MLoad for inserts and temporary tables for updates on the target (a BTEQ sketch follows this job entry).

•Created BTEQ scripts for creating tables and for inserting and updating data.

•Familiar with TPump for low-volume data loads.

•Worked extensively with Teradata macros, analytical functions and SQL performance tuning.

•Implemented Table cloning to replicate data and add new columns to existing tables.

•Implemented indexes and collected statistics on tables for efficient query performance.

•Acted as the sole point of contact for the client business team, fixing issues and communicating completion of project tasks.

•Supported the developed application, coordinating and fixing production issues.

•Acted as the subject matter expert on ETL solutions for other development teams.

•Conducted training sessions to educate people on architecture and deployment planning.

•Created a data provisioning solution for Producer and Customer data integration projects and developed MDM solutions using Siperian.

•Delivered complex ETL and Teradata solutions to meet the client's business needs.

•Performed performance tuning on the code for efficiency between the Informatica and Teradata components.

Environment: Informatica 9, Teradata V2R5, Oracle 11g and Cron, CVS
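
The Type 2 pattern above pairs MLoad inserts with temp-table updates. Below is a minimal BTEQ sketch of the expire-and-insert steps only; the table, column, and logon names (customer_dim, customer_stg, a row_hash change-detection column, tdhost) are assumptions for illustration, the MLoad control file is omitted, and this is not the actual project code.

#!/bin/ksh
# Hedged sketch: expire changed current rows, insert new versions,
# refresh optimizer statistics. TD_PWD is assumed to hold the password.
bteq <<EOF
.LOGON tdhost/etl_user,${TD_PWD};

/* Expire current rows whose attributes changed in this load */
UPDATE d
FROM customer_dim AS d, customer_stg AS s
SET end_dt = CURRENT_DATE, current_flag = 'N'
WHERE d.cust_id = s.cust_id
  AND d.current_flag = 'Y'
  AND d.row_hash <> s.row_hash;

/* Insert the incoming versions as the new current rows */
INSERT INTO customer_dim
SELECT s.cust_id, s.cust_name, s.row_hash,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM customer_stg AS s;

/* Collect stats so the optimizer sees the new demographics */
COLLECT STATISTICS ON customer_dim COLUMN cust_id;

.LOGOFF;
.QUIT;
EOF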

Informatica Developer

Hospital Corporation of America

•Role description: Responsible for designing the architecture for the client. Coordinated with the business to understand requirements and deliver an optimized environment for development, testing, and production.

•Performed a thorough analysis of the existing code to understand the process, identified issues, and resolved them. Redesigned existing code using shared folders, mapplets, and reusable sessions.

•Developed scripts to automate administration tasks and scheduled existing workflow runs.

•Developed ETL Architecture specification documents.

•Worked extensively with pushdown optimization and BTEQ scripts to load data into the target tables.

•Utilized UNIX shell scripts to run workflows and email users on errors (a wrapper sketch follows this job entry). Maintained a log of the issues and troubleshot them.

•Performed repository backups and recovery at regular intervals. Tuned the repository for optimum performance. Performed cache tuning and utilized persistent cache for enhanced performance.

•Implemented Informatica Velocity guidelines for performance tuning.

•Set up Informatica connection objects for various sources.

•Performed code migration from DEV to QA and QA to Prod using deployment groups and XML files. Troubleshot object-conflict issues during migration.

•Utilized version control for change management. Standardized code maintenance and check-out procedures.

•Peer reviewed job design for efficient development and data load.

•Utilized Teradata analytical functions in extracting data.

Environment: Informatica 8.6, Teradata V2R5, Oracle 10g, Tidal
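
A minimal sketch of the kind of run-and-notify wrapper described above, assuming hypothetical service, domain, folder, and workflow names (Int_Svc_Dev, Domain_Dev, HCA_Folder, wf_daily_load) and an exported INFA_PWD variable; it is not the actual client script.

#!/bin/ksh
# Start a workflow through pmcmd, wait for it, and mail on failure.
# All names below are placeholders.
pmcmd startworkflow -sv Int_Svc_Dev -d Domain_Dev \
      -u etl_user -p "${INFA_PWD}" \
      -f HCA_Folder -wait wf_daily_load
RC=$?

if [ ${RC} -ne 0 ]; then
    echo "wf_daily_load failed with return code ${RC}" |
        mailx -s "ETL FAILURE: wf_daily_load" oncall@example.com
fi
exit ${RC}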

Informatica Developer

American Healthways

•Responsible for designing architectures for business-critical applications. Experience in full life cycle implementations.

•Conducted business requirements meetings with subject matter experts, stakeholders, and project managers to design solutions. Conducted JAD sessions with the development team and system architects to develop solutions.

•Performed data profiling to capture data variances and exceptions. Developed mitigation strategies to handle data anomalies. Created an ETL effort estimation document to establish timelines and budgeting requirements.

•Developed enterprise ETL development standards. Developed generic mappings using SCD Type 2 logic to track customer history. Coordinated the ETL team effort by training members, developing code templates, and reviewing team members' code. Implemented enhancements to ETL code by suggesting best practices. Developed ETL technical specification document templates.

•Developed UNIX shell scripts to run the mappings, report errors, and email users. Designed the audits-and-controls and business-validations framework to track data loads and catch data errors (an audit-logging sketch follows this job entry). Brought in Oracle Streams technology to ease development efforts and maintain efficiency. Developed a framework to report on data quality for daily loads.

•Developed Code migration strategy for Development, Testing and Production environments.

•Coordinated Unit, System Integration and UAT testing efforts. Kept track of bug fixes for timely delivery of solutions.

•Implemented change management by utilizing available tools and recommending suitable practices.

•Automated data load, extract and file delivery procedures to prevent manual intervention. Provided first level operational support for development and production systems.

•Fine-tuned the mappings for optimum performance.

•Identified bottlenecks in data processing and resolved them.

•Coordinated troubleshooting efforts with Informatica support by raising tickets and tracking them.

•Utilized the Admin Console to track user and client connections, check logs, and manage alerts.

Environment: Informatica 8.6, SQL Server 2005, Oracle 11g, Tidal, Trillium, IDQ
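
One way the audits-and-controls framework above might record a completed load, sketched with assumed object names (etl_audit_log, etl_batch_seq, the DWPROD connect string); the framework's real schema is not part of this resume.

#!/bin/ksh
# Record one load run in an assumed audit table. The calling load
# script passes source count, target count, and final status.
SRC_CNT=$1
TGT_CNT=$2
STATUS=$3    # e.g. SUCCESS or FAILED

sqlplus -s etl_user/"${ORA_PWD}"@DWPROD <<EOF
INSERT INTO etl_audit_log
  (batch_id, mapping_name, src_row_cnt, tgt_row_cnt, load_status, load_ts)
VALUES
  (etl_batch_seq.NEXTVAL, 'm_load_member', ${SRC_CNT}, ${TGT_CNT},
   '${STATUS}', SYSDATE);
COMMIT;
EXIT;
EOF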

Claraview (Teradata) Mar 2011 – Mar 2016

Integration Developer

Project Risk – PCAOB

•Performed data analysis and captured data quality issues.

•Designed and developed a two-stage load process for error cleansing and data loads.

•Designed and developed a batch status table to track job runs (a possible layout is sketched after this project entry).

•Standardized jobs for data feeds from Audit Analytics, GMI, and Inspection.

•Modified Informatica mappings to implement changed logic.

•Impact: The enterprise metadata error tracking module helped the client identify metadata errors in the data feeds and report them to the originating organizations. Cleansing jobs helped maintain data integrity and prevented bad records from loading into the data mart.

Environment: Informatica 8.6, Dataflux 7, SQL Server 2005, Oracle 10g
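
A plausible layout for the batch status table mentioned above; every name and column here is an assumption for illustration, not the actual PCAOB schema.

#!/bin/ksh
# Create an assumed batch-status table for tracking job runs.
sqlplus -s etl_user/"${ORA_PWD}"@RISKDB <<EOF
CREATE TABLE batch_status (
  batch_id      NUMBER        PRIMARY KEY,
  feed_name     VARCHAR2(50)  NOT NULL,  -- Audit Analytics, GMI, Inspection
  start_ts      DATE          NOT NULL,
  end_ts        DATE,
  status        VARCHAR2(10)  DEFAULT 'RUNNING',  -- RUNNING/SUCCESS/FAILED
  rows_loaded   NUMBER,
  rows_rejected NUMBER
);
EXIT;
EOF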

Integration Architect

JP Morgan Chase

•Developed an audit strategy for mainframe feeds to check data consistency of the source files.

•Developed a framework to process data into the data warehouse.

•Built an archival and maintenance strategy for the source files used for reprocessing and error correction.

•Designed and developed a file migration process utilizing UNIX shell scripts (Cygwin) on a Windows server. Integrated an error-checking module into the mappings to check the accuracy of the source data.

•Automated delivery of files to HQ for timely availability of data.

•Impact: The audit process helped the client see the errors in their mainframe file extraction process. In addition, it showed whether record counts and aggregated totals were consistent across feeds. Archival and maintenance of the files was automated through shell scripts that delete files over two months old (a sketch follows this project entry). The FTP process to move the files enabled the client to automate the entire data load process with minimal intervention across different locations and platforms. The error-checking module enabled the client to catch errors in the source data and take appropriate action to correct and reprocess them.

Environment: Informatica 8.6, SQL Server 2005, Oracle 10g
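
A small sketch of the archival scripts described in the impact note, assuming a placeholder Cygwin landing directory and a *.dat feed naming convention; retention matches the two-month window mentioned above.

#!/bin/sh
# Archive and purge source feed files (Cygwin on Windows).
# FEED_DIR is a placeholder for the actual landing directory.
FEED_DIR=/cygdrive/d/feeds/archive

# Compress feeds older than a week to save space.
find "${FEED_DIR}" -type f -name '*.dat' -mtime +7 -exec gzip {} \;

# Delete archives past the two-month (roughly 60-day) retention window.
find "${FEED_DIR}" -type f -name '*.gz' -mtime +60 -exec rm -f {} \;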

ETL Developer

No Child Left Behind Initiative – Dept of Education

•Implemented the data warehouse solution for the largest federal data provider in the country.

•Performed installation and configuration of DataStage Server on a UNIX AIX server. Set up connections for various sources and wrote shell scripts for file maintenance (an illustrative dsjob run wrapper follows this project entry).

•Coordinated the efforts with Data Architect to model the warehouse and implement database tuning.

•Developed ETL mappings to extract data from Oracle and SQL Server sources. Effectively coordinated the team effort to attain set targets.

•Created templates for ETL specification and unit testing documents. Performed unit testing and migration of code to the testing environment.

•Developed Report specification documents for report developers.

•Impact: The Dept of Education was able to perform analysis on data collected from across the nation. The enterprise ETL and reporting architecture was established and standardized for the client.

Environment: DataStage 8, SQL Server 2005, Oracle 10g, Erwin
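
An illustrative run-and-check wrapper using the DataStage dsjob command-line interface; the project and job names (DOE_Project, j_load_state_data) are placeholders, and the actual maintenance scripts are not reproduced here.

#!/bin/ksh
# Run a DataStage job and check its status. With -jobstatus, dsjob's
# exit code is the job status: 1 = OK, 2 = finished with warnings.
dsjob -run -wait -jobstatus DOE_Project j_load_state_data
RC=$?

if [ ${RC} -ne 1 ] && [ ${RC} -ne 2 ]; then
    echo "j_load_state_data failed (status ${RC})" |
        mailx -s "DataStage job failure" admin@example.com
fi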

ETL Developer

PacifiCare

•Independently undertook the design, development and implementation of the integration process.

•Created audit and error management modules per Sarbanes-Oxley requirements. Integrated the new code with existing modules of the parent company. Designed and developed the processes for extracting, cleansing, transforming, integrating, and loading data using DataStage.

•Developed UNIX and Autosys scripts for scheduling the jobs (an illustrative JIL definition follows this job entry). Implemented an interface run log to track how many times an interface ran and with what results.

•Tested the interface against error and audit requirements through the development, SIT, UAT, and STAGE phases of the project.

•Created conceptual design, technical design, interface migration execution procedure, unit testing and peer review checklist documents for the interface.

•Communicated the errors in data to the business group for correction and reprocessing. Worked with the business and technical staff to design the interface as per the requirements.

Environment: DataStage 8, Oracle 10, Autosys
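
An illustrative Autosys definition of the kind described above, piped to the jil command; the job, machine, and path names are placeholders rather than the actual PacifiCare setup.

#!/bin/ksh
# Register an assumed command job that runs the interface each weekday.
jil <<EOF
insert_job: pc_interface_daily
job_type: c
command: /apps/etl/scripts/run_interface.sh daily
machine: etlprod01
owner: etladm
start_times: "02:00"
days_of_week: mo,tu,we,th,fr
std_out_file: /apps/etl/logs/pc_interface_daily.out
std_err_file: /apps/etl/logs/pc_interface_daily.err
alarm_if_fail: 1
EOF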


