Resume

Data Developer

Location:
Rocklin, California, United States
Posted:
May 16, 2018

Vivek Dandugula

ac5hid@r.postjobfree.com

M: (513) ***-****

Professional Summary:

Over 8 years of experience developing, testing, and maintaining data warehouse and client-server applications using ETL tools such as Informatica 10.1.0/9.x/8.x, IBM WebSphere DataStage 11.5/9.1/8.5/8.0.1, and Ascential DataStage 7.5/7.0/6.0.

Experience in all phases of the data warehouse life cycle: Requirement Analysis, Design, Coding, Testing, and Deployment.

Excellent knowledge and experience in the data warehouse development life cycle, dimensional modeling, repository management and administration, and implementation of Star and Snowflake schemas and slowly changing dimensions.

Expertise in designing and developing DataStage jobs in the Parallel Extender, Server, and Mainframe editions of DataStage.

Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.

Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).

Tuned mappings using PowerCenter Designer, applying alternative logic to achieve maximum efficiency and performance.

Experienced in UNIX work environment, file transfers, job scheduling and error handling.

Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica.
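For illustration, a minimal FastLoad wrapper of the kind this work typically involves is sketched below; the server, credentials, table, and file names are placeholders rather than details of any actual load, and FastLoad requires the target staging table to be empty.

```sh
#!/bin/ksh
# Hypothetical sketch: bulk-load a pipe-delimited flat file into an empty
# Teradata staging table with FastLoad. All names below are placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;

/* Drop leftover error tables from a previous run, if any. */
DROP TABLE stg_db.customer_err1;
DROP TABLE stg_db.customer_err2;

SET RECORD VARTEXT "|";

DEFINE
    cust_id   (VARCHAR(10)),
    cust_name (VARCHAR(50)),
    cust_city (VARCHAR(30))
FILE = /data/inbound/customer_daily.txt;

BEGIN LOADING stg_db.customer_stg
    ERRORFILES stg_db.customer_err1, stg_db.customer_err2;

INSERT INTO stg_db.customer_stg
VALUES (:cust_id, :cust_name, :cust_city);

END LOADING;
LOGOFF;
EOF
```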

Involved in all stages of data warehousing projects: setting up the schema and architecture, developing reports and documents, implementing security, and deploying to end users.

Designed an E3 framework in a UNIX environment to enable reuse of DataStage jobs (sequence/parallel) and integrated the Control-M scheduling setup with the framework.

Applied client-server database skills (Oracle, DB2 UDB, Teradata, Netezza, and Informix) in mixed OS environments (including UNIX).

Experience working with reporting tools including Business Objects, Tableau, Cognos, and Salesforce.com.

Expert in working with the Autosys, Tivoli, and Control-M scheduling applications.

Expertise in performance tuning using best practices such as minimizing network connections, distributing session load across MS SQL servers, and optimizing overall session performance.

Expertise in handling high volumes of data on a daily basis, loading millions of records into the data warehouse in each load window.

Expertise in writing complex SQL queries and PL/SQL, including stored procedures, functions, and triggers, to implement business rules and validations in Oracle 11g/10g/9i/8i (see the trigger sketch below).

Delivered project needs on time and within the agreed acceptance criteria in a hybrid-methodology environment as the team transitioned to an Agile methodology.
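As a minimal illustration of the validation triggers referenced above, the sketch below deploys a row-level business rule through SQL*Plus; the table, column, and connection variables are hypothetical and not taken from any of the engagements listed here.

```sh
#!/bin/ksh
# Hypothetical sketch: deploy a PL/SQL validation trigger via SQL*Plus.
# Table, column, and connection names are illustrative placeholders.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
CREATE OR REPLACE TRIGGER trg_claims_validate
BEFORE INSERT OR UPDATE ON claims
FOR EACH ROW
BEGIN
    -- Business rule: claim amounts must be positive.
    IF :NEW.claim_amt <= 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'claim_amt must be positive');
    END IF;
END;
/
EXIT;
EOF
```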

Organized and facilitated Agile and Scrum meetings, which included Sprint Planning, Daily Scrums or Standups, Sprint Check-In, Sprint Review & Retrospective.

Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.

Education Qualifications:

Bachelor's in Computer Science and Engineering,

JNTU, India.

Technical Skills:

ETL Tools : DataStage 11.5/9.1/8.5/8.0.1, Ascential DataStage 7.5/7.0/6.0, Talend, Informatica 10.1.0/9.x/8.x

Reporting Tools : Business Objects, Tableau, Cognos and Salesforce.com

Databases : Oracle 11g/10g/9i/8i, DB2 UDB v9/v8, SQL Server 2005/2000, Netezza, Teradata R12

Operating Systems : UNIX, IBM AIX 5.2, Linux, Windows 8/7/XP

Programming Languages : SQL, PL/SQL, T-SQL, C, C++, C#, Perl, XML, .NET, Java, COBOL, Apex Code, Shell Scripting

Scheduling Tools : Tivoli, Control-M and Autosys

Kaiser Permanente, Pleasanton, CA 06/2017-Present

Sr. ETL DataStage Developer

Kaiser Permanente, an integrated managed care organization, is the largest health care organization in the United States. The Health Plan and Hospitals operate under state and federal non-profit tax status, while the Medical Groups operate as for-profit partnerships or professional corporations in their respective regions.

Responsibilities:

Involved in understanding business processes and coordinated with business analysts to gather specific user requirements.

Studied the existing data sources to determine whether they supported the required reporting and generated change data capture requests.

Used QualityStage to check the data quality of the source system prior to the ETL process.

Worked closely with DBAs to develop the dimensional model using Erwin and created the physical model using Forward Engineering.

Worked with DataStage Administrator for creating projects, defining the hierarchy of users and their access.

Defined granularity, aggregation and partition required at target database.

Involved in creating specifications for ETL processes, finalized requirements and prepared specification document.

Used DataStage as the ETL tool to extract data from source systems and load it into the SQL Server database.

Imported table/file definitions into the DataStage repository.

Performed ETL coding using Hashed File, Sequential File, Transformer, Sort, Merge, and Aggregator stages; compiled, debugged, and tested the jobs. Extensively used the available stages to redesign DataStage jobs for the required integration.

Extensively used DataStage Tools like InfoSphere DataStage Designer, InfoSphere DataStage Director for developing jobs and to view log files for execution errors.

Controlled job execution using sequencers and used the Notification activity to send email alerts.

Ensured that the data integration design aligns with the established information standards.

Used Aggregator stages to sum the key performance indicators used in decision support systems.

Scheduled job runs using DataStage Director and used it for debugging and testing.

Created shared containers to simplify job design.

Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, regression testing, prepared test data for testing, error handling and analysis.

Environment: DataStage 11.5/8.5, QualityStage, Flat files, Oracle, Tivoli, UNIX.

Allergan, Inc., Irvine, CA 2010-08/2014 and 08/2016-06/2017

Sr. ETL DataStage Developer

Allergan, Inc. operates as a multi-specialty healthcare company primarily in the United States, Europe, Latin America, and the Asia Pacific. The company discovers, develops, and commercializes pharmaceutical, biological, medical device, and over-the-counter products for the ophthalmic, neurological, medical aesthetics, medical dermatological, breast aesthetics, urological, and other specialty markets.

Responsibilities:

Implemented various strategies for Slowly Changing Dimensions in Server/PX jobs using the framework approach.

Developed Parallel Extender jobs using stages such as Transformer, Aggregator, Lookup, Dataset, External Filter, Row Generator, Column Generator, and Peek to integrate Salesforce objects.

Used Salesforce.com to validate the objects loaded from SAP BW and Teradata tables.

Used DataStage Administrator to assign privileges to user groups.

Used the DataStage Parallel Extender (PX) to split bulk data into subsets and distribute it across all available nodes for best job performance.

Worked extensively on all the stages such as OCI, Sequential, Aggregator, Hashed Files, Sort, Link Partitioner, Link Collector and ODBC (Server jobs).

As part of the support team, resolved user-raised tickets through the Remedy application with quick response times.

Worked extensively with the web services team on applications accessed from iPad apps that consume SOAP web services.

Extracted data from various sources such as SAP BW, XML, Oracle, Teradata, and flat files, and loaded it into Salesforce.com.

Created, modified, deployed, optimized, and maintained Business Objects universes using Designer and exported them to the Repository to make resources available to the users.

After running the daily batch loads and database sync, worked on Tableau reports for business users.

Developed shell scripts to validate the data in source files, FTP the target files to their destination, and schedule jobs based on file arrival (a sketch of the pattern follows).
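A minimal sketch of this file-arrival pattern is shown below; the paths, host name, and credentials are placeholders rather than details of the actual environment.

```sh
#!/bin/ksh
# Hypothetical sketch: wait for a source file, run a basic validation,
# then FTP the target file onward. All names are placeholders.

SRC=/data/inbound/sales_extract.dat
TRG=/data/outbound/sales_load.dat

# Wait up to 60 minutes for the source file to arrive.
i=0
while [ ! -f "$SRC" ] && [ "$i" -lt 60 ]; do
    sleep 60
    i=$((i + 1))
done
[ -f "$SRC" ] || { echo "ERROR: $SRC did not arrive" >&2; exit 1; }

# Basic validation: the source file must be non-empty.
[ -s "$SRC" ] || { echo "ERROR: $SRC is empty" >&2; exit 2; }

# Push the target file to its destination host.
ftp -n target.host.example <<EOF
user ftpuser ftppass
binary
put $TRG
bye
EOF
```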

Environment:

IBM InfoSphere DataStage 11.5/9.1/8.5/8.1.1, Salesforce.com, Remedy, XML, Autosys, ServiceNow, UNIX, Korn Shell, Business Objects, Tableau, Toad, APEX, Teradata, Oracle 11g/10g, SQL Server.

USAA, San Antonio, TX 08/2014-08/2016

Sr. ETL DataStage Developer

USAA was founded in 1922 by a group of U.S. Army officers to self-insure one another when they were unable to secure auto insurance due to the perception that they were a high-risk group. USAA has since expanded to offer banking and insurance services to past and present members of the Armed Forces and their immediate families.

Responsibilities:

Worked with business users and ETL leads from different teams to implement an ETL framework using a combination of Server and PX jobs.

Extracted data from various sources such as SAP BW, XML, Oracle, and flat files, and loaded it into Salesforce.com.

Involved in developing and supporting applications using a BO universe for daily high-volume MicroStrategy reports on sales and products.

Developed UNIX scripts to automate the Data Load processes to the target Data warehouse using Autosys Scheduler.

Used DataStage Designer for developing various Server jobs for Extracting, Cleansing, Transforming, Integrating and Loading data into Data Warehouse.

Organized and facilitated Agile and Scrum meetings, which included Sprint Planning, Daily Scrums or Standups, Sprint Check-In, Sprint Review & Retrospective.

Coordinated with systems partners to finalize designs and formalize requirements. Utilized story sizing and Planning Poker techniques as needed based on the length of the backlog and priorities.

Designed Parallel jobs using various stages like Join, Remove Duplicates, Filter, Dataset, Lookup file set, Modify, Transformer and Funnel stages.

Used QualityStage stages such as Investigate, Standardize, Match and Survive for data quality and data profiling issues.

Designed an E3 framework in a UNIX environment to enable reuse of DataStage jobs (sequence/parallel) and integrated the Control-M scheduling setup with the framework.

Migrated from WebSphere DataStage 8.1 to Informatica 10.1 as part of a platform version upgrade.

Worked extensively on COBOL platforms as part of the client requirements.

Used DataStage Director to validate, run, schedule, and monitor DataStage jobs.

Extensively involved in the Tableau batch load support team, monitoring jobs and fixing issues on a priority basis.

Wrote SQL and PL/SQL queries using aggregation and outer joins for better performance (a sketch follows).
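As a minimal illustration of such a query, the sketch below runs an aggregate outer join through SQL*Plus; the member/claims schema and the connection variables are hypothetical, not the actual data model.

```sh
#!/bin/ksh
# Hypothetical sketch: run an aggregate outer-join query through SQL*Plus.
# The schema, tables, and connection variables are illustrative only.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
-- Total claim amount per member, keeping members with no claims.
SELECT m.member_id,
       NVL(SUM(c.claim_amt), 0) AS total_claims
FROM members m
LEFT OUTER JOIN claims c
  ON c.member_id = m.member_id
GROUP BY m.member_id
ORDER BY total_claims DESC;
EXIT;
EOF
```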

Wrote shell scripts for job scheduling.

Environment:

IBM InfoSphere DataStage 11.3/9.1/8.5, Informatica 10.1.0, XML, SAP Business Objects, Squirrel, APEX, Control-M, Oracle 11g/10g, UDB DB2, Agile, Netezza.

Blue Cross and Blue Shield, Richardson, Texas 1/2009-10/2010

IBM DataStage Developer

The purpose of this project at Blue Cross and Blue Shield in Richardson was to design and develop the Casualty Data Warehouse (CDW), a centralized data warehouse created by integrating casualty policy data to provide better support to the organization's decision support systems.

Responsibilities:

Performed requirements gathering and source data analysis and identified business rules for data migration and for developing data warehouse data marts.

Created High-Level Design documents for the extract, transform, validate, and load (ETL) processes and flow diagrams.

Created Low Level Design document for mapping the files from source to target and implementing business logic.

Received the master data and populated the dimension tables, including time dimension generation and surrogate key generation.

Extracted data from text files using the FTP stage and loaded it into different databases.

Used DataStage Designer for developing various Server jobs for Extracting, Cleansing, Transforming, Integrating and Loading data into Data Warehouse.

Designed Parallel jobs using various stages like Join, Remove Duplicates, Filter, Dataset, Lookup file set, Modify, Transformer and Funnel stages.

Used QualityStage stages such as Investigate, Standardize, Match and Survive for data quality and data profiling issues.

Created shell scripts that invoke the DataStage Server jobs, passing all variables so the jobs execute with parameterized database connection information (a sketch follows).
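A minimal sketch of such a wrapper is shown below; the install path, project, job, and parameter names are placeholders, while dsjob itself and its -run/-jobstatus/-param options are the standard DataStage command-line interface.

```sh
#!/bin/ksh
# Hypothetical sketch: invoke a DataStage Server job via dsjob, passing the
# database connection details as job parameters. Project, job, and parameter
# names are placeholders; the dsenv path reflects a typical install.
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

PROJECT=DW_PROJ
JOB=srvLoadPolicy

"$DSHOME/bin/dsjob" -run -jobstatus \
    -param DBUser="$DB_USER" \
    -param DBPassword="$DB_PASS" \
    -param DBName="$DB_NAME" \
    "$PROJECT" "$JOB"
rc=$?

# With -jobstatus, dsjob exits with the job status:
# 1 = finished OK, 2 = finished with warnings.
if [ "$rc" -ne 1 ] && [ "$rc" -ne 2 ]; then
    echo "ERROR: $JOB failed with exit status $rc" >&2
    exit 1
fi
```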

Used Autosys to schedule the DataStage ETL batch jobs on a daily, weekly and monthly basis.

Worked extensively on UNIX/Linux scripting for ETL job scheduling, FTP, and audit purposes.

Worked in DataStage Administrator to unlock jobs when they became locked and to re-create the project's indexes.

Used DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions.

Used conditional formatting and conditional blocks to display different sets of data based on different conditions in Report Studio.

Worked on Performance Tuning and Troubleshooting of ETL programs.

Actively participated in weekly status meetings.

Environment:

IBM DataStage 8.1.1/8.0.1, QualityStage (7.0), ERwin, Autosys (3.0), Oracle 11g/10g, Flat Files, Mainframe DB2, Sybase, SQL Server 2008, SQL*Loader, UNIX Shell Scripting, AIX UNIX, Windows 2003, Business Objects XI 3.1, TOAD.


