Data Developer

Location: Naperville, IL
Posted: September 19, 2020

BALA KOTALA

****.****@*****.*** 248-***-****

Summary:

●15+ years of experience designing and maintaining end-to-end data warehousing and ETL/integration/mapping solutions as a designer, developer, and support engineer, covering Talend and DataStage development and DataStage admin activities across all phases of the data warehousing SDLC.

●4 years of experience using the Talend Data Integration suite and Talend Big Data 7.3/7.1/6.4/6.2.

●12+ years of experience in end-to-end IT solution offerings across all phases of the development lifecycle, including prototyping, requirements gathering, analysis, data profiling, design, development, implementation, and acceptance testing, along with data cleansing, data conversion, and performance tuning. Strong experience with star and snowflake schemas, data conversions, dimensional data modeling, fact and dimension tables, Slowly Changing Dimensions, and Change Data Capture (CDC).

●12+ years with Oracle 10g/9i/8i/8.0/7.0, DB2, SQL, PL/SQL, T-SQL, and Microsoft SQL Server 2000/7. Experienced in integrating various data sources such as Teradata, DB2 UDB, SQL Server, Oracle 10g, and Netezza (Aginity Workbench). Also experienced with IBM InfoSphere Information Server DataStage 11.3/11.5 and DataStage QualityStage, implementing standardization and validation rules for addresses and names.

●Work in an Agile development process using the JIRA tracking system.

●Experience using Talend features such as context variables, triggers, AMC tables, and connectors for databases and flat files.

●Work on design and implementation, providing end-to-end ETL solutions to developers using Talend.

●Experience integrating Drools with Talend, Talend MDM, and IBM InfoSphere DataStage MDM.

●Experience creating AWS Lambda functions to trigger Talend jobs based on events (see the sketch after this list).

●Worked on Talend Data Catalog POCs using Talend Data Stewardship and Talend Data Catalog.

●Created a compound-type data dictionary using Talend Data Catalog.

●Maintained existing data dictionaries in Talend Data Catalog: adding new values, assigning and removing labels, managing comments, commenting on metadata objects, reviewing comments across a model, and changing the importance of a comment.

●Good knowledge of developing, validating, publishing, and maintaining logical and physical data models.

●Work closely with business and data modeling teams; involved in data model design and logical table creation.

●Experience working with AWS services: EC2 clusters, the Redshift data warehouse, and S3 storage.

●Experience integrating Kafka messaging with Talend: producing messages with the tKafkaOutput component and consuming them via ZooKeeper with tKafkaInput (see the sketch after this list).

●Experience importing and exporting data between RDBMS and HDFS using Sqoop (see the sketch after this list).

●Experience working in the Talend Integration Cloud (TIC) environment and deploying Talend jobs as Docker images on AWS.

●Build and deploy Talend jobs using Jenkins pipelines and the UrbanCode Deploy (UCD) process.

●Good understanding of Hadoop architecture, including YARN and components such as HDFS, ResourceManager, NodeManager, NameNode, and DataNode.

●Experience working with Talend Big Data, Hadoop, and Hive, using Talend Big Data components such as tHDFSOutput, tHDFSInput, and tHiveLoad.

●Experience creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateRow, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tWarn, tMysqlSCD, tFilterRow, etc.

●Experience with Talend reusable components such as routines, context groups, and context variables to run jobs in Dev, Test, and Production environments.

●Experience processing JSON files using ETL tools such as the Hierarchical Data stage in DataStage and Talend, sending POST/GET microservice requests and receiving JSON responses.

●Experience using REST API web services, reading and writing XML files with the Hierarchical Data stage in DataStage; experience building web applications using Java/JSP/J2EE.

●Experience with SQL/PL-SQL, Netezza, and Teradata database technologies, implementing ETL solutions.

●Experience working with GitHub and Bitbucket repositories and SVN version control.

●Experience implementing an S3 file-existence check on a bucket before storing XML files in the S3 storage location (see the sketch after this list).

●Experience scheduling jobs and scripts using scheduling tools such as Control-M, Chronicle, and Autosys.
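
The bullets above reference a few mechanisms worth illustrating. First, event-driven job triggering: a minimal AWS Lambda sketch, assuming the jobs run under a Talend Administration Center (TAC) whose MetaServlet accepts a base64-encoded JSON command. The URL, action name, task ID, and credentials below are placeholders, not values from this resume.

    import base64
    import json
    import urllib.request
    from urllib.parse import quote

    # Placeholder TAC MetaServlet endpoint and task ID -- not real values.
    TAC_URL = "http://tac.example.com:8080/org.talend.administrator/metaServlet?"
    TASK_ID = 42

    def lambda_handler(event, context):
        """Fires on an event (e.g., an S3 upload) and asks TAC to run a Talend job."""
        command = {
            "actionName": "runTask",        # assumed MetaServlet action name
            "taskId": TASK_ID,
            "mode": "synchronous",
            "authUser": "tac_user",         # placeholder credentials
            "authPass": "tac_password",
        }
        encoded = base64.b64encode(json.dumps(command).encode()).decode()
        with urllib.request.urlopen(TAC_URL + quote(encoded)) as resp:
            return {"statusCode": 200, "body": resp.read().decode()}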
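
Second, the Kafka produce/consume flow. tKafkaOutput and tKafkaInput are configured in Talend Studio rather than coded; the Python sketch below (kafka-python, with placeholder broker and topic names) shows the equivalent flow, not the Talend implementation itself.

    from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

    BROKER = "localhost:9092"    # placeholder broker address
    TOPIC = "customer-events"    # placeholder topic name

    # Produce one message (the role tKafkaOutput plays per incoming row).
    producer = KafkaProducer(bootstrap_servers=BROKER)
    producer.send(TOPIC, b'{"id": 1, "status": "NEW"}')
    producer.flush()

    # Consume messages (the role tKafkaInput plays at the start of a subjob).
    consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKER,
                             auto_offset_reset="earliest",
                             consumer_timeout_ms=5000)
    for msg in consumer:
        print(msg.value.decode())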
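
Third, RDBMS-to-HDFS transfer with Sqoop. Sqoop is driven from the command line; this sketch wraps the import and export commands in Python, with a placeholder JDBC URL, credentials, and paths.

    import subprocess

    JDBC_URL = "jdbc:db2://db.example.com:50000/SALES"  # placeholder connection

    def sqoop_import(table, target_dir):
        """RDBMS -> HDFS import."""
        subprocess.run([
            "sqoop", "import",
            "--connect", JDBC_URL,
            "--username", "etl_user", "--password-file", "/user/etl/.pw",
            "--table", table,
            "--target-dir", target_dir,
            "--num-mappers", "4",
        ], check=True)

    def sqoop_export(table, export_dir):
        """HDFS -> RDBMS export."""
        subprocess.run([
            "sqoop", "export",
            "--connect", JDBC_URL,
            "--username", "etl_user", "--password-file", "/user/etl/.pw",
            "--table", table,
            "--export-dir", export_dir,
        ], check=True)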
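
Finally, the S3 file-existence check: a minimal boto3 sketch with placeholder bucket and key names. HEAD the object, and upload only if it is not already there.

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def s3_key_exists(bucket, key):
        """Return True if the object already exists in the bucket."""
        try:
            s3.head_object(Bucket=bucket, Key=key)
            return True
        except ClientError as err:
            if err.response["Error"]["Code"] == "404":
                return False
            raise

    # Placeholder bucket/key names.
    if not s3_key_exists("etl-landing-bucket", "outbound/claims.xml"):
        s3.upload_file("/data/out/claims.xml", "etl-landing-bucket",
                       "outbound/claims.xml")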

Client: HCSC, Chicago, IL Feb 2019 – Present

Sr. Talend Tech Lead

Project: Value Based Reimbursement

●Generate and extract XML files using the tFileOutputXML, tFileInputXML, tExtractXMLField, and tXMLMap components, then send them to downstream systems.

●Check file existence on the FTP location, then download XML files from different source systems using the Talend FTP components and push files from the output directory to the downstream system (see the sketch after this list).

●Create a configuration table, then process multiple XML files and generate JSON files.

●End-to-end design, implementation, exception handling, and statistics capture for Talend jobs.

●Create Git branches, help the team push Talend jobs into them, and raise pull requests to merge into the release branch once code changes are approved.

●Created AMC/audit tables to log errors and monitor data from each component, then loaded statistics into the AMC tables.

●Developed reusable components using Joblets in Talend.

●Establish connections to data lake systems and extract data using Hive.

●Extract data from data lake/Hive tables, then load into DB2 tables using the tHDFS, tHive, and tDB2 components.

●Worked with the data modeling team and verified logical and physical data models using best practices to ensure high data quality and reduced redundancy.

●Involved in developing best practices for standard naming conventions and coding practices to ensure consistency of data models.

●Analyze data-related system integration challenges and propose appropriate solutions.

●Demonstrable experience in developing, publishing, and maintaining all documentation for data models.

●Automated processes by developing Sqoop queries in shell scripts to extract data from DB2, used in the Zena process to kick off Talend jobs.

●Used an automated build process and performed continuous integration and continuous delivery (CI/CD) using Jenkins and Maven plugins.

●Automated and scheduled Talend jobs with the Zena scheduling tool by writing wrapper scripts in Linux (see the sketches after this list).
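
Two sketches for the bullets above. First, the FTP existence check and download: a minimal Python version using ftplib, with placeholder host, credentials, and directories (the Talend job does this with its FTP components).

    import ftplib
    import os

    FTP_HOST = "ftp.example.com"     # placeholder host and credentials
    REMOTE_DIR = "/outbound/xml"
    LOCAL_DIR = "/data/inbound"

    ftp = ftplib.FTP(FTP_HOST, "etl_user", "etl_password")
    ftp.cwd(REMOTE_DIR)

    # Existence check: list the directory, then fetch only the XML files found.
    for name in ftp.nlst():
        if name.lower().endswith(".xml"):
            with open(os.path.join(LOCAL_DIR, name), "wb") as fh:
                ftp.retrbinary("RETR " + name, fh.write)
    ftp.quit()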
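
Second, the scheduler wrapper. Zena, like most schedulers, needs an exit code to decide success or failure, so the wrapper runs the shell launcher that Talend Studio exports with a build and propagates its status; the job path and context name below are placeholders.

    import subprocess
    import sys

    # Placeholder path to the launcher exported with the Talend build.
    JOB_LAUNCHER = "/apps/talend/jobs/load_claims/load_claims_run.sh"

    def main():
        result = subprocess.run([JOB_LAUNCHER, "--context=PROD"],
                                capture_output=True, text=True)
        sys.stdout.write(result.stdout)   # echo the job log for the scheduler
        sys.stderr.write(result.stderr)
        sys.exit(result.returncode)       # nonzero status marks the task failed

    if __name__ == "__main__":
        main()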

Environment: Oracle 11g/12c, Talend Studio 7.1/6.4, Toad 11.6, WinSCP, DB2, Zena scheduler, UNIX shell scripting, Windows Server, UNIX

Client: Grainger, Chicago, IL May 2016 – Jan 2019

Role: Sr. Talend Lead Developer

Project #1: DataStage Migration to Talend

●Prepared and designed technical specifications and mapping documents with transformation rules.

●Converted complex job designs into separate job segments executed through job and subjob components for better performance and easier maintenance.

●Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUniqRow, tXMLMap, tFlowToIterate, tSortRow, tFilterRow, etc.

●Extensively used Talend components such as tSAPBapi, tSalesforceConnection, tSalesforceInput, and tSalesforceOutput to integrate data between various sources and the CRM system.

●Developed Talend mappings using various transformations, sessions, and workflows. Teradata was the target database; the sources were a combination of flat files, Oracle tables, Excel files, Salesforce data, and a Teradata database.

●Worked on the Talend ETL flow and created the Talend jobs to load data into Oracle and Hive tables.

●Defined a suitable architecture for data integration from multiple data sources.

●Worked on a POC to configure Azure using the tAzure components in Talend.

●Worked with REST APIs using the Talend tRESTClient component, processing POST/GET methods to retrieve/update data in JSON format, then splitting attributes with tExtractJSONFields and loading into the target (see the sketch after this list).

●Stored binary files such as .PDF, .JPG, and .PNG from Azure and loaded them into the Talend container.

●Experience transferring data from relational databases to the cloud (Azure) using Talend Big Data Spark jobs.

●Worked on loading data into Hive tables and HDFS files, and developed Talend jobs to integrate Hive tables with the Teradata system.

●Designed the ETL process using Talend to load from... and supported the Talend jobs scheduled through the Talend Administration Center (TAC).
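
For the REST bullet above: a minimal Python sketch of the same GET/POST flow using requests, with a placeholder endpoint and assumed JSON field names (tRESTClient and tExtractJSONFields do this inside the Talend job).

    import requests  # pip install requests

    BASE_URL = "https://api.example.com/v1/customers"  # placeholder endpoint

    # GET: retrieve a record and split attributes out of the JSON response.
    resp = requests.get(BASE_URL + "/1001", timeout=30)
    resp.raise_for_status()
    record = resp.json()
    customer_id, status = record["id"], record["status"]  # assumed field names

    # POST: send an update back in JSON format.
    resp = requests.post(BASE_URL,
                         json={"id": customer_id, "status": "ACTIVE"},
                         timeout=30)
    resp.raise_for_status()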

Environment: Oracle 11g/12c, Talend Studio 6.4, Talend Data Catalog, Talend Data Stewardship, DataStage 11.5, WinSCP, Salesforce, Chronicle scheduler, UNIX shell scripting, Windows Server, UNIX

Project#2: WCCI

Role: Sr. DataStage Lead Developer

The WCCI project is a migration of customer master data from SAP to MDM, where data is cleaned and organized to give the business an understanding of its customers' organizations and business locations. Data is extracted from the SAP system using the DataStage ETL tool, various business rules are applied, and the data is loaded into the MDM tables. After loading into MDM, this data is used by the Salesforce system to place orders based on the customer information provided. Four different levels of data extraction are used: STGI, STGP, STGD, and STGO.

●Understanding of data integration, data quality, data architecture and data management, project life cycle phases, best practices, and processes.

●Created technical designs (translating business rules into technical implementation).

●Worked on DataStage migration from 9.1 to 11.5 and on DataStage jobs.

●Exported and imported DSX/ISX files using the DataStage istool utility and migrated jobs from DataStage 9.1 to 11.5 environments.

●Worked on DataStage administration activities.

●Worked with the Salesforce connector stage to extract data from Salesforce and load it into MDM tables.

●Created PMR requests with IBM, followed up, and installed the patches IBM provided.

●Loaded XML data attributes, binary objects/files, and Gryphon call-recording files into Salesforce using DataStage.

●Validated data using standardization rules for names, addresses, and phone numbers with QualityStage in DataStage.

●Coordinated with data model team to evaluate data models and physical databases for variances and discrepancies.

●Validate business data objects for accuracy and completeness.

●Analyze data-related system integration challenges and propose appropriate solutions.

●Created the matching process to validate Accounts and Contacts in QualityStage.

●Configured the BDFS stage to connect to the Hadoop system, extracting data from Hadoop files and loading it into the MDM tables.

●Configured the SQL Server database and Teradata connectors and extracted data from the SQL Server database.

●Created shell scripts in a UNIX/AIX environment to generate reports from audit tables.

●Used SVN version control to deploy DataStage packages from the development environment to the QA/Prod environments.

●Performed code reviews of other developers' DataStage jobs and Oracle SQL/PL-SQL scripts.

●Performance-tuned DataStage jobs before loading into the MDM tables.

Environment: DataStage 11.5, QualityStage, Oracle 11g/12c, InfoSphere MDM, Toad 11.6, WinSCP, Chronicle scheduler, UNIX shell scripting, Windows Server, UNIX.

Client: HCA, Olympia, WA / Rockville, MD May 2014 – Apr 2016

Senior DataStage Developer

The Washington Phase II conversion covers social service providers. Converting all of the legacy system's SSPS authorizations for insertion into the existing P1 Authorization Subsystem is a complex endeavor. CNSI makes a request prior to the initial authorization conversion cutover to receive final extracts and crosswalks from the legacy system. Once we receive these extracts, the goal is to cleanse the data, verify it, and report any exceptions to Washington State. All of this data is converted using the business rules provided by the State.

●Worked on health care projects.

●Prepared effort estimates for data conversion projects.

●Set up different environments to run the conversion process before releasing to production.

●Automated staging jobs using UNIX shell scripts and implemented staging-job statistics for the Providers and Authorizations modules using DataStage server routines.

●Extensive knowledge of the health information and health care services regulatory environment: HIPAA, Medicaid/Medicare, and EDI.

●Expert in generating XML files for 837P and 837I transactions and sending them to downstream systems.

●Generated exception reports based on the legacy data and conversion data.

●Automated tracking and target jobs using UNIX scripts.

●Automated exception reports and summary reports.

●Validated provider addresses using DataStage QualityStage.

●Involved in end-to-end cutover activities for all conversion runs in different environments.

●Migrated a few modules' DataStage jobs to Talend Open Studio.

Environment: DataStage 11.3/9.1, Oracle 11g, mainframe, Toad 11.6, WinSCP, Autosys, Talend Open Studio for Data Integration & Big Data 6.2, UNIX shell scripting, Windows Server, UNIX.

Emblem Health, New York City, NY Nov 2013 – May 2014

Senior DataStage Developer

Project Description: The EFT-ERA project's main objective is to process the EFT_IND field, along with all other fields, from GHI/QCare and various providers' data and load it into the existing and new medical and dental ODS tables/views. These data are used by the ESAS application, which helps the customer service center resolve customer service tickets.

●Gathered requirements and prepared specifications for ETL jobs.

●Coordinated with the data modeling and data analyst teams to design and develop new ETL jobs to process hospital and dental claim data.

●Extracted PPO, HMO, and dental claim data from different vendors and loaded it into staging and the corresponding ODS tables.

●Extracted mainframe source data and loaded it into staging tables.

●Used the CFF stage to extract data from mainframe systems and flat files and load it into staging tables.

●Updated the EFT_IND field across all applicable staging and ODS tables.

●Handled high-values data and loaded it into the ODS environment.

●Performance-tuned SQL queries in the existing and new ETL jobs.

●Handled low-values data from the mainframe source system into staging tables.

●Developed post-update jobs to update the EFT_IND field in existing ODS jobs.

●Updated the existing Autosys job schedule with newly developed jobs.

●Provided post-warranty support.

Environment: DataStage 9.1/8.1, Oracle 11g, mainframe, Toad 11.6, WinSCP, Autosys, UNIX shell scripting, Windows Server, UNIX.

Client: USAA, San Antonio, TX Apr 2013 – Oct 2013

Role: Senior DataStage Developer

EMPLOYER: TATA CONSULTANCY SERVICES LIMITED

Project Description: The main objective of VRM is to establish the foundation for VRM dashboards that provide a cross-domain view to operations management, improving the infrastructure planning process.

●This effort provides hardware aging analytics capabilities to technical operations and portfolio stakeholders, including average-aging dashboard reports. The phases are Maintain, Phase Out, and Denied.

Responsibilities:

●Gathered requirements and prepared specifications for ETL jobs.

●Designed and developed new ETL jobs to process asset and lifecycle data.

●Created dimension and fact tables for asset and lifecycle data and loaded them into Netezza target tables.

●Created extract ETL jobs from the Salesforce.com source system.

●Ran the data cleanup process from the Quantum database to the Service Manager database.

●Designed and developed a reusable component to load data from datasets into the target table.

●Involved in testing the various jobs developed and maintaining the test log.

●Involved in the ETL testing process and bug fixing.

●Updated the existing Control-M job schedule with newly developed jobs.

Client: GM, Detroit, MI Sep 2010 – Mar 2013

Project: ADP

Role: Sr. DataStage Developer / Module Lead

EMPLOYER: TATA CONSULTANCY SERVICES LIMITED

Responsibilities:

●Prepared BRDs and design/development documents based on the requirements.

●Designed and developed master controlling sequencer jobs using the DataStage Job Sequencer.

●Built code based on the requirements and developed a middleware layer to provide data to ADP GlobalView. Data from all source applications is routed to GV through GIF, and similarly all data from GV is routed to downstream applications through GIF. New interfaces were developed and deployed to facilitate this data interchange.

●Extracted, cleansed, transformed, integrated, and loaded data into the data warehouse using DataStage Designer.

●Prepared the Low-Level Design (LLD) document based on the mapping document and data model.

●Led the PeopleSoft module.

●Prepared and reviewed project documentation.

●Led the testing process for PeopleSoft and other ADP project modules.

●Reviewed code for different modules and provided design and development guidelines to team members.

●Performed a one-time migration of initial data to ADP.

●Created CTRL tables for framework development and STG tables for loading data before the target.

●Involved in the testing of the various jobs developed and maintaining the test log.

●Migrated jobs from one environment to another.

●Created unit and integration testing strategies according to the test plans.

●Managed versioning of documents and DataStage code using iCollab.

Environment: DataStage 8.5, Oracle 10g, WinSCP, UNIX shell scripting, Control-M

Worked on various DataStage and Java-related projects from June 2005 to Sep 2010:

Client: Qantas Airways, Sydney Feb 2010 – Sep 2010

Role: DataStage Developer

Client: GM, Detroit, MI Dec 2008 – Sep 2010

Role: DataStage Developer

Client: Thomson Reuters, UK July 2007 – Nov 2007

Role: DataStage Developer

Client: GM, Detroit, MI Jan 2007 – Jun 2007

Role: DataStage Developer

Client: MBIS GM, Detroit, MI May 2006 – Dec 2006

Role: DataStage / SQL Developer

Client: GM – OWB, Detroit, MI Nov 2005 – Apr 2006

Role: Java / SQL Developer

Client: Local Number Portability, US June 2005 – Oct 2005

Role: Java Developer / SQL Developer

EDUCATION

Master of Computer Applications, JNTU, 2004


