
Data Developer

Location:
Fremont, CA
Salary:
$60/hr
Posted:
May 15, 2020


Resume:

Teja

Email: adc8qh@r.postjobfree.com PH: 928-***-****

Sr Talend Developer

Professional Summary:

7+ years of experience in the IT industry, with a wide range of progressive experience in product specification, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.

Extensive experience in the development and maintenance of corporate-wide ETL solutions using SQL, PL/SQL, and Talend 4.x/5.x/6.x/7.x on UNIX and Windows platforms.

Strong experience with Talend Data Integration and Big Data tools, including Data Mapper, Joblets, metadata, and Talend components and jobs.

Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.

Experience with databases such as MySQL and Oracle on AWS RDS.

Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.

Experience with scheduling tools: AutoSys, Control-M, and Job Conductor (Talend Administration Center).

Experienced in creating mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, etc.

Extensive experience with NoSQL databases such as HBase and Cassandra.

Well versed with Talend Big Data, Hadoop, and Hive; used Talend Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, tSqoopExport, tPigLoad, tPigFilterRow, tPigFilterColumn, and tPigStoreResult.

Created Talend ETL jobs to receive attachment files from POP email using tPOP, tFileList, and tFileInputMail, then loaded the data from the attachments into a database and archived the files.

Experienced in Waterfall and Agile/Scrum development methodologies.

Experience with data modeling techniques: dimensional/star schema and snowflake modeling, and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
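As an illustration of the SCD Type 2 pattern referenced above, the following is a hypothetical Python sketch (not project code) of expiring the current dimension row and inserting a new version when a tracked attribute changes:

```python
from datetime import date

def scd2_upsert(dimension, key, new_attrs, today=None):
    """Apply an SCD Type 2 change: expire the current row for `key`
    and append a new current row when tracked attributes differ."""
    today = today or date.today()
    current = next((r for r in dimension
                    if r["key"] == key and r["current"]), None)
    if current and all(current[k] == v for k, v in new_attrs.items()):
        return dimension  # no change detected; keep history as-is
    if current:
        current["current"] = False
        current["end_date"] = today      # close out the old version
    dimension.append({"key": key, **new_attrs,
                      "start_date": today, "end_date": None,
                      "current": True})
    return dimension

# Usage: a customer moves city; the old row is expired, a new one added.
dim = [{"key": 1, "city": "Phoenix", "start_date": date(2019, 1, 1),
        "end_date": None, "current": True}]
scd2_upsert(dim, 1, {"city": "Fremont"}, today=date(2020, 5, 15))
```

In a Talend job the same logic is typically expressed with a lookup in tMap plus separate insert/update outputs; the sketch only shows the row-versioning rule itself.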

Hands-on proficiency in one or more scripting languages (e.g., Java, Python, Shell scripting)

Experience developing data ingestion jobs in tools such as Talend to acquire, stage, and aggregate data in technologies like Hive, Spark, HDFS, and Greenplum.

Designed and developed logical and physical data models using the ERwin data modeling tool.

Created mappings using Lookup, Aggregator, Joiner, Expression, Filter, Router, Update Strategy, and Normalizer transformations. Developed reusable transformations and mapplets.

Good communication and interpersonal skills; a quick learner with strong analytical reasoning, adaptable to new and challenging technological environments.

Experienced in Talend service-oriented web services using SOAP, REST, and XML/HTTP technologies with Talend ESB components.

Hands-on experience with integration processes for the Enterprise Data Warehouse (EDW).

Technical Skills:

ETL/Middleware Tools

Talend 5.5/5.6/6.2, Informatica Power Center 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling

Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools

Business Objects 6.0, BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS

Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills

SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net, Netezza.

Modeling Tool

Erwin 4.1/5.0, MS Visio.

Tools

TOAD, SQL*Plus, SQL*Loader, Quality Assurance, SoapUI, Fisheye, Subversion, SharePoint, IP switch user, Teradata SQL Assistant.

Operating Systems

Windows 10/8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

PROFESSIONAL EXPERIENCE

Sr Talend Developer

American Express, Phoenix, AZ April 2019 to Present

Responsibilities:

Worked in the Data Integration Team to perform data and application integration, with the goal of moving high-volume data effectively, efficiently, and with high performance to support business-critical projects.

Deployed and scheduled Talend jobs in the Administration Center and monitored their execution.

Created separate branches within the Talend repository for development, production, and deployment.

Excellent knowledge of the Talend Administration Center, Talend installation, and the use of context and globalMap variables in Talend.

Reviewed requirements to help build valid and appropriate data quality (DQ) rules, and implemented the DQ rules using Talend DI jobs.

Created cross-platform Talend DI jobs to read data from multiple sources such as Hive, HANA, Teradata, DB2, Oracle, and ActiveMQ.

Created Talend jobs for data comparison between tables across different databases, identifying discrepancies and reporting them to the respective teams.
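The kind of row-level comparison described above can be sketched in Python; this is a simplified illustration using in-memory rows rather than live database connections:

```python
def compare_tables(source_rows, target_rows, key_field):
    """Compare two tables keyed on key_field and report discrepancies:
    rows missing on either side, and rows whose values differ."""
    src = {r[key_field]: r for r in source_rows}
    tgt = {r[key_field]: r for r in target_rows}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "missing_in_source": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Usage: id 2 differs between the systems, id 3 exists only in the source.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
report = compare_tables(source, target, "id")
```

A Talend equivalent would join the two inputs in tMap on the key and route matched-but-different and unmatched rows to separate outputs.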

Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.

Created Generic and Repository schemas.

Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.

Implemented complex business rules by creating reusable transformations and robust mappings using Talend components such as tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.

Created standards and best practices for Talend ETL components and jobs.

Extracted, transformed, and loaded data from various file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.

Worked with HiveQL to retrieve data from the Hive database.

Responsible for developing a data pipeline on Amazon AWS to extract data from weblogs and store it in HDFS.

Executed Hive queries on Parquet tables stored in Hive to perform data analysis to meet the business requirements.

Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.

Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.

Configured Talend Administration Center (TAC) for scheduling and deployment.

Created and scheduled execution plans to build job flows.

Worked with production support in finalizing scheduling of workflows and database scripts using AutoSys.

Environment: Talend 6.2.1/6.0.1, Talend Open Studio Big Data/DQ/DI, Talend Administration Console, Oracle 11g, Teradata V14.0, Hive, HANA, PL/SQL, XML, Java, ERwin 7, UNIX Shell Scripting.

Talend Developer

BHP Billiton, Houston, TX Jan 2017 to Mar 2019

Responsibilities:

Worked on SSAS, creating data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and deploying Analysis Services projects.

Performed SSAS cube analysis using MS Excel and PowerPivot.

Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modeling using star and snowflake schemas.

Developed standards for ETL framework for the ease of reusing similar logic across the board.

Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.

Responsible for creating fact, lookup, dimension, and staging tables, and other database objects such as views, stored procedures, functions, indexes, and constraints.

Developed complex Talend ETL jobs to migrate the data from flat files to database.

Implemented custom error handling in Talend jobs and worked on different methods of logging.

Followed organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute them.

Worked on real-time Big Data integration projects leveraging Talend Data Integration components.

Analyzed and performed data integration using Talend open integration suite.

Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.

Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.

Worked on context variables and defined contexts for database connections and file paths, enabling easy migration between different environments in a project.
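The per-environment context pattern described above (Talend context groups such as Dev/Prod) can be mimicked in plain Python; this is a hypothetical sketch with made-up connection values:

```python
# Hypothetical per-environment contexts, analogous to Talend context groups.
CONTEXTS = {
    "dev":  {"db_host": "dev-db.internal",  "file_path": "/data/dev"},
    "prod": {"db_host": "prod-db.internal", "file_path": "/data/prod"},
}

def load_context(env, overrides=None):
    """Resolve a context by environment name, allowing run-time overrides
    (similar in spirit to --context_param on the Talend command line)."""
    ctx = dict(CONTEXTS[env])   # copy so overrides don't mutate the defaults
    ctx.update(overrides or {})
    return ctx

# Usage: pick the dev context but override the file path for a test run.
ctx = load_context("dev", {"file_path": "/tmp/test"})
```

The point of the pattern is that jobs reference only context names, never literal hosts or paths, so the same job promotes unchanged from development to production.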

Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.

Created complex mappings by using different transformations like Filter, Router, lookups, Stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.

Involved in designing logical/physical data models and reverse engineering for the entire subject area across the schema.

Created Data Catalog using Informatica Metadata Manager, PowerCenter.

Scheduled and automated ETL processes using AutoSys and TAC.

Scheduled the workflows using Shell script.

Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).

Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.

Developed stored procedure to automate the testing process to ease QA efforts and reduced the test timelines for data comparison on tables.

Automated SFTP process by exchanging SSH keys between UNIX servers.

Involved in production deployment activities, created the deployment guide for migrating code to production, and prepared production run books.

Environment: Talend 5.x/5.6, XML files, Oracle 11g, AWS EMR, S3, SQL, MS Excel, MS Access, UNIX Shell Scripts, TOAD, AutoSys.


Talend / ETL Developer

Schlumberger, Houston, TX Oct 2015 to Dec 2016

Responsibilities:

Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.

Experienced in fixing errors using Talend's debug mode.

Created complex mappings using tHashOutput, tMap, tHashInput, tDenormalize, tUniqRow, tPivotToColumnsDelimited, tNormalize, etc.

Scheduled Talend jobs with the Talend Administration Console, and set up best practices and a migration strategy.

Used components such as tJoin, tMap, tFilterRow, tAggregateRow, tSortRow, and target and source connections.

Mapped source files and generated target files in multiple formats such as XML, Excel, and CSV.

Transformed data and reports retrieved from various sources and generated derived fields.

Reviewed the design and requirements documents with architects and business analysts to finalize the design.

Created WSDL data services using Talend ESB.

Created Rest Services using tRESTRequest and tRESTResponse components.

Used tESBConsumer component to call a method from invoked Web Service.

Implemented several Java functionalities using the tJava and tJavaFlex components.

Developed shell scripts and PL/SQL procedures for creating and dropping tables and indexes for performance.

Attended technical review meetings.

Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.

Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.

Identified data sources, developed and maintained data architectures, constructed the data catalog and data decomposition diagrams, provided data flow diagrams, and documented the process.

Responsible for monitoring all scheduled jobs: running, completed, and failed. Involved in debugging failed jobs using the debugger to validate the jobs and gather troubleshooting information about data and error conditions.

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, MS Excel, MS Access, TOAD, SQL, UNIX.

SQL/BI Developer

Global Logic Technologies Pvt. Ltd - Hyderabad, Telangana May 2013 to June 2015

Responsibilities:

Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica PowerCenter, and tuned mappings to improve performance.

Created database objects like views, indexes, user defined functions, triggers and stored procedures.

Involved in ETL process from development to testing and production environments.

Extracted data from various sources such as flat files and Oracle, and loaded it into target systems using Informatica 7.x.

Developed mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.

Developed PL/SQL triggers and master tables for automatic creation of primary keys.
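The sequence-plus-trigger pattern mentioned above can be sketched in Python as an illustrative analogue (not the original PL/SQL): a counter stands in for the Oracle sequence, and the insert method plays the role of a BEFORE INSERT trigger filling the key.

```python
import itertools

class KeyedTable:
    """Mimics a table whose primary key is auto-assigned on insert,
    the way a BEFORE INSERT trigger reads from an Oracle sequence."""
    def __init__(self):
        self._seq = itertools.count(start=1)   # stands in for a sequence
        self.rows = []

    def insert(self, **values):
        row = {"id": next(self._seq), **values}  # trigger-style key fill
        self.rows.append(row)
        return row

# Usage: callers never supply keys; they are generated in order.
t = KeyedTable()
t.insert(name="alpha")
t.insert(name="beta")
```

The design point is the same in both languages: key generation lives with the table, so no caller can insert a row with a missing or duplicate primary key.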

Used Informatica Power Center Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.

Tuned mappings and SQL queries for better performance and efficiency.

Automated existing ETL operations using Autosys.

Created and ran shell scripts in a UNIX environment.

Created and ran workflows using Workflow Manager in Informatica; maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.

Created tables and partitions in an Oracle database.

Environment: Informatica Power Center 8.x, Oracle, SQL developer, MS Access, PL/SQL, UNIX Shell Scripting, SQL Server 2005, Windows XP.


