
SQL Server Data

Location:
Cambridge, MA
Posted:
October 23, 2017



Arshitha

Talend Developer

Email: ac2wos@r.postjobfree.com Ph: 860-***-****

Professional Summary:

8+ years of experience in the IT industry, with a wide range of progressive experience in product specification, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.

Extensive experience in the development and maintenance of corporate-wide ETL solutions using SQL, PL/SQL, and Talend 4.x/5.x/6.x on UNIX and Windows platforms.

Strong experience with the Talend Data Integration and Big Data tools, including Data Mapper, Joblets, metadata, and Talend components and jobs.

Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.

Experience with databases such as MySQL and Oracle on AWS RDS.

Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.

Experience with the scheduling tools Autosys, Control-M, and Job Conductor (Talend Admin Console).

Good experience with Big Data, Hadoop, HDFS, Map Reduce and Hadoop Ecosystem (Pig & Hive) technologies.

Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.

Excellent experience with NoSQL databases such as HBase and Cassandra.

Excellent understanding of Hadoop architecture, the Hadoop Distributed File System, and its APIs.

Extensive knowledge of business processes in the Health Care, Manufacturing, Mortgage, Financial, Retail, and Insurance sectors.

Strong skills in SQL and PL/SQL backend programming, creating database objects such as stored procedures, functions, cursors, triggers, and packages.

Experience in AWS S3, EC2, SNS, SQS setup, Lambda, RDS (MySQL) and Redshift cluster configuration.

Experienced in Waterfall, Agile/Scrum Development.

Good knowledge of implementing various data processing techniques using Pig and MapReduce to handle and format data as required.

Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems including SQL Server, Oracle, DB2, and non-relational sources such as XML, flat files, and mainframe files.

Well versed in developing database objects such as packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g.

Hands-on experience running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.

Designed and developed logical and physical models using the Erwin data modeling tool.

Experienced in Code Migration, Version control, scheduling tools, Auditing, shared folders and Data Cleansing in various ETL tools.

Good communication and interpersonal skills; a quick learner with good analytical reasoning and adaptability to new and challenging technological environments.

Strong team spirit, relationship management, and presentation skills.

Expertise in client-server application development using MS SQL Server […], Oracle […], PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, and SAP sources/targets.

Hands-on work with integration processes for the Enterprise Data Warehouse (EDW).

Knowledge of writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and T-SQL, and of Teradata data warehousing using BTEQ, compression techniques, and FastExport, MultiLoad, TPump, and FastLoad scripts.

Technical Skills:

ETL/Middleware Tools

Talend 5.5/5.6/6.2, Informatica Power Center 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling

Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools

Business Objects 6.0, Cognos 8BI/7.0.s, Sybase, OBIEE 11g/10.1.3.x

RDBMS

Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills

SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net, Netezza.

Modeling Tool

Erwin 4.1/5.0, MS Visio.

Tools

TOAD, SQL Plus, SQL*Loader, Quality Assurance, SoapUI, FishEye, Subversion, SharePoint, Ipswitch, Teradata SQL Assistant.

Operating Systems

Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

Certification:

ETL Talend Certified Developer

Professional Experiences:

Voya, Windsor, CT. Oct 2016 – Present

Role: Sr. ETL/Talend Developer

Responsibilities:

Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.

Designed and customized data models for a data warehouse supporting data from multiple sources in real time.

Involved in building the data ingestion architecture and source-to-target mappings to load data into the data warehouse.

Extensively leveraged Talend Big Data components (tHDFSOutput, tPigMap, tHive, tHDFSCon) for data ingestion and data curation from several heterogeneous data sources.

Worked with Data mapping team to understand the source to target mapping rules.

Prepared both high-level and low-level mapping documents.

Analyzed the requirements and framed the business logic and implemented it using Talend.

Involved in ETL design and documentation.

Involved in Talend Data Fabric initiatives addressing data challenges with a modern data architecture.

Developed Talend jobs from the mapping documents and loaded the data into the warehouse.

Involved in end-to-end Testing of Talend jobs.

Analyzed and performed data integration using Talend open integration suite.

Experience loading data into Netezza using the NZLOAD utility.

Experience working with large data warehouses, mapping and extracting data from legacy systems, and Redshift/SQL Server […] UDB databases.

Worked on the design, development and testing of Talend mappings.

Wrote complex SQL queries to take data from various sources and integrated it with Talend.

Worked on context variables and defined contexts for database connections and file paths to ease migration between project environments.
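The context-variable pattern described above (Talend context groups per DEV/TEST/PROD environment; Talend generates Java under the hood) can be sketched in Python. All environment names, hosts, and keys below are hypothetical, not values from any actual project.

```python
# Illustrative per-environment "context" configuration, analogous to Talend
# context groups. Hosts and paths are hypothetical placeholders.
CONTEXTS = {
    "DEV":  {"db_host": "dev-db.internal",  "db_port": 1521, "file_path": "/data/dev/in"},
    "PROD": {"db_host": "prod-db.internal", "db_port": 1521, "file_path": "/data/prod/in"},
}

def load_context(env: str) -> dict:
    """Return the connection settings for the chosen environment."""
    try:
        return CONTEXTS[env]
    except KeyError:
        raise ValueError(f"Unknown environment: {env}")
```

The job logic stays identical across environments; only the selected context changes, which is what makes promotion from development to production a configuration switch rather than a code change.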

Used Talend Data Fabric to increase agility and time-to-market with a single platform.

Involved in loading data into Netezza from legacy systems and flat files using UNIX scripts. Worked on performance tuning of Netezza queries with a proper understanding of joins and distribution.

Created ETL job infrastructure using Talend Open Studio.

Worked on Talend components such as tReplace, tMap, tSort, tFilterColumn, tFilterRow, etc.

Used Database components like tMSSQLInput, tOracleOutput etc.

Worked with various File components like tFileCopy, tFileCompare, tFileExist.

Developed standards for the ETL framework to ease reuse of similar logic across the board.

Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.

Developed mappings to extract data from sources such as DB2 and XML files and load it into the Data Mart.

Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.
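The transformation chain named in this kind of mapping (filter out invalid rows, join against a lookup source, aggregate per key) can be sketched in plain Python; the table shapes and values below are hypothetical illustrations, not data from any actual engagement.

```python
# Minimal sketch of a filter -> lookup join -> aggregate pipeline, analogous
# to Filter, Joiner/Lookup, and Aggregator transformations. Hypothetical data.
sales = [
    {"cust_id": 1, "amount": 100, "status": "OK"},
    {"cust_id": 1, "amount": 50,  "status": "OK"},
    {"cust_id": 2, "amount": 75,  "status": "REJECT"},
]
customers = {1: "Acme", 2: "Globex"}  # lookup source

# Filter: keep only valid rows.
valid = [r for r in sales if r["status"] == "OK"]

# Joiner/Lookup: enrich each row with the customer name.
enriched = [{**r, "cust_name": customers[r["cust_id"]]} for r in valid]

# Aggregator: total amount per customer.
totals = {}
for r in enriched:
    totals[r["cust_name"]] = totals.get(r["cust_name"], 0) + r["amount"]

print(totals)  # {'Acme': 150}
```

In an ETL tool each of these steps is a configured component rather than hand-written code, but the row-by-row semantics are the same.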

Involved in designing logical/physical data models and reverse engineering the entire subject area across the schema.

Scheduled and automated ETL processes with Autosys and TAC.

Scheduled the workflows using Shell script.

Created Talend development standards: a document describing general guidelines for Talend developers, naming conventions for transformations, and development and production environment structures.

Troubleshot databases, Joblets, mappings, sources, and targets to identify bottlenecks and improve performance.

Rigorously involved in data cleansing and data validation to detect and correct corrupted data.

Migrated Talend mappings/jobs/Joblets from the development environment to test and production.

Environment: Talend 6.x, XML files, DB2, Oracle 11g, Netezza 4.2, SQL Server 2008, SQL, MS Excel, MS Access, UNIX shell scripts, Talend Administrator Console, Cassandra, Oracle, Jira, SVN, Quality Center, Agile methodology, TOAD, Autosys.

AT&T, Middletown, NJ Apr 2014– Sep 2016

Role: Sr. Talend / ETL Developer

Responsibilities:

Worked on SSAS, creating data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and deploying Analysis Services projects.

SSAS Cube Analysis using MS-Excel and PowerPivot.

Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modeling using Star and Snowflake schemas.

Developed standards for the ETL framework to ease reuse of similar logic across the board.

Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.

Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes and constraints.

Developed complex Talend ETL jobs to migrate the data from flat files to database.

Implemented custom error handling in Talend jobs and worked on different methods of logging.
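The row-level error-handling pattern this bullet refers to (bad rows routed to a reject flow and logged instead of aborting the whole job, in the spirit of Talend's reject links and tLogCatcher) can be sketched in Python. The transformation rule and field names are hypothetical.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform(row: dict) -> dict:
    # Hypothetical transformation rule: "amount" must exist and be numeric.
    return {"id": row["id"], "amount": float(row["amount"])}

def run_job(rows):
    """Process rows; route failures to a reject list instead of aborting."""
    loaded, rejects = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except (KeyError, ValueError) as exc:
            log.warning("Rejected row %r: %s", row, exc)
            rejects.append(row)
    return loaded, rejects
```

Keeping the rejects as structured data (rather than only log lines) is what makes later data cleansing and re-processing of the failed rows possible.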

Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.

Exposure to ETL methodology supporting the data extraction, transformation, and loading process in a corporate-wide ETL solution using Talend Open Studio for Data Integration 5.6.

Worked on real-time Big Data integration projects leveraging Talend Data Integration components.

Analyzed and performed data integration using Talend open integration suite.

Wrote complex SQL queries to ingest data from various sources and integrated it with Talend.

Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.

Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.

Developed mappings to extract data from sources such as DB2 and XML files and load it into the Data Mart.

Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.

Involved in designing logical/physical data models and reverse engineering the entire subject area across the schema.

Scheduled and automated ETL processes with Autosys and TAC.

Scheduled the workflows using Shell script.

Used the most frequently needed Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).

Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
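One of the flat-file exchange steps mentioned above (parse a delimited file, cast columns, hand rows to a downstream load) can be sketched in Python with the standard `csv` module; the column names and cast rule are hypothetical examples.

```python
import csv
import io

def convert_csv(src) -> list:
    """Parse a delimited file and cast the 'amount' column to float.

    Column names are illustrative; a real job would drive this from a
    schema definition rather than hard-coded keys.
    """
    reader = csv.DictReader(src)
    return [{"id": row["id"], "amount": float(row["amount"])} for row in reader]

# Usage with an in-memory file standing in for a real flat file:
rows = convert_csv(io.StringIO("id,amount\n1,10.5\n2,3\n"))
```

The same read/cast/emit shape applies whether the target is an RDBMS bulk loader or another file format; only the sink changes.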

Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for table data comparisons.

Automated SFTP process by exchanging SSH keys between UNIX servers.

Worked extensively on the Talend Admin Console and scheduled jobs in Job Conductor.

Involved in production deployment activities; created the deployment guide for migrating code to production and prepared production run books.

Created Talend development standards: a document describing general guidelines for Talend developers, naming conventions for transformations, and development and production environment structures.

Environment: Talend 5.x/5.6, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MS Excel, MS Access, UNIX shell scripts, TOAD, Autosys.

MasterCard, St. Louis, Missouri Oct 2013 – Mar 2014

Role: Sr. Talend / ETL Developer

Responsibilities:

Developed mappings for dimensions such as CUSTOMER, PRODUCT, DECILES, CALL_PLAN, CALL_FILE, and TIME OFF TERRITORY.

Incorporated retail products into the Product dimension.

Built the Customer dimension from multiple sources.

Created mappings with slowly changing Dimension (SCD) Type 1 and Type 2.
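The SCD Type 2 logic behind such mappings (expire the current row for a changed key, then insert a new current version) can be sketched in Python; the column names (cust_id, city, eff_date, end_date, current) are illustrative assumptions, not the actual dimension schema.

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply one incoming record to an SCD Type 2 dimension (list of dicts).

    If the natural key exists as a current row with different attributes,
    that row is expired and a new current version is appended; otherwise
    the record is inserted as a new current row. Column names are hypothetical.
    """
    today = today or date.today()
    for row in dimension:
        if row["cust_id"] == incoming["cust_id"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dimension            # no attribute change: do nothing
            row["current"] = False          # expire the old version
            row["end_date"] = today
            break
    dimension.append({**incoming, "eff_date": today,
                      "end_date": None, "current": True})
    return dimension
```

Type 1, by contrast, would overwrite the attribute in place and keep no history; Type 2 trades extra rows for a full change history.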

Created a workflow design document for all process flows from source to staging and from staging to the DDS (Detail Data Storage) database.

Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.

Developed and Implemented PL/SQL and UNIX shell scripts through command task to handle requirements like dropping and building indexes, flat file archiving/dropping process.

Performed tuning by eliminating source and target database bottlenecks, mapping bottlenecks and fine-tuned the pipeline logic and transformation settings.

Extensively used the Informatica debugger to diagnose problems in mappings, and troubleshot existing ETL bugs.

Used Autosys for scheduling and checking the status of Informatica jobs.

Participated in requirement analysis and developed and maintained a dimensionally modeled analytics database for effective reporting.

Successfully deployed the Talend Admin Centre WAR file on a WebLogic 10.3 server and trained the client on Talend Admin Centre functionality.

Extracted, cleansed, loaded, tested, and maintained identified conformed dimensional data using a complex mix of Talend ETL components, Java, and SQL.

Used different components in Talend such as tMap, tMSSqlInput, tMSSqlOutput, tFileDelimited, tFileOutputDelimited, tMSSqlOutputBulkExec, tUnique, tFlowToIterate, tIntervalMatch, tLogCatcher, tFlowMeterCatcher, tFileList, tAggregate, tSort, tMDMInput, tMDMOutput, and tFilterRow.

Scheduled Talend jobs with the Talend Admin Console and set up best practices and a migration strategy.

Did a POC on the ELT components and improved the performance of one job.

Set up the tool, including the client and server model.

Worked on TIS (EE), TOS (4.1), and TIS MPx.

Coordinated with the offshore team to ensure smooth deliverables.

Environment: Talend Integration Services (5.0), Netezza DB, MySQL DB, Unix, Linux, Wiki, SVN, Talend Administrative Console (TAC), Java, InfoSphere DataStage (9.1) Designer/Director, Erwin Data Modeler R7, Informatica Power Center 8.6.1, Oracle 10g, PL/SQL, Toad, Teradata 13, Teradata SQL Assistant, Unix shell scripts, Windows XP, Autosys.

Sonata Software, Hyderabad, India Jan 2012 – Sep 2013

Role: MS SQL Server Developer (SSIS/SSRS/DBA)

Responsibilities:

Involved in the installation and configuration of SQL Server 2005.

Involved in developing logical and physical modeling of the database using Erwin.

Extensively used T-SQL to develop complex stored procedures, triggers, user-defined functions, views, and indexes.

Used SQL Profiler, System Monitor, Query Analyzer for Troubleshooting, Monitoring, Optimization and Tuning of SQL Server.

Created and rebuilt indexes for better query performance.

Migrated DTS 2000 packages to SQL Server Integration Services (SSIS) packages and modified them to take advantage of the new SSIS features.

Created SSIS packages using various transformations to export data from different data sources, transform it, and load it into SQL Server 2005.

Implemented error handling and rollback processes in SSIS packages.

Used SSIS package configurations, changed variables dynamically, and set up SSIS logging.

Provided security for SSIS packages and used package protection levels.

Responsible for Deploying, Scheduling jobs, Alerting and Maintaining SSIS packages.

Created and Configured OLAP Cubes (Star Schema and Snowflake Schema) Using SQL Server Analysis Services.

Migrated and recreated existing dimensions and cubes using a star schema on SQL Server to improve SQL Server Analysis Services (SSAS) efficiency.

Built MDX queries for Analysis Services & Reporting Services.

Responsible for gathering/refining requirements from the customer for developing reports.

Responsible for Full Report cycle including Authoring, Managing, Security and Generation of reports.

Developed Query for generating drill down, drill through, parameterized, cascaded reports in SSRS 2005.

Backed up and restored system and other databases as per requirements, and scheduled those backups.

Managed server security, created new logins and users, and changed user roles.

Worked on database security, Replication and Log shipping activities.

Environment: MS SQL 2000/2005, Windows NT/2000/XP, SSIS, SSRS, SSAS, MS Access, MS Excel, VB.Net, Visio 2007, DTS, Crystal Reports.

Tecfinics, India. Jul 2008 – 2012

Role: SQL Server Developer

Responsibilities:

Monitored log shipping/replication and troubleshot errors.

Created linked servers between SQL Server 2000 and Oracle 9i.

Wrote complex stored procedures, user-defined functions, and triggers using T-SQL.

Created DTS packages for data transfer between the two environments.

Handled security issues related to logins, database users, application roles, and linked servers.

Performance-tuned SQL queries and stored procedures using SQL Profiler and the Index Tuning Advisor.

Administered all SQL Server database objects, logins, users, and permissions on each registered server.

Resolved deadlock issues with databases/servers in real time.

Wrote scripts to generate daily backup reports, verify completion of all routine backups, monitor log space utilization, etc.

Backed up and restored data/databases using a third-party tool (SQL LiteSpeed).

Involved in Design and Development of Disaster Recovery Plan.

Created reports using Crystal Reports.

Environment: SQL Server 2000 Enterprise Edition, Windows 2000/NT, UNIX, Excel, SQL Profiler, Replication, DTS, MS Access, T-SQL, Crystal Reports.
