
Software Development Life Cycle

Location:
Hartford, CT
Posted:
February 27, 2024


SRIKANTH

Email: ad3yn3@r.postjobfree.com

Mobile: +1-774-***-****

Professional Summary

11 years of IT experience in development, design, maintenance, enhancement, analysis, and testing as an ETL developer and administrator across Talend DI, Talend Cloud, Talend administration, Snowflake, OBIEE Analytics (developer and admin), and Power BI.

Experience in analysis, design, development, testing, and implementation across all phases of the software development life cycle (SDLC).

Hands-on experience in ETL design and development.

Experience in planning, tracking, and managing Agile software development projects using tools such as JIRA.

Experience with data warehousing concepts, including data modeling and star/snowflake schema design.

Experience creating new labels and execution plans in TAC and scheduling daily job runs using triggers per business requirements.

Experienced with different schedule types in TAC and TMC, such as CRON triggers, simple triggers, and file triggers.
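TAC and TMC CRON triggers use standard cron expressions. A minimal Python sketch of how a single cron field is matched against a value (simplified: it ignores day/month names and combined range-step syntax):

```python
def cron_field_matches(field: str, value: int) -> bool:
    """Return True if one cron field (e.g. '*', '5', '1-10', '*/15', '0,30')
    matches the given value."""
    for part in field.split(","):
        if part == "*":
            return True
        if part.startswith("*/"):
            # step syntax: fires when the value is a multiple of the step
            if value % int(part[2:]) == 0:
                return True
        elif "-" in part:
            lo, hi = map(int, part.split("-"))
            if lo <= value <= hi:
                return True
        elif int(part) == value:
            return True
    return False

def cron_matches(expr: str, minute: int, hour: int) -> bool:
    """Check only the minute and hour fields of a 5-field cron expression."""
    fields = expr.split()
    return cron_field_matches(fields[0], minute) and cron_field_matches(fields[1], hour)
```

For example, a trigger defined as `0 6 * * *` fires at 06:00 and at no other minute.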

Experienced in implementing Talend jobs using various components.

Worked with context variables and context groups to define source and target database connections, SMTP connections, and user credentials, making jobs easy to migrate across environments in a project.
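The context-group pattern can be sketched in plain Python — one set of keys with one value per environment, selected at run time. The host and user names below are hypothetical placeholders, not Talend's actual API:

```python
# Hypothetical context groups: the same keys exist in every environment,
# only the values differ, so a job promoted from DEV to PROD needs no edits.
CONTEXTS = {
    "DEV":  {"db_host": "dev-db.internal",  "db_user": "etl_dev",  "smtp_host": "smtp-dev.internal"},
    "QA":   {"db_host": "qa-db.internal",   "db_user": "etl_qa",   "smtp_host": "smtp-qa.internal"},
    "PROD": {"db_host": "prod-db.internal", "db_user": "etl_prod", "smtp_host": "smtp.internal"},
}

def load_context(env: str) -> dict:
    """Return the connection context for the requested environment."""
    try:
        return CONTEXTS[env]
    except KeyError:
        raise ValueError(f"Unknown environment: {env}") from None
```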

Experience managing and configuring cloud services such as AWS, Azure, or Google Cloud.

Strong experience in performance tuning of ETL processes using Talend.

Performed a Talend upgrade from 6.3 to 7.2 in QA and Production.

Experienced in publishing Talend jobs to the Nexus repository from the Studio.

Experienced in code deployments across Dev/QA/Prod for various applications.

Experienced in inbound and outbound data ingestion using SFTP.

Installation, configuration, and upgrades of Talend applications on all existing and supported versions.

Security configuration for users, projects, and roles (Git, TAC, working with security groups).

Performed license updates in Talend, TAC, and the CLI.

Configured new service accounts using LDAP, and set up Git and projects in TAC.

Experience in administering Linux servers, managing user accounts, system updates, monitoring performance and troubleshooting issues.

Experience managing and maintaining Windows servers, applying updates/patches, handling Active Directory, resolving issues.

Excellent troubleshooting and problem-solving skills; able to consistently identify critical elements, variables, and alternatives to develop solutions.

Experienced in Talend Platform Management, Platform Support, ITSM tools and ITIL.

Experience in scripting languages such as Bash, Python, or PowerShell, with the ability to automate routine tasks and troubleshoot issues.

Monitored the Talend JobServer and TAC server, checking whether each server was active or inactive.
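A liveness check of this kind can be sketched as a simple TCP probe in Python. The host and port below are placeholders; the actual JobServer and TAC ports depend on the installation:

```python
import socket

def is_port_active(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds — a basic
    liveness probe for a server process such as a JobServer or TAC instance."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # connection refused, timed out, or host unreachable
        return False
```

In a monitoring script this would typically run on a schedule and alert when the result flips from active to inactive.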

Experienced with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON data into Snowflake tables.
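As an illustration of the reshaping involved, nested JSON can be flattened into dotted column paths before loading. In practice Snowflake can also ingest the JSON directly as a VARIANT column and flatten it in SQL; this Python sketch only shows the idea:

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts/lists into dotted column names,
    e.g. {"order": {"id": 42}} -> {"order.id": 42}."""
    out = {}
    if isinstance(obj, dict):
        for k, v in obj.items():
            out.update(flatten(v, f"{prefix}{k}."))
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            out.update(flatten(v, f"{prefix}{i}."))
    else:
        out[prefix[:-1]] = obj  # strip the trailing dot
    return out

record = json.loads('{"order": {"id": 42, "items": [{"sku": "A1"}, {"sku": "B2"}]}}')
flat = flatten(record)
```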

Experienced using Snowflake cloning and Time Travel.

Experienced with the roles and privileges required to access different database objects.

Defined virtual warehouse sizing in Snowflake for different types of workloads.
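Sizing decisions like these can be captured as a simple rule of thumb. The workload names and thresholds below are purely illustrative — real sizing is tuned empirically per query profile and concurrency:

```python
def pick_warehouse_size(workload: str, gb_scanned: float) -> str:
    """Illustrative heuristic mapping a workload type and data volume
    to a Snowflake warehouse size. Thresholds are made-up examples."""
    if workload == "loading":
        return "SMALL" if gb_scanned < 50 else "MEDIUM"
    if workload == "transform":
        return "MEDIUM" if gb_scanned < 200 else "LARGE"
    if workload == "bi":
        # dashboard queries tend to be small and selective
        return "XSMALL" if gb_scanned < 10 else "SMALL"
    raise ValueError(f"unknown workload: {workload}")
```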

Experienced in creating pipelines, mapping data flows, linked services, datasets, and triggers in Azure Data Factory.

Experienced in creating Azure Data Factory pipelines for loading data from different sources and writing to ADLS.

Experienced in creating triggers to run pipelines on a schedule.

Experienced in creating notebooks in Azure Databricks using PySpark code for data extraction.

Experienced in creating pipelines, Linked services, datasets, and triggers.

Good experience developing applications using Agile methodology.

Experience designing and implementing datasets for report solutions using Reporting Services, Power BI, or other data visualization tools.

Designed and modeled datasets with Power BI Desktop based on the measures and dimensions requested by customers and on dashboard needs.

Created paginated and other reports using Power BI Report Builder for on-premises and cloud data sources.

Experience publishing Power BI reports and dashboards to the Power BI service.

Experience creating row-level security on datasets and reports with Power BI Desktop and integrating it with the Power BI service portal.
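Power BI defines row-level security with DAX filter expressions attached to roles. Conceptually, each role is a predicate over rows, which can be sketched in Python (the roles and data here are invented for illustration — Power BI itself expresses these filters in DAX, not Python):

```python
# Conceptual sketch of row-level security: a user sees only the rows
# that their role's predicate accepts.
SALES = [
    {"region": "East", "amount": 100},
    {"region": "West", "amount": 250},
    {"region": "East", "amount": 75},
]

ROLE_FILTERS = {
    "east_manager": lambda row: row["region"] == "East",
    "all_regions":  lambda row: True,
}

def visible_rows(rows, role):
    """Apply the role's row filter, mimicking an RLS-filtered dataset."""
    return [r for r in rows if ROLE_FILTERS[role](r)]
```

In Power BI the equivalent filter on the `east_manager` role would be a DAX expression on the table, and the role membership is managed in the Power BI service.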

Extensively used the Informatica ETL tool to extract data stored in MS SQL 2000, Excel, and flat files, finally loading it into a single data warehouse.

Good experience in Business Model development with Dimensional Hierarchies, Aggregation Navigation, Time Series functions and Level based Measures.

Experience in OBIEE 11g installation, configuring the repository, creating and managing dashboards and reports in Answers, and building the Physical, Business Model, and Presentation layers.

Proficient in implementing security models such as LDAP, SSO, and SSL in OBIEE 11g.

Implemented Object level Security and Data level Security as per the user’s requests.

Education:

Master of Computer Applications, JNTU, Hyderabad, 2010.

Technical Skills

Data Warehousing & ETL Tools

Talend – DI, Talend Big Data, Talend with Cloud, Talend Administrator Center (TAC), TMC, Snowflake, ADF, Informatica

Programming Languages

Java, PL/SQL, SQL, Python, C, C++

Operating System

DOS, Windows 98/2000/NT, UNIX, Linux

Repositories

GitHub, SVN

CI/CD

Jenkins, Bitbucket

Data Modeling

Erwin, Toad

Languages

SQL, T-SQL, ANSI SQL, MySQL, PL/SQL, UNIX Shell Script, Visual Basic, SQL*Plus 3.3/8.0, Java, Python

Databases

Oracle 11i, SQL, PostgreSQL, SQL Server, MySQL, Hive, Snowflake, AWS Redshift

Reporting Tools

OBIEE, PowerBI, Tableau

Scheduling Tool

TAC, TMC, Crontab, and Control-M

Cloud Technologies

Azure Blob Storage, Azure Data Bricks, Hive, AWS S3, GCP

Work Experience:

GE (Senior ETL/Talend Developer) Aug 2021 – July 2023

Description:

GE Power, part of GE Vernova, is a world energy leader that provides technology, solutions, and services across the entire energy value chain, from the point of generation to consumption. Powering more than a third of the world, it serves customers in more than 150 countries. The goal was to load data into a cloud data warehouse; for this we used Talend and Snowflake ETL to extract the data and load it into the cloud data warehouse.

Environment: Talend DI and Big Data Integration, Talend DI/ESB (7.2), SQL Server, Snowflake, MySQL, Hive, Oracle, TAC scheduling tool, Big Data, Java, Python, PostgreSQL, ADF, Databricks

Location: Hyderabad, India

Responsibilities:

Led development activities for a functional area, module, or project; provided and documented technical designs and specifications meeting business and functional requirements.

Developed ETL mappings for various sources (.txt, .csv, .xml) and loaded the data from these sources into relational tables with Talend.

Designed, developed, debugged, tested, and promoted Java/ETL code into various environments from DEV through to PROD.

Implemented Talend jobs using components such as tMap, tDie, tConvertType, tLogCatcher, tRowGenerator, tFileList, tJava, tLogRow, tOracleInput, tOracleOutput, and tSendMail.

Experience creating new labels and execution plans in TAC and scheduling daily job runs using triggers per business requirements.

Experience in analysis, design, development, testing, and implementation across all phases of the software development life cycle (SDLC).

Experience in planning, tracking, and managing Agile software development projects using tools such as JIRA.

Hands-on experience in ETL design and development

Strong experience in performance tuning of ETL processes using Talend.

Experience in scripting languages such as Bash, Python, or PowerShell and ability to automate routine tasks and troubleshoot.

Experience setting up ETL/Talend jobs, supporting other teams in ETL-related setups, and supporting other Informatica/Talend developers in development and configuration.

Experience with data warehousing concepts, including data modeling and star/snowflake schema design.

Experience in managing and optimizing Talend Data Integration environments, including job design, routine maintenance, and troubleshooting.

Experienced with different schedule types in TAC and TMC, such as CRON triggers, simple triggers, and file triggers.

Performed a Talend upgrade from 6.3 to 7.2 in QA and Production.

Experienced in publishing Talend jobs to the Nexus repository from the Studio.

Experienced in code deployments across Dev/QA/Prod for various applications.

Security configuration for users, projects, and roles (Git, TAC, working with security groups).

Performed license updates in Talend, TAC, and the CLI.

Configured new service accounts using LDAP, and set up Git and projects in TAC.

Monitored the Talend JobServer and TAC server, checking whether each server was active or inactive.

Experience managing and configuring cloud services such as AWS, Azure, or Google Cloud.

Experienced with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON data into Snowflake tables.

Experienced using Snowflake cloning and Time Travel.

Experienced with the roles and privileges required to access different database objects.

Experienced in creating pipelines, mapping data flows, linked services, datasets, and triggers in Azure Data Factory.

Experienced in creating Azure Data Factory pipelines for loading data from different sources and writing to ADLS.

Experienced in creating triggers to run pipelines on a schedule.

Experienced in creating notebooks in Azure Databricks using PySpark code for data extraction.

Excellent interpersonal and communication skills; able to interact with various stakeholders and translate technical terms into easy-to-understand concepts.

Excellent troubleshooting and problem-solving skills; able to consistently identify critical elements, variables, and alternatives to develop solutions.

Troubleshot and resolved issues related to connectivity and performance bottlenecks.

Baker Hughes (Talend Developer) Oct 2019 – July 2021

Description:

Baker Hughes Company is an American international industrial services company and one of the world's largest oilfield services companies. The company provides the oil and gas industry with products and services for oil drilling, formation evaluation, completion, production, and reservoir consulting. The company was originally known as Baker Hughes Incorporated until 2017, when it merged with GE Oil and Gas to become Baker Hughes, a GE Company (BHGE). The goal was to migrate this data into a big data warehouse; for this we used Talend ETL to extract the data and load it into the warehouse.

Environment: Talend DI and Big Data Integration, Talend DI/ESB (7.2), SQL Server, Snowflake, MySQL, Hive, TAC scheduling tool, Big Data, Java, Python, PostgreSQL

Location: Bangalore, India

Responsibilities:

Implemented Talend jobs using various components per business requirements.

Experience in ETL design and development.

Experience with data warehousing concepts, including data modeling and star/snowflake schema design.

Experience in managing and optimizing Talend Data Integration environments including job design, routine maintenance, and troubleshooting.

Experience in Planning, tracking, and managing of Agile software development projects using tools such as JIRA.

Strong experience in performance tuning of ETL processes using Talend.

Experienced in publishing Talend jobs to the Nexus repository from the Studio.

Experienced in publishing Talend jobs from Git to the Nexus repository through TAC.

Experienced in creating new labels and execution plans in TAC and scheduling daily job runs using triggers per business requirements.

Experienced with schedule types such as CRON triggers in TAC and TMC.

Monitoring Daily and weekly Jobs and Execution Plans as per the schedules in TAC.

Experienced in code deployments across Dev/QA/Prod for various applications.

Experienced in creating Azure Data Factory pipelines for loading data from different sources and writing to ADLS.

Experienced in creating triggers to run pipelines on a schedule.

Experienced in creating notebooks in Azure Databricks using PySpark code for data extraction.

Replicated OLTP data into Snowflake using ADF pipelines and transformed the data inside Snowflake for OLAP.

Created ADF pipelines to extract the latest files from Azure Data Lake Storage and generated new files from the extraction list.

Troubleshot and resolved issues related to connectivity and performance bottlenecks.

TE (ETL/Talend Admin) Nov 2018 – Oct 2019

Description:

TE Connectivity Ltd, formerly Tyco Electronics Ltd, designs and manufactures connectivity and sensor solutions for a variety of industries, including automotive, industrial equipment, data communication systems, aerospace, defense, medical, oil and gas, consumer electronics, energy, and subsea communications. The goal was to migrate this data into a data warehouse; for this we used Talend ETL to extract the data and load it into the warehouse.

Environment: Talend with Big Data Integration, TAC, Oracle, Linux, UNIX, GIT, SVN, Nexus, LDAP, CLI, Putty, WinSCP

Location: Bangalore, India

Responsibilities:

Security configuration for users, projects, and roles (Git, TAC, working with security groups).

Installation, configuration, and upgrades of Talend applications on all existing and supported versions.

Implemented a generic cleanup script to resolve out-of-disk-space issues on the Talend JobServer and TAC server.
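A cleanup script of this kind typically deletes files past an age threshold. A minimal Python sketch — the directory, file pattern, and 14-day threshold are illustrative, not the production values:

```python
import time
from pathlib import Path

def purge_old_files(directory: str, max_age_days: int = 14, pattern: str = "*.log") -> list:
    """Delete files matching `pattern` that are older than `max_age_days`;
    return the paths that were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in Path(directory).glob(pattern):
        # compare the file's modification time against the age cutoff
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(str(path))
    return removed
```

In practice a script like this would run from cron against the JobServer log and archive directories.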

Performed Tomcat upgrades in all environments (Dev/QA/Prod).

Created users, projects, and roles in the Talend Administration Center.

Provided project-level access to users.

Configured new service accounts using LDAP, and set up Git and projects in TAC.

Performed license updates in Talend, TAC, and the CLI.

Monitored the Talend JobServer and TAC server, checking whether each server was active or inactive.

Experienced in Talend platform management, production support, ITSM tools, and ITIL.

Experience in Talend platform management & administration.

Experience managing and maintaining Windows servers, applying updates/patches, handling Active Directory, resolving issues.

Experience in administering Linux servers, managing user accounts, system updates, monitoring performance and troubleshooting issues.

DELL-EMC (Talend Developer) Oct 2016 – Oct 2018

Description:

Dell, headquartered in Round Rock, Texas, is one of our biggest clients; it sells laptops, servers, data storage devices, network switches, software, and computer peripherals, and provides services under warranty. In the Delta project we mainly integrated data from different source systems, applied transformations, and loaded the data into an Oracle data warehouse per business requirements. The objective of the project was to design and build a data warehouse to support business-oriented decisions and grow the business.

Environment: Talend DI, TAC, GIT, SVN, Nexus, PostgreSQL, OBIEE Administration Tool, Oracle Enterprise Manager, Administration Console, WebLogic, Putty, WinSCP, Unix, Linux, SQL, OBIEE, Oracle

Location: Bangalore, India

Responsibilities:

Implemented Talend jobs using various components as per the business logic.

Worked with context variables and context groups to define source and target database connections across different environments in a project.

Experienced in publishing Talend jobs to the Nexus repository from the Studio.

Experienced in publishing Talend jobs from Git to the Nexus repository through TAC.

Experienced in creating new labels and execution plans in TAC and scheduling daily job runs using triggers per business requirements.

Experienced with different schedule types in TAC, such as CRON triggers, simple triggers, and file triggers.

Monitoring Daily and weekly Jobs and Execution Plans as per the schedules in TAC.

Experienced in code deployments across Dev/QA/Prod for various applications.

Experience in performance tuning of ETL processes using Talend.

Experienced in Unit testing for Data validation to maintain Quality of work.

Modified the physical, business, and presentation layers of the metadata repository using the OBIEE Administration Tool.

Implemented prompts to provide dynamic filter conditions to end users.

Created reports with table, chart, and pivot views and published them to dashboards using Answers.

Implemented security models such as LDAP, SSO, and SSL in OBIEE 11g.

Used session variables, application roles, and application policies in the security implementation.

Amgen (BI Analytics Developer) Nov 2015 – Oct 2016

Environment: OBIEE Administration Tool, Oracle Enterprise Manager, Administration Console, Putty, WinSCP, UNIX, Linux, SQL, OBIEE, Oracle

Location: India

Description:

Amgen is the world’s largest bio-pharmaceuticals manufacturing services company. EDW is an application that integrates data sources, delivers reporting, and provides analysis capability to support various business units and Finance functional areas: Purchase Order, Operations, and Human Resources/HCM. EDW provides past and current HR & Finance information to business users, with Oracle as the primary source and target. Extensively involved in providing end-to-end delivery for the reporting part of the project.

Responsibilities:

Designed the RPD in the Administration Tool, building the Physical, Business Model and Mapping, and Presentation layers of the repository using star schemas.

Created Dimension Hierarchies (Ragged, Skipped, and Parent-child) and Level-Based Measures.

Used Session Variables, Application Roles, and Application Policies in Security implementation.

Implemented the Time Series Wizard and Rename Wizard, and modeled time series data to display year-ago information.

Designed and Customized OBI Answer Table Views, Chart, Pivot, Narrative, Create and Save Filters in Answers.

Responsible for Support, Bug fixing & Weekly Releases.

Responsible for OBIEE deployments (UAT/GO-LIVE/Production).

Experienced in manually migrating code (RPD changes and web catalog objects) from one environment to another.

Implemented cache Management (Cache Purging).

ITC (Business Intelligence Developer) July 2014 – Oct 2015

Environment: OBIEE Administration Tool, Administration console, Oracle Enterprise Manager, SQL, OBIEE, Oracle

Location: Kolkata, India

Description:

ITC is one of India's foremost private sector companies, with a diversified presence in cigarettes, packaged foods, personal care, stationery, safety matches, confectionery, many other FMCG products, and hotels. OBIEE was implemented to give field staff the ability to make better, more informed decisions. Reports such as market & outlet coverage, Retail Audit, Lakshya, Loyalty, Promo Effectiveness, and Turnover Class by Pop Group increase the efficiency of the employees. Reports are drillable with graphs & charts.

Responsibilities:

Designed and implemented the OBIEE repository (Physical, Business, and Presentation layers) based on business requirements.

Created the schema in Physical and BMM Layer.

Implemented Dimensional Hierarchies & Level based measures in BMM layer based on requirements.

Created the measures according to business requirement.

Experienced in creating the Repository variables, Session variables & initialization blocks.

Worked on Security both Data level Security, Object Level Security and Cache management.

Experienced in performance tuning of reports by identifying the indexes required on the backend tables and by using the data available in the cache.

Configured Agents using Delivers, Job Manager to deliver Analytics content through mail, phone to business users based on schedule.

Developed different reports in Answers, such as chart, pivot, narrative, and tabular views, using global and local filters.

Generated the reports through answers and published them in the Dashboards.

Experienced in Unit testing of the reports and Data validation.

GE (BI Developer) Dec 2012 – Mar 2014

Environment: OBIEE Administration Tool, Administration console, Oracle Enterprise Manager, SQL, OBIEE, Oracle

Location: Noida, India

Description:

GE Healthcare is an American multinational company incorporated in New York and headquartered in Chicago, Illinois. As of 2017, the company is a manufacturer and distributor of diagnostic imaging agents and radiopharmaceuticals used in medical imaging procedures. GE Healthcare operates as a subsidiary of General Electric. The work involved developing different types of report views using BI Answers; I was extensively involved in providing end-to-end delivery for the reporting part of the project.

Responsibilities:

Modified the physical, business, and presentation layers of the metadata repository using the OBIEE Administration Tool.

Implemented prompts to provide dynamic filter condition to end users.

Configured Repository variables to generate dynamic reports.

Create the reports with Table, Chart, Pivot views and publish them in dashboard using Answers.

Used Session variables, Application Roles, Application policies in Security implementation.

Experienced in design, analysis, development, testing and deployment of reports in OBIEE.

Experienced in performance tuning of reports by identifying the indexes required on the backend tables and by using the data available in the cache.

Implemented all the Ad-Hoc reports as a part of requirement from the client side.

Created groups and users and have implemented the access strategy to various reports and dashboards.

Experienced in Unit testing of the reports and Data validations.


