Manager Data

Location:
Ashburn, VA
Posted:
March 30, 2020

VENKATA RAGHAVENDRA VAISHNAV VINJAMURI

***** ********* ***** **** *****: adcjgs@r.postjobfree.com

Ashburn, Virginia 20148 Phone: 571-***-****

Summary:

Seven years of work experience in Data Warehousing and data integration/conversion processes, including software architecture, design, and development.

Data Scientist with 2+ years of experience executing data-driven solutions to increase efficiency, accuracy, and utility of internal data processing.

Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.

Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts, using Informatica PowerCenter, Pentaho, and DataStage as ETL tools on Oracle, Teradata, and SQL Server.

Developed multiple business-critical real-time sales/order tracking dashboards for executives using Grafana, Kibana, and Tableau.

Developed monitoring and notification tools using Python.

Strong understanding of and hands-on experience with Kibana 4.1.2 for generating customer-facing dashboards.

Experience in Amazon AWS Cloud technologies.

Extensive ETL experience in Installation, Configuration, Upgrading, Migration and Administration of Pentaho, Informatica & IBM DataStage Server and Enterprise Editions.

Solaris/Linux server administration – debugging and resolving system/application issues.

Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.

Experience in writing Oracle stored procedures, Java utilities and applications, JavaScript code, and Couchbase/Kibana NoSQL solutions to meet business requirements.

Experience using Aprimo Marketing Studio: created campaigns, segmentations, and inbound/outbound forms, and analyzed SFDC and Aprimo MS campaign data.

Excellent interpersonal and communication skills; experienced in working with senior-level managers, business stakeholders, and developers across multiple disciplines.

Self-motivated, quick learner and adaptable to any situation.

Received an award during Comcast's Circle of Success 2018 for innovation in Customer Experience as part of the Britebill program.

Ability to learn and quickly adapt to the rapidly evolving cloud technologies.

Education:

Master's in Computer Science & Engineering – Texas State University, San Marcos

B.Tech in Information Technology – Jawaharlal Nehru Technological University, Hyderabad

Technical Skills:

ETL Tools: Pentaho 4.2/5.0/7.0, Informatica 9.5.1, InfoSphere DataStage 8.0/8.1/8.5/8.7

Operating Systems: Windows 98/2000/XP/7, NT 4.0, Unix (HP 11, Sun Solaris 10, AIX 6.1), Linux (Red Hat, SUSE)

Databases: Teradata 13.1/14.1, Oracle 9i/10g/11g/12c, DB2, MS Access, SQL Server 7.0/2000/2005/2008, MySQL

Languages: Python, T-SQL, PL/SQL, UNIX shell programming, HTML, .NET and Java

Reporting Tools: Tableau, Grafana, Kibana

Connectivity Tools: Putty, Termius, Cygwin X, Reflection, Exceed

Marketing Tools: Aprimo Marketing Studio

Cloud Technologies: AWS

Monitoring Tools: Logstash, CloudWatch, Elasticsearch

Professional Experience:

Comcast Cable Communications, Dulles, VA Dec 2015 – Present

Software Development Engineer (ETL Developer)

Responsibilities:

Developed the transformations and jobs following the prescribed BRDs, and verified the result sets against the specifications.

Developed ETL processes using Pentaho PDI to extract data from legacy systems.

Extensively involved in developing ETL code to meet requirements for extraction, cleansing, transformation, and loading of data from source to target data structures.

Developed, deployed, managed, and maintained Pentaho jobs, writing Oracle stored procedures, Java utilities and applications, JavaScript code, and Couchbase/Kibana NoSQL solutions to meet business requirements.

Tuned jobs for better performance at the transformation and database levels, and performed unit testing.

Migrated the Turnstile (commissioning tool) application from legacy Oracle stored procedures to a Pentaho-based ETL solution.

Developed shell scripts to automate code deployment of ETL and DB elements.

Involved in the design of auditing tables that track job status for incremental loading.

Strong understanding of and hands-on experience with Kibana 4.1.2 for generating customer-facing dashboards.

Utilized web scraping techniques to extract and organize data using Python.

Developed monitoring and notification tools using Python.

Developed various Slack bots for alert monitoring using Python.
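As an illustration only, the kind of Python alerting tool described in the bullets above can be sketched with the standard library and a Slack incoming webhook; the webhook URL, job names, and row-count threshold below are hypothetical placeholders, not values from the actual tooling:

```python
import json
import urllib.request

# Hypothetical placeholder; a real incoming-webhook URL is issued by Slack.
WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def build_alert(job_name, status, rows_loaded, threshold=1000):
    """Build a Slack message payload; flag jobs whose load fell below threshold."""
    icon = ":red_circle:" if rows_loaded < threshold else ":white_check_mark:"
    return {"text": f"{icon} ETL job {job_name}: status={status}, rows={rows_loaded:,}"}

def post_alert(payload, url=WEBHOOK_URL):
    """POST the JSON payload to the Slack incoming webhook and return the HTTP status."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

A cron job or scheduler hook would call build_alert with each job's results and post_alert to deliver the message.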

Built performance-monitoring dashboards using Tableau, Grafana, and Kibana.

Fine-tuned SQL scripts as part of performance tuning.

Developed multiple dashboards for executives to track real-time sales data using Grafana.

Played a key role in developing the solution for the NextGen billing platform, Britebill.

Received an award during Circle of Success 2018 for innovation in Customer Solution as part of the Britebill program.

Environment: Python, Pentaho Data Integration 4.2/5.0/7.0, Oracle 11g/12c, UC4, Unix, Linux, Putty, Shell Scripting, DevOps, Couchbase, Kibana NoSQL, JavaScript, XML, SOAP, JSON, Toad, GitHub.

Cognizant Technology Solutions May 2013 – Dec 2015

ETL DW Administrator/Developer, Production Support

Responsibilities:

Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.

Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

Extensively involved in developing ETL code using Informatica to meet requirements for extraction, cleansing, transformation, and loading of data from source to target data structures.

Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer in Informatica Designer to create complex mappings from business requirements.

Used existing ETL standards to develop mappings.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of repositories and folders.
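The pmcmd automation above was done in shell scripts; purely for illustration, the same pattern can be sketched in Python. The service, domain, folder, and workflow names here are hypothetical placeholders:

```python
import subprocess

def build_command(workflow, folder="SALES_DW", service="IS_DEV", domain="Domain_Dev"):
    """Compose a pmcmd invocation that starts a workflow and waits for it.

    The -uv/-pv options make pmcmd read credentials from the INFA_USER and
    INFA_PASS environment variables instead of taking them on the command line.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-uv", "INFA_USER", "-pv", "INFA_PASS",
        "-f", folder, "-wait", workflow,
    ]

def start_workflow(workflow, **kwargs):
    """Run the workflow via pmcmd and return its exit code (0 on success)."""
    return subprocess.run(build_command(workflow, **kwargs)).returncode
```

Keeping command construction separate from execution lets the invocation be logged or tested without an Informatica server present.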

Implemented shell scripts to automate code deployment of ETL and DB elements, currently in use by the SP Indices team.

Actively involved in encrypting passwords within all existing SP Indices processes per the latest IOSCO regulations.

Developed packages and stored procedures within the Oracle database to monitor the status of Informatica workflows, and built reports on top of them.

Extensively involved in coding TPT scripts to copy data, and built a framework around them to automate the copies.

Developed BTEQ scripts to perform loading into Teradata.

Involved in performing DataStage upgrade from 7.5.3 to 8.7. Installed, Configured and Administered DataStage 8.7 on six servers.

Worked closely with IBM Technical support to resolve many server level issues faced during DS 8.7 installation.

Used DataStage/Informatica Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.

Used DataStage/Informatica Designer/Manager/Director (Monitor) to design and run jobs to implement the Import and Export Interfaces.

Involved in performing Teradata Admin activities, TPT installation.

Worked as Hadoop Production Support primary point of contact.

Built multiple campaigns and segmentations using Aprimo Marketing Studio.

Experience troubleshooting issues that occurred while integrating Aprimo and SFDC data.

Experience in Couchbase.

Also took on Configuration Management responsibilities, checking that ongoing projects followed the Enterprise Warehouse standards set by Verizon.

Environment: Informatica Power Center 9.5.1, Workflow Designer, Workflow Manager, Workflow Monitor, InfoSphere DataStage 8.7/7.5.3, MapR, Big Data, Hadoop, Teradata 13.1/14.1, Oracle 11g, MySQL, Unix, Linux, Erwin, Putty, Shell Scripting, Toad, Tortoise SVN, XML, Aprimo Marketing Studio.

Kyro4 Solutions July 2012 – May 2013

ETL Dev/Admin

Responsibilities:

Installed, Configured and Administered IBM IS 8.5 suite.

Selected Installation Topology with layers for the products of Information Server.

Prepared disk, file and network resources. Modified kernel parameters and user limits.

Added, deleted and moved projects and jobs. Purged job log files and traced server activity.

Issued DataStage engine commands from Administrator. Started and stopped DataStage engine and managed logs.

Migrated DataStage and QualityStage jobs from earlier versions to DataStage 8.5 version.

Installed WAS 6.0.2.11 and DB2 9.1 metadata repository automatically and configured LDAP user registry with Microsoft Active Directory in both WAS & IIS.

Environment: DataStage 8.5/7.5.3, Windows XP, SQL Server 2000/2005, Version Control, Oracle 9i, Shell Scripts and Unix

Arcbridge, Herndon, VA Jan 2012 – July 2012

ETL Admin

Responsibilities:

Installed, Configured and Administered DataStage, QualityStage and Metadata Server of InfoSphere Information Server 8.1 suite in Dev, UAT and Prod Phases.

Unlocked jobs. Set up Configuration Files, Job Parameters, Environment Variables and Data Connections across jobs in a project.

Managed DataStage disk usage – projects, system files, datasets, and scratch space.

Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.

Used DataStage Designer/Manager/Director to design and run jobs to implement the Import and Export Interfaces.

Developed Shell scripts to backup and restore DataStage projects automatically.

Environment: InfoSphere DataStage 8.5/8.1 (Designer, Director, Administrator), ERwin, Oracle 9i/10g, SQL Server 2005, SQL, PL/SQL, Unix, Linux, Windows X

Kyro4 Solutions Aug 2011 – Dec 2011

ETL Dev/Admin

Responsibilities:

Created the migration plan; installed both 7.x and 8.5 on the same box and used Multi-Client Manager to switch between them; removed locks and cleaned/deleted old hash files; monitored logs; stopped/started the WebSphere engine, ASB nodes, and DSEngine; and created routines/functions in BASIC or C++.

Set user/project and ETL best-practice standards, performed client presentations, maintained corporate visibility, and produced technical design documents.

Duplicated the Dev/Test/Prod environments in IIS 8.5, monitored memory/space usage, and interfaced with Oracle DBAs/Unix Admins for smooth operation of the DWBI initiative.

Created UNIX shell scripts in ksh for calling dsjobs/sequences, cron jobs for scheduling, and other ksh scripts on an ad-hoc basis.

Environment: Information Server (DataStage 8.5) Parallel Extender/QualityStage/Information Analyzer, UNIX AIX 6.1 (64-bit), Teradata, Oracle 11, PL/SQL, SQL Server 2003, Cognos.

Texas State University – San Marcos, TX Aug 2009 – July 2011

Responsibilities:

Worked as Database (SQL Server 2008) and Network Administrator for the CIS Department

Responsible for managing the procedures for creating backup records of the data present in the system's databases using SQL Server 2008.

Responsible for setting up login credentials for authorized personnel, defining the privileges associated with each authorized user, and ensuring that each workstation is connected to the network.

Responsible for maintenance of the Computer Information System (CIS) Department website, www.cis.txstate.edu, using GATO.

Environment: MS SQL Server 2008, XML, UNIX, Windows NT, C, C++, C#, VB.NET, ASP.NET 2.0/3.5, JAVA.


