Post Job Free

Data Analyst Analytics

Location:
Piscataway, NJ
Salary:
$55
Posted:
May 23, 2024

Contact this candidate

Resume:

Dhruvi Dagar
Email: dhruvidagar**@gmail.com | Phone: +1-732-***-****
Data Analyst

PROFESSIONAL SUMMARY:

§ Over 9+ years of IT experience as a Data Analyst in Data Analysis, Data Modeling, Data Migration, Data Conversions, and Data Integration.
§ Extensive experience working in Agile Methodology and throughout the complete Software Development Life Cycle (SDLC).
§ Experienced in conducting Joint Application Development (JAD) sessions with Subject Matter Experts (SMEs), business users, and QA teams to resolve open issues.
§ Well versed in Requirement Analysis: gathering and documenting business requirements, analyzing business processes, and translating requirements into technical specifications.
§ Experience in creating the Requirements Traceability Matrix (RTM) to trace test cases back to requirements and ensure that all business requirements are covered.
§ Strong experience in Data Analysis, Data Profiling, Data Cleansing, Data Migration, Data Integration, and Metadata Management Services.
§ Experience in writing complex SQL and PL/SQL queries, Stored Procedures, Functions, and Triggers against a variety of source and target systems.
§ Strong understanding of Data Modeling concepts such as Star and Snowflake schemas, FACT and dimension tables, and logical and physical modeling using Erwin.
§ Extensive experience with ETL (Extraction/Transformation/Loading) and Data Import/Export using tools such as Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor) and DTS packages.
§ Experience with Cloud Technologies such as AWS (EC2, EMR, RDS, S3).
§ Experience working with Big Data Technologies such as Hadoop and Spark.
§ Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
§ Strong experience in Data Governance and Metadata Management using tools such as Collibra.
§ Experience in data visualization and presenting analytical reports using tools such as Tableau and Power BI.
§ Strong experience in writing test plans and test cases, and in validating and testing data to verify completeness and correctness throughout the SDLC.
§ Understanding of data mining and data quality verification, with experience initiating resolutions to production issues and overseeing them to solution.
§ Experienced in working with multiple business, development, and QA teams and presenting solutions to complex business problems.
§ Excellent analytical and problem-solving abilities; experienced in conducting user training and assisting with deployment to production.
§ Enthusiastic learner with expertise in the field of data analytics and Big Data.

TECHNICAL SKILLS:

Data Modeling Tools: Erwin r7.2/3/4, Visio, Lucid Chart
Databases: Teradata, IBM DB2, ORACLE, SQL Server, Snowflake
Big Data Technologies: Hadoop, Hive, Spark, Databricks
Project Management Tools: Jira Atlassian, HP ALM
Dimensional Modeling: FACT and Dimension Tables, Star and Snow-Flake Schemas; Conceptual, Logical, and Physical Data Modeling
Data Warehousing / ETL Tools: Informatica, DataStage

Cloud Technologies: AWS (EMR, EC2, S3, Athena) and Azure (Blob Storage, Azure HD Insights)
Programming Languages: Python, R, MS-SQL (SQL Server), Scala
Reporting / Analytical Tools: Tableau, PowerBI, Looker, and SQL Server Reporting Services
Office Applications: MS Office (Word, Excel, Visio, PowerPoint, Project)

PROFESSIONAL EXPERIENCE:

Broadridge Financial – New York Oct 2022 – Till Date
Sr Data Analyst
Responsibilities:
§ Worked closely with the leadership team and business stakeholders on requirement gathering, and communicated project status, progress, milestones, risks, and the mitigation plan.
§ Involved in end-to-end Data Analysis, Data Profiling, Data Cleansing, and Data Quality testing for reporting solutions.
§ Performed extensive data discovery and analysis across multiple data domains such as sales, inventory, traffic, and services.
§ Created detailed Data Mapping Specifications and STMs (Source to Target mappings) for the ETL team, documenting the transformation logic for each source.
§ Assisted the development team in translating business requirements into technical specifications and ETL work using Informatica and SSIS.
§ Wrote SQL and T-SQL scripts for data validation and testing, and complex queries for Ad-hoc analysis and reporting.
§ Created dashboards and reports in Tableau for data visualization and presented them to the Business.
§ Worked on building the Data Lake on AWS (EMR, S3), with rationalized and aggregated datasets organized into curated layers that can be used by the reporting team.
§ Worked with upstream data engineers on near real time streaming data ingestion into the data lake using Kafka and Spark.
§ Rationalized datasets from several source systems and created aggregate views in Snowflake for Business reporting.
§ Worked on Operational, OLTP, and OLAP data stores, writing queries to map and aggregate data from each source to the right target.
§ Created Entity Relationship diagrams and data flow diagrams using Visio and Lucid chart.
§ Used Collibra for Metadata Management and worked with Data Stewards to build the business glossary, data dictionaries, and data quality rules.
§ Actively participated in Sprint Planning, Estimation, and backlog sessions; used Jira Atlassian to create Epics, manage tasks, and track day-to-day status updates.
§ Prepared test plans and test cases, performed Data Validation and requirements testing, and documented the results.
§ Created a Requirements Traceability Matrix to ensure that all business requirements are covered by test cases.
§ Worked with the Maintenance team to call out data quality issues, identify their impact, and submit intake requests for any needed Change Control.
§ Used GitHub for version control of scripts and documentation.

Environment: Snowflake, SQL, AWS (EMR, S3), Python, Kafka, Collibra, Hadoop, Visio, SAS, UNIX, Shell Scripting, Spark, Databricks, Tableau, SQL Server, MS Excel, GitHub, Windows.

MetLife – New York City, NJ Apr 2020 – Sep 2022
Data Analyst
Responsibilities:
§ Worked on importing high volume transaction data from source systems and performed Detailed Data Analysis (DDA) on Claim's and payment data.
§ Performed Data Profiling, Data Mining, and Data Validation to identify data patterns, anomalies, and cleansing needs.
§ Documented business and user requirements through requirement discussions and review sessions with SMEs and different user groups.
§ Created Source to Target data mapping documents for the ETL (Informatica) team, covering transformation logic and data lineage.
§ Wrote complex SQL queries against Teradata, Oracle, and SQL Server for data analysis and validation.
§ Worked closely with Data Modelers on conceptual and logical data modeling, Normalization/De-normalization techniques, and dimensional modeling.
§ Conducted feasibility analysis for new requirements and suggested the optimum design for data loading.
§ Created interactive reports and dashboards in Tableau, embedding views for different business user groups, and performed Tableau Administration tasks such as managing licenses and data source connections.
§ Performed ETL testing across the stage and target environments, validating data extraction, transformation, and loading against the functional requirements.
§ Used HiveQL and Presto to query data on Hadoop (AWS) platforms.
§ Worked with Jira and VersionOne to track Sprint work items, backlogs, and the project delivery schedule, closing work items as part of each release.
§ Used Collibra to maintain business terms, definitions, and metadata, and to help generate data lineage.
§ Performed data cleansing, integrating and validating data from heterogeneous sources like flat files, Oracle, and SQL Server, and addressed regulatory requirements.
§ Developed SAS 9.4 programs for data analysis and generated reports on the loan portfolio and enrollments.
§ Identified project risks and effort, and worked with the team to have them allocated under the appropriate phases of the project.
§ After review and sign-off by all the team members, committed the mapping documents so development could be started in the next Sprint.
§ Conducted walkthroughs and training sessions to help business users understand the data and the reports.
§ Performed Data Quality Analysis (DQA) on various kinds of data and documented the exceptions.

Environment: Informatica, SQL Server, Teradata, Tableau, SAS 9.4, Collibra, Hadoop, HiveQL, Presto, UNIX, Shell Scripting, AWS, MS Excel, MS Access, MS Visio, JIRA, Oracle.

Humana – Louisville, Kentucky Jan 2018 – Mar 2020

Data Analyst / Data Modeler
Responsibilities:
§ Worked on Initial Data Analysis and "Data Verification": reviewed and confirmed the results of data analysis and identified erroneous/misaligned data.
§ Worked out root cause analysis to pinpoint the data cleaning needed.

§ Started with the "Data Cleaning" process and corrected the erroneous/misaligned data before loading.
§ Gathered requirements from business users and created detailed functional and low-level design documents, Use Cases, and Activity Diagrams using MS Visio.
§ Performed Gap Analysis of the existing systems against the new business requirements.
§ Performed data profiling and Detailed Data Analysis (DDA) to check data quality and discover anomalies in the source data.
§ Performed Data Quality Analysis (DQA) to evaluate and validate the correctness, completeness, and compatibility of data across multiple databases.
§ Tested Complex ETL Mappings and Sessions based on business rules to load data from source flat files and RDBMS tables to target tables.
§ Tested the ETL process before and after data loading by writing SQL queries to compare source and target data.
§ Performed unit testing and system testing, and supported UAT with the end users.
§ Created pivot tables, computations, and VLOOKUPs in MS Excel for data manipulation and validation.
§ Developed dashboards with story telling in Tableau Desktop and published them to Tableau Server with quick filters for on-demand analysis.
§ Interpreted data and analyzed results using statistical techniques to identify trends and patterns in complex data sets, such as FICO score based rules.
§ Developed SAS 9.4 scripts for analysis, ran the scripts, and loaded the results into relational tables.
§ Created ad hoc SQL queries and standard reports for the business on a daily basis.
§ Defined data stewardship roles and responsibilities as part of the company's data governance process.
§ Maintained a repository of data definitions, business rules, and data quality requirements in Collibra, ensuring clear accountability for key data assets.
§ Worked with the Enterprise Data Warehouse team to prioritize, troubleshoot, and resolve data-related difficulties and complexities.
§ Identified, analyzed, and recommended resolutions for erroneous data, and coordinated the resolution process with teammates and stakeholders.
§ Documented expected results, performed testing against them, and tracked issues in Jira.

Environment: Jira, SAS 9.4, Hadoop, MS Access, MS Excel, Azure, Collibra, Informatica, Teradata, Unix, SQL, HiveQL, Python, Tableau, MS Visio, SQL Server.

ICICI Bank, Hyderabad, India May 2014 – Sep 2017
Data Analyst
Responsibilities:
§ Gathered business requirements by working closely with business users and translated them into technical requirements.
§ Analyzed and interpreted the requirements to understand the business needs and documented them for the project team.
§ Performed data analysis and data profiling using SQL on various source systems, viz. Customer and Seller data.
§ Created ad hoc SQL queries and reports for the business as needed.
§ Performed the extraction, loading, and validation of data as part of the ETL process.
§ Tested data for correctness and compatibility across databases, including validation of file transfers between systems.
§ Maintained the Data Warehouse with daily updates and ensured the quality of the data.
§ Created pivot tables in MS Excel and reports in MS Access to analyze and present information.
§ Coordinated with the onshore team and project coordinators to prioritize work and resolve difficulties.
§ Worked with the QA team during functional and system testing and supported UAT.
§ Created and maintained detailed documentation of requirements, mappings, and test results.
§ Responsible for requirement traceability and for ensuring that results conform to the business rules.

Environment: Windows XP Professional, UML, Teradata, SQL, MS Excel 2000, Quick Edit, MS Office 2000/2007, Microsoft Visio 2003, SharePoint, PowerPoint, MS Project, JCL, Mainframe, DB2.
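The source-to-target ETL validation described across the roles above (reconciling row counts and checking completeness between source and target tables after a load) can be sketched as follows. This is an illustrative example only, using Python's built-in sqlite3 module; the table and column names (src_claims, tgt_claims, claim_id) are hypothetical and not taken from any of the projects listed.

```python
import sqlite3

# Stand-in source and target tables; real validation would run against the
# warehouse (Teradata, SQL Server, etc.) rather than an in-memory database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_claims VALUES (1, 100.0), (2, 250.5);
""")

# Row-count reconciliation: source and target counts should match post-load.
src_count = cur.execute("SELECT COUNT(*) FROM src_claims").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_claims").fetchone()[0]

# Completeness check: source rows with no matching target row.
missing = cur.execute("""
    SELECT s.claim_id
    FROM src_claims s
    LEFT JOIN tgt_claims t ON s.claim_id = t.claim_id
    WHERE t.claim_id IS NULL
""").fetchall()

print(src_count, tgt_count, [r[0] for r in missing])  # 3 2 [3]
```

The LEFT JOIN / IS NULL pattern is the standard way to surface dropped rows; the same query shape works unchanged on most SQL platforms.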


