
Building Engineer Data Processing

Location:
Jersey City, NJ
Salary:
80000
Posted:
June 13, 2025


Resume:

Sumanth Korada
Data Engineer
saisumanth@workwebmail.com | (929) 279-1736 | USA | LinkedIn | GitHub

Summary
Experienced Data Engineer with a strong command of Python and SQL, skilled in building scalable data pipelines and optimizing ETL workflows. Proficient in AWS, Azure, and GCP, contributing to cloud data lake, machine learning, and storage technologies. Successfully implemented real-time data processing solutions that improved data accessibility, enhanced product development, and strengthened business intelligence. Passionate about building logic and solutions that contribute to business growth and innovation.

Technical Skills

• Programming & Scripting: Python, SQL, Scala, R, Java, Shell
• Data Engineering & Big Data Technologies: Apache Spark, Hadoop, Hive, Apache Airflow, Kafka, Zookeeper, dbt, Talend, Alteryx, Informatica, Databricks
• Cloud Platforms & Services: AWS (S3, Redshift, EMR, EC2), Azure (Data Factory, Synapse, Logic Apps, Data Lake, Blob Storage), GCP (BigQuery, Compute Engine)
• Databases & Data Warehousing: Snowflake, Redshift, BigQuery, Azure Synapse, MS SQL Server, MySQL, PostgreSQL, Oracle, Hive
• Data Analysis & Visualization: Power BI, Tableau, SSRS, Looker, Excel (advanced), PowerPoint, Plotly, Pandas, NumPy, scikit-learn, SciPy, Seaborn, Matplotlib
• DevOps & CI/CD Tools: Git, GitHub, Azure DevOps, Docker, Jenkins, pytest, CI/CD, ETL Pipelines
• Data Quality & Governance: Great Expectations, Data Catalog, Data Modeling, IAM, GDPR, CCPA
• Project Management & Collaboration: JIRA, Confluence, Agile Methodologies

Professional Experience

Data Engineer, Capital One | Remote, USA | 01/2024 – Present
• Built enterprise-scale ETL pipelines in AWS Glue using PySpark to ingest credit bureau, transaction, and payment feeds into Amazon Redshift and curated data lakes.
• Streamlined customer and credit profile transformations in Snowflake using dbt, modularizing complex SQL into incremental models and reducing runtimes by 30%.
• Developed real-time streaming pipelines with Amazon Kinesis and AWS Lambda to capture credit card transaction events for near-real-time risk decisions.
• Implemented Snowpipe and Step Functions to automate continuous ingestion into Snowflake staging tables whenever new files arrived, keeping credit data current.
• Designed credit risk data marts covering delinquency, utilization, payment behavior, credit limits, and balances to inform lending decisions.
• Created credit risk dashboards in Tableau and Power BI so finance and credit risk teams always had up-to-date key metrics for decision-making.
• Integrated Amazon SageMaker and TensorFlow models trained on curated credit data to predict future credit outcomes such as delinquency.
• Optimized table partitioning and clustering in Redshift so pipelines stayed performant even as data volumes grew.
• Established strict authorization with Lake Formation permissions, IAM roles, and encryption to protect sensitive data and meet GDPR and CCPA requirements.
• Deployed CI/CD pipelines with GitHub Actions so all changes underwent automated testing and teams could push frequent, stable, version-controlled releases.
• Collaborated with business analysts, credit risk leads, and stakeholders to gather requirements and turn them into reliable data pipelines.

Data Engineer, Cognizant | India | 01/2021 – 04/2023
• Developed Azure Data Factory (ADF) pipelines to ingest EHR, EDC, lab, billing, and ERP feeds from hospital and clinical systems into Azure Data Lake and Blob Storage.
• Deployed Databricks notebooks with Spark-based transformations for JSON flattening, schema normalization, and turning raw clinical feeds into curated tables.
• Developed dynamic ADF copy scripts and Synapse SQL logic, bringing daily refresh runtimes down from 12 hours to under 2 hours.
• Implemented data quality checks that surfaced anomalies quickly and traced them back to their source, improving data accuracy by 20% and enabling speedy remediation.
• Activated Azure Purview as a centralized metadata catalog, tracing lineage and provenance and facilitating data discovery and audit readiness.
• Architected encryption in Azure Blob Storage along with Azure Active Directory roles and permissions, masking PII so sensitive patient data stayed compliant with HIPAA, GDPR, and CCPA.
• Performed a consolidation of over 5 million records migrated from legacy systems, bringing lab results, patient information, and clinical trial data together into a 360-degree view for informed decision-making.
• Created automated report generation and exports consumed by clinical analysts, biostatisticians, and CRO teams, dramatically accelerating study reporting and delivering efficiency across operations.
• Maintained CI/CD in Azure DevOps so that releases were tested, reviewed, and deployed as stable, production-ready artifacts.
• Collaborated extensively with clinical analysts, biostatisticians, and hospital stakeholders to capture requirements and deliver transparent, traceable data.

Education

Master of Science, Computer and Information Sciences, Northwest Missouri State University | Maryville, USA | 05/2023 – 06/2024

Certifications

• Advanced SQL


