Resumes 111 - 120 of 11986
Nandigama, Andhra Pradesh, India
... •Experience in installing and configuring Cloudera and Hortonworks clusters, and installing Hadoop ecosystem components such as Kafka, Oozie, Flume, and ZooKeeper. •Experience in microservices using Spring Boot with the Spring Cloud stack, including PCF and Spring Cloud Netflix ...
- Apr 14
Denver, CO
... ●Understanding of functional programming languages such as Scala and JavaScript, and Big Data technologies such as Hadoop and HDFS. ●Proficient with version control tools such as Git, Bitbucket, and TortoiseSVN. •Worked with an offshore team for project ...
- Apr 14
Chicago, IL
... Strong experience with data platforms and big data ecosystems including S3 Data Lake, Athena, EMR, Redshift, Hadoop, Spark, and ETL tools such as Informatica, Talend, and Pentaho. Expertise in enterprise integration patterns and tools including REST ...
- Apr 14
United States
... ·Worked on Hadoop servers and YAML configuration to improve application performance by provisioning additional nodes to speed up data streaming. ·Used the Spark dashboard to assess application job performance and monitored the threads ...
- Apr 13
Hyderabad, Telangana, India
... Hadoop with Hive. Other: Control-M; Toad with Oracle to view schemas and run queries. Functional Management and Leadership Skills: ability for new product development. Training & Development: trained freshers in various technologies such as Informatica, ...
- Apr 13
Arlington, TX
... Big Data and Distributed Systems: Hadoop, Apache Spark, Databricks. Databases: Oracle SQL, MySQL, MS SQL. Cloud Platforms: Google Cloud Platform, Amazon Web Services, Microsoft Azure. DevOps and Automation: Docker, Podman, CI/CD Pipelines, GitLab, ...
- Apr 12
Irving, TX
... •Teradata can be integrated with Hadoop using Apache Sqoop for data transfer. Sqoop allows importing data from Teradata into Hadoop (HDFS, Hive) and vice versa. •Use Teradata Parallel Transporter (TPT) or Hadoop connectors to enable high ...
- Apr 12
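The Sqoop-based Teradata transfer described in the snippet above can be sketched as follows; this is a minimal illustration, not the candidate's actual setup — the JDBC host, database, credentials, and table names are hypothetical placeholders, and driver options vary with the Teradata connector in use:

```shell
# Hypothetical sketch: import a Teradata table into HDFS with Sqoop 1.
# Host, database, user, and table names below are placeholders.
sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=sales \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user -P \
  --table ORDERS \
  --target-dir /data/raw/orders \
  --num-mappers 4

# The reverse direction: export HDFS data back into a Teradata table.
sqoop export \
  --connect jdbc:teradata://td-host/DATABASE=sales \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user -P \
  --table ORDERS_STAGE \
  --export-dir /data/raw/orders
```

In practice, `--num-mappers` controls how many parallel JDBC sessions Sqoop opens against Teradata; the TPT-based connectors mentioned in the snippet replace this generic JDBC path with Teradata's bulk-load protocol for higher throughput.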
District Heights, MD
... Designed, architected, and implemented large, cost-effective BI analytics systems using Hadoop and Greenplum. Performed design, implementation, and maintenance of complex multiple product modules/subsystems and created custom workflows. Mentors and ...
- Apr 11
Cincinnati, OH, 45220
... Technical Skills • Languages & Scripting: Python (Pandas, PySpark), SQL, Scala, Java, Bash • ETL & Orchestration: Apache Airflow, dbt, Apache NiFi, Azure Data Factory, AWS Glue • Big Data & Streaming: Apache Spark (batch & streaming), Hadoop (HDFS, ...
- Apr 10