Spark With Scala Developer Job in Evergent Technologies

Job Summary

Requirements
Passion for data engineering and for programming technologies such as Scala.

  • Strong grasp of algorithms and data structures, and of logical programming.
  • Strong object-oriented programming and design skills, preferably in Scala; experience programming with Spark/Akka.
  • Excellent analytical and problem-solving skills, and strong oral and written communication skills.
  • Experience with Spark or other distributed computing environments, and experience building scalable, reliable, distributed Linux systems with Big Data processing technologies.
  • Experience designing and implementing data ingestion and transformation for big data platforms (Spark, Kafka, Cassandra).
  • Proven track record of designing highly parallelized data ingestion and transformation jobs in Spark, including Spark Streaming.
  • Production experience working with Apache Spark clusters.
  • Familiarity with concepts such as time complexity and distributed computing; hands-on experience with Big Data development technologies at the 10 TB - 20 TB scale.
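As an illustration of the ingestion-and-transformation work the requirements above describe, a minimal Spark batch job in Scala might look like the following. This is a sketch, not part of the role's actual codebase: the input path, event schema (`eventType`, `userId`, `amount`), and output location are all hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventIngestion {
  def main(args: Array[String]): Unit = {
    // Hypothetical pipeline: ingest raw JSON events, aggregate per user, write Parquet.
    val spark = SparkSession.builder()
      .appName("EventIngestion")
      .getOrCreate()

    // Assumed input path and schema; a real job would validate the schema explicitly.
    val events = spark.read.json("hdfs:///data/raw/events")

    // Keep purchase events and compute per-user totals; Spark parallelizes
    // the aggregation across the cluster's executors.
    val perUser = events
      .filter(col("eventType") === "purchase")
      .groupBy(col("userId"))
      .agg(
        sum(col("amount")).as("totalSpend"),
        count(lit(1)).as("purchases")
      )

    // Hypothetical curated-zone output path.
    perUser.write.mode("overwrite").parquet("hdfs:///data/curated/user_spend")

    spark.stop()
  }
}
```

A streaming variant of the same transformation would swap `spark.read` for `spark.readStream` (for example, with a Kafka source) and write with `writeStream`, which is the kind of Spark Streaming experience the posting asks for.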

Experience Required:

3 to 6 Years

Vacancy:

2 - 4 Hires
