Lead - Big Data Developer Job at Sutherland Global Services

Lead - Big Data Developer

Job Description:

Sutherland is seeking an attentive, technically skilled person to join us as Sr. Developer - Big Data. The ideal candidate has hands-on experience with Big Data tools such as Hadoop, Hive, HBase, and Impala, as well as ETL techniques and frameworks. We are a group of dynamic and driven individuals. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!

Responsibilities:
Keep management updated: Provide regular status updates to management
Impact the bottom line: Produce solid, effective strategies based on accurate and meaningful data reports, analysis, and keen observations
Define Sutherland's reputation: Oversee and manage performance and service quality to guarantee customer satisfaction
Strengthen relationships: Establish and maintain communication with clients and/or team members; understand needs, resolve issues, and meet expectations
Take the lead: Debug production issues quickly and provide root-cause analysis along with a resolution.

Qualifications:

This technical position is responsible for the development and ongoing support of an enterprise data warehouse, data marts, and supporting systems/applications. To succeed in this position, you must:
Implement ETL processes with multiple source systems such as SQL, Oracle, files, and mail.
Develop efficient Pig and Hive scripts that join datasets using various techniques.
Assess the quality of datasets for a Hadoop data lake.
Apply HDFS file formats and structures such as Parquet and Avro to speed up analytics.
Monitor performance and advise on any necessary infrastructure changes.
Design and implement Hive and HBase schemas within HDFS.
Assign schemas and create Hive tables.
Fine-tune Hadoop applications for high performance and throughput.
Troubleshoot and debug any Hadoop ecosystem runtime issues.
Support QA (resolve issues and release fixes), UAT, and production when required.
Deploy and integration-test developed components in development and test environments.
Years of Experience: 6+
Educational Qualification: B.E / B.Tech / M.C.A

Additional responsibilities:
Knowledge of the Hadoop ecosystem and its components: HBase, Pig, Hive, Sqoop, Flume, Oozie, etc.
Working knowledge of Java essentials for Hadoop.
Working knowledge of basic Linux administration.
Knowledge of scripting languages such as Python or Perl.
Good knowledge of concurrency and multithreading concepts.
Experience with Spark.
Experience integrating data from multiple data sources.
Experience with Cloudera 6.0 and above.
Analytical and problem-solving skills.
Good communication skills.

Experience Required:

Fresher

Vacancy:

2 - 4 Hires
