Hadoop Scala Developer
Posted 2 days 7 hours ago by Belhati
Belhati is a competitive consulting company headquartered in the United Kingdom that provides cutting-edge technology solutions, enabling businesses to thrive in the digital era. We specialize in Cloud, DevOps, Data Science, Blockchain, IoT, and Metaverse technologies, and our team of experts is dedicated to delivering innovative solutions that drive business growth and transformation.
Our team of experts has years of experience in the technologies above. They are highly skilled and use the latest tools and technologies to design, develop, and implement solutions that transform businesses and drive innovation.
What will your job look like?
- 4+ years of relevant experience in Hadoop with Scala development.
- It is mandatory that the candidate has delivered more than two projects in the above framework using Scala.
- 4+ years of relevant experience handling end-to-end Big Data technology.
- Meeting with the development team to assess the company's big data infrastructure.
- Designing and coding Hadoop applications to analyze data collections.
- Creating data processing frameworks.
- Extracting data and isolating data clusters.
- Testing scripts and analyzing results.
- Troubleshooting application bugs.
- Maintaining the security of company data.
- Training staff on application use.
- Good project management and communication skills.
- Designing, creating, and maintaining Scala-based applications.
- Participating in all architectural development tasks related to the application.
- Writing code in accordance with the app requirements.
- Performing software analysis.
- Working as a member of a software development team to ensure that the program meets standards.
- Application testing and debugging.
- Making suggestions for enhancements to application procedures and infrastructure.
- Collaborating with cross-functional teams.
- 12+ years of hands-on experience in a variety of platform and data development roles.
- 5+ years of experience in big data technology, spanning platform architecture, data management, data architecture, and application architecture.
- High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, SparkSQL, HBase, Impala, Hive, and HDFS in multi-tenant environments.
- Solid base in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience in metadata management and data quality processes and tools.
- Experience in full lifecycle architecture guidance
- Advanced analytical thinking and problem-solving skills
- Advanced knowledge of application, data and infrastructure architecture disciplines
- Understanding of architecture and design across all systems
- Demonstrated and strong experience in a software engineering role, including the design, development and operation of distributed, fault-tolerant applications with attention to security, scalability, performance, availability and optimization
Requirements
- 4+ years of hands-on experience in designing, building and supporting Hadoop Applications using Spark, Scala, Sqoop and Hive.
- Strong knowledge of working with large data sets and high-capacity big data processing platforms.
- Strong experience in Unix and shell scripting.
- Experience using source code and version control systems such as Git and Bitbucket.
- Experience working in an agile environment.
- Strong verbal and written communication skills, along with strong relationship, collaboration, and organizational skills, and the ability to work as a member of a matrix-based, diverse, and geographically distributed project team.