Belhati is a competitive consulting company headquartered in the United Kingdom that provides cutting-edge technology solutions enabling businesses to thrive in the digital era. We specialize in Cloud, DevOps, Data Science, Blockchain, IoT, and Metaverse technologies, and our team of experts is dedicated to delivering innovative solutions that drive business growth and transformation. Our experts have years of experience in these technologies; they are highly skilled and use the latest tools and technologies to design, develop, and implement solutions that transform businesses and drive innovation.

What will your job look like?
- 4+ years of relevant experience in Hadoop with Scala development.
- It is mandatory that the candidate has handled more than 2 projects in the above framework using Scala.
- 4+ years of relevant experience handling end-to-end Big Data technology.
- Meeting with the development team to assess the company's big data infrastructure.
- Designing and coding Hadoop applications to analyze data collections.
- Creating data processing frameworks.
- Extracting data and isolating data clusters.
- Testing scripts and analyzing results.
- Troubleshooting application bugs.
- Maintaining the security of company data.
- Training staff on application use.
- Good project management and communication skills.
- Designing, creating, and maintaining Scala-based applications.
- Participating in all architectural development tasks related to the application.
- Writing code in accordance with the app requirements.
- Performing software analysis.
- Working as a member of a software development team to ensure the program meets standards.
- Application testing and debugging.
- Making suggestions for enhancements to application procedures and infrastructure.
- Collaborating with cross-functional teams.
- 12+ years of hands-on experience in a variety of platform and data development roles.
- 5+ years of experience in big data technology, with experience ranging from platform architecture to data management, data architecture, and application architecture.
- High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, SparkSQL, HBase, Impala, Hive, and HDFS in multi-tenant environments.
- Solid base in data technologies such as warehousing, ETL, MDM, DQ, and BI/analytical tools; extensive experience with metadata management and data quality processes and tools.
- Experience in full-lifecycle architecture guidance.
- Advanced analytical thinking and problem-solving skills.
- Advanced knowledge of application, data, and infrastructure architecture disciplines.
- Understanding of architecture and design across all systems.
- Demonstrated, strong experience in a software engineering role, including the design, development, and operation of distributed, fault-tolerant applications with attention to security, scalability, performance, availability, and optimization.

Requirements
- 4+ years of hands-on experience designing, building, and supporting Hadoop applications using Spark, Scala, Sqoop, and Hive.
- Strong knowledge of working with large data sets and high-capacity big data processing platforms.
- Strong experience in Unix and shell scripting.
- Experience using source code and version control systems such as Bitbucket and Git.
- Experience working in an agile environment.
- Strong verbal and written communication skills, along with strong relationship, collaboration, and organizational skills, with the ability to work as a member of a matrix-based, diverse, and geographically distributed project team.

Apply Here
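For context on the Spark/Scala skills named above: the day-to-day work centers on writing Scala transformations that group and aggregate large data sets. Below is a minimal illustrative sketch in plain Scala (no Spark dependency); the collection operations used here (`groupBy`, `map`) mirror the shape of Spark RDD/Dataset transformations. The object and data are hypothetical examples, not part of the posting.

```scala
object EventCounts {
  // Hypothetical example: count events per user. In a real Spark job the
  // same logic would run over an RDD or Dataset read from HDFS or Hive,
  // but the transformation style is identical.
  def countByUser(events: Seq[(String, String)]): Map[String, Int] =
    events
      .groupBy { case (user, _) => user }          // analogous to groupByKey
      .map { case (user, evs) => user -> evs.size } // analogous to mapValues/count

  def main(args: Array[String]): Unit = {
    val events = Seq(("alice", "click"), ("bob", "view"), ("alice", "view"))
    println(countByUser(events))
  }
}
```

In an actual Hadoop deployment of the kind described, the same aggregation would typically be expressed against a `SparkSession` with data loaded from Hive or HDFS, then written back out or exposed via SparkSQL.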