Hadoop Engineer Job Description Template

As part of our ongoing modernization journey, we are looking for a Hadoop Engineer to improve our technology environment. The main focus of this role will be to implement on-premises upgrades, automate laborious legacy processes, and migrate to new data centers and the public cloud (AWS) as part of our modernization strategy. Whether you are passionate about web applications (JavaScript, ReactJS, Spring Boot), big data (Hadoop, Spark), conventional relational databases (Oracle, SQL Server), AWS, or any other related tool set or stack, you will have the chance to learn about the others. We would like to hear from candidates who can assist with automation initiatives, have expertise with AWS, and are eager to learn as we modernize together. Candidates with a constant curiosity about technology will find that our team offers ample opportunities for learning, developing skill sets, and continuing professional development.

Typical Duties and Responsibilities

  • Assess the company’s big data infrastructure in collaboration with the development team
  • Design and code Hadoop applications 
  • Troubleshoot and test scripts and applications and analyze results
  • Create data processing frameworks and data tracking programs
  • Produce Hadoop development documentation
  • Maintain the security of company data

Education

  • Bachelor’s degree in engineering, business, computer science, information systems, mathematics, or a related field

Required Skills and Experience

  • Knowledge of enterprise relational databases (Oracle preferred)
  • Expertise in at least one current programming language or framework (Java, Python, Spark)
  • Expertise in SQL and query-optimization principles (Hive, PL/SQL)
  • Experience in Windows scripting or Unix shell scripting (PowerShell, Batch scripts)
  • Experience creating and maintaining ETL processes with Oracle ETL scripts
  • Knowledge of data modeling and analysis, including models for process automation
  • Working knowledge of SDLC and CI/CD execution (GitHub, Jenkins, SNOR, Spinnaker, AIM, etc.)
  • Expertise in the fields of infrastructure, data, and application architecture
  • Knowledge of systems architecture and design
  • Working knowledge of programming languages and code management
  • Expertise in current technology developments and best practices across industries
  • Capability to collaborate in big teams to accomplish organizational goals
  • Knowledge of software engineering practices such as business analysis, development, maintenance, and enhancement

Preferred Qualifications

  • AWS expertise or certification 
  • Experience with AWS data analytics technologies
  • Understanding of Sqoop
  • Knowledge of relational business databases
  • Experience migrating on-premises data workflows to the public cloud
  • Experience with job-scheduling applications such as AutoSys
  • Knowledge of and experience with API development
  • Proven BI analytical skills (Tableau, Alteryx)

Contact us

Recruit with Nexus IT Group