Reference: TQ2122_2136_27
Vacancies: 3
Job title: Software Developer – Big Data / Hadoop
Location: Leicester
TESTQ Technologies is an IT services and solutions company whose offerings span a variety of industry sectors. With strong technical, domain and process expertise, we help clients grow their businesses and reduce operational costs on a continuous basis in an ever-changing business environment.
This opportunity is in the solution design and development arena. The Software Developer – Big Data / Hadoop will play a major role in the technical design and development of the company’s technical offerings. The role is based at our Leicester office, with occasional assignments at client locations.
Job Description (Main Duties and Responsibilities):
• You will work with and lead a team of outstanding software engineers to help build and run great software solutions, applications, infrastructure and products in Big Data areas such as Hadoop.
• You should have developed a scalable way of storing and processing mixed sets of data to improve the user experience.
• You should have implemented Hadoop to handle massive data consumption and aggregation using the MapReduce programming model.
• You should have handled data warehouse features such as partitioning, sampling and indexing.
• Work with technical architects, test leads and DevOps colleagues to create the best solutions and to ensure a smooth transition from development through to production; take responsibility for supporting the applications you build, including occasional support outside office hours.
• Take responsibility for understanding your own work area and be able to get up to speed with issues quickly.
• Demonstrate strong stakeholder management skills.
• Engage with the technical architects to ensure designs are appropriate, then ensure those designs are implemented by the team.
• Ensure the team is producing high-quality, well-tested software that conforms to all technical guidance and standards.
• Ensure any reusable code created and any innovative work done within the team is communicated.
• Share knowledge of tools and techniques with the wider team of developers and non-developers.
• Be involved in organisation-wide tasks such as the recruitment process, presales, technical support and process improvement programmes.
Key Skills, Qualifications and Experience Needed [The candidate must demonstrate these in all stages of assessment]
• You should have demonstrable skills in designing and deploying the overall Hadoop ecosystem in the cloud, including Hadoop Common, Hadoop Distributed File System (HDFS), Hadoop YARN, Hadoop MapReduce and Hadoop Ozone, as well as peripheral systems including Hive, Pig, Apache Spark, Flume and Sqoop.
• Pre-processing using Hive and Pig.
• High-speed querying.
• Managing and deploying HBase.
• Experience with Spark and stream-processing systems.
• Experience with various messaging systems, such as Kafka or RabbitMQ.
• Any further exposure to the following technologies would be an added advantage:
  o Cloudera Hadoop (CDH), Cloudera Manager, Informatica Big Data Edition (BDM), HDFS, YARN, MapReduce, Hive, Impala, Kudu, Sqoop, Spark, Kafka, HBase, Teradata Studio Express, Teradata, Tableau, Kerberos, Active Directory, Sentry, TLS/SSL, Linux/RHEL, Unix, Windows, SBT, Maven, Jenkins, Oracle, MS SQL Server, shell scripting, Eclipse IDE, Git, SVN
Candidates with a UK Bachelor's degree or above, or an equivalent qualification, will be preferred.
Other key skills:
• Good analytical and problem-solving skills
• Good communication skills
• A thorough approach; a self-starter
• Focus on quality and delivery
• Working together in teams
• Leadership and effective decision-making
• Flexible attitude
Qualifications: Bachelor's degree or above in the UK, or equivalent.
Salary: £28,000 – £38,000 per annum
Published Date: 16 Apr 2021
Closing Date: 16 May 2021
Evaluation: CV review, technical test, personal and technical interview, and references
Job Type: Full-time, permanent [part-time and fixed-term options are available]