Senior Data Engineer

Location: Andover
Job type: Permanent
Salary: c. £60k
Contact name: Louise Howard
Contact email: louise@requireconsultancy.com
Job ref: 8432
Published: 5 months ago
Expiry date: 22 Mar 2024 23:59

As a Senior Data Engineer specialising in distributed data systems and large-scale data warehouses, you will lead the design, deployment, and maintenance of advanced cloud-based data infrastructure. These systems are crucial to meeting our client's growing needs for data-driven decision-making and process automation.

Core Responsibilities:

  • Architect, build, and maintain efficient, scalable data pipelines using Python and SQL.

  • Design, test, and implement data workflows and ETL (Extract, Transform, Load) processes within containerized Azure environments.

  • Work closely with data scientists and analysts to address complex data challenges.

  • Optimize and manage databases and data pipelines for peak performance.

  • Establish data management and governance best practices.

  • Engage in code reviews, contribute to testing, and oversee deployment initiatives.

  • Address potential security vulnerabilities within data management systems.

  • Troubleshoot and rectify issues with data pipelines, databases, and reporting.

  • Enhance data reliability, efficiency, and quality.

  • Lead significant engineering initiatives and manage key projects.

  • Drive database management and optimization efforts.

  • Promote continuous learning and development in data engineering within the team.

  • Advocate for high standards in coding, system development, and design.

  • Utilize version control tools effectively.

Required Skills and Experience:

  • A bachelor’s or master’s degree in computer science, engineering, or related fields is preferred, but significant professional experience can substitute for formal education.

  • Extensive data engineering background.

  • Proficient in SQL, with experience across relational databases and a range of other database technologies.

  • Skilled in developing and managing large-scale data pipelines and datasets.

  • Able to analyse and work with unstructured data sets.

  • Expertise in Python programming.

  • Deep knowledge of data modelling, database design, and optimization.

  • Proven experience developing with Azure cloud services.

  • Familiarity with containerization technologies.

  • Proficient in using APIs for data extraction.

  • Excellent problem-solving abilities and teamwork skills.

  • Strong communication and interpersonal skills.

Desirable Qualifications:

  • Knowledge of big data tools like Spark and Kafka.

  • Experience with pipeline and workflow management tools such as Azkaban, Luigi, or Airflow.

  • Understanding of Python machine learning libraries such as TensorFlow and PyTorch.

  • Proficiency in data visualization tools, e.g., Power BI or Tableau.

The role is based on site at an impressive headquarters in Andover, Hampshire. The company also offers excellent benefits to all new recruits. They are an evolving business, so this is a prime opportunity to be part of an expanding operation. To enquire, please contact Louise, who is handling all of their recruitment activity.