Kafka SME - London, United Kingdom - Legal & General

Posted by: Tom O'Connor, beBee Recruiter


Description

Legal & General are currently looking for a true Kafka Subject Matter Expert / Lead Kafka Data Engineer to lead the development of their real-time Kafka data streaming platforms.

You will be the SME for Kafka consumer, producer, and cluster software.

You will work collaboratively with Data Engineers and Solution Architects to ensure the integrity, consistency, and availability of data across end-to-end real-time data streaming platforms.
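
For illustration only, the sketch below shows the shape of the producer and consumer code this role would own, using the confluent-kafka Python client; the broker address, topic name, and record contents are assumptions for the example, not details of the role.

    # Minimal producer/consumer sketch (confluent-kafka Python client).
    # Broker address, topic name, and payload are illustrative assumptions.
    from confluent_kafka import Producer, Consumer

    producer = Producer({"bootstrap.servers": "localhost:9092"})
    producer.produce("policy-events", key="policy-123", value='{"status": "active"}')
    producer.flush()  # block until queued messages are delivered

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "policy-events-readers",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["policy-events"])
    msg = consumer.poll(5.0)  # wait up to 5 seconds for a record
    if msg is not None and msg.error() is None:
        print(msg.key(), msg.value())
    consumer.close()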


Key duties:

  • Design, develop, and maintain Kafka real-time data streaming platform software.
  • Retain accountability for Kafka real-time data streaming software development and Kafka cluster configuration throughout the project lifecycle, ensuring that established software engineering best practices and relevant group standards are met.
  • Work within an agile methodology, providing input into project and resource plans and supporting business case production.
  • Work with Data and Support Engineers in Root Cause Analysis of defects.
  • Work with Test Automation Engineers to define Acceptance Criteria, build test scenarios, write test cases, and run tests within the Software Development Lifecycle.
  • Identify issues and risks and lead their resolution as required.
  • Identify and implement opportunities for innovation and continuous improvement by analysing Kafka cluster and Confluent Platform software components, Kafka producers, and Kafka consumers, then documenting the findings with pros, cons, and recommendations.

Key skills and experience:

  • Demonstrable programming skills in writing business-critical code that integrates with Kafka; proficient in more than one programming language, including Python, Java, Perl, C++, or C.
  • Demonstrable software engineering skills; understands and promotes software engineering methodologies such as BDD, TDD, and Extreme Programming.
  • Demonstrable knowledge of the configuration and operation of Kafka real-time data streaming clusters (Confluent Cloud Kafka desirable); understands Kafka and Confluent Platform components, configuration, tuning, data integrity, and recovery.
  • Demonstrable understanding of authentication and authorisation tooling, frameworks, and specifications such as SASL/SCRAM and TLS client authentication, with experience of implementing them (see the configuration sketch after this list).
  • Demonstrable knowledge of data transformation techniques and data storage formats, including JSON, Avro, and structured and unstructured data; ANSI SQL skills and knowledge of relational (RDBMS) data concepts (see the Avro serialisation sketch after this list).
  • Demonstrable knowledge of working with DevOps tools and techniques for rapid deployment to AWS Cloud (preferable) or Azure/GCP Cloud; Git usage and conceptual understanding are essential, as are shell scripting and DSLs (HSL etc.).
  • Demonstrable knowledge of writing RESTful APIs, or clients which interact with RESTful APIs, with an understanding of the OpenAPI Specification.
  • Experience in highly regulated environments, such as healthcare or financial services.
  • Confluent Certified Developer for Apache Kafka (CCDAK) and/or Confluent Certified Administrator for Apache Kafka (CCAAK) would be highly desirable.
  • Experience of integrating Kafka with Change Data Capture (CDC) software (e.g. Attunity/Qlik Replicate or CR8) as a Kafka Producer would be advantageous.
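
As a purely illustrative companion to the authentication and authorisation point above, the following sketch shows client configuration for SASL/SCRAM and for TLS client authentication using the confluent-kafka Python client; the endpoints, credentials, and file paths are placeholder assumptions, not details from the role.

    # Configuration sketch: SASL/SCRAM and TLS client authN with confluent-kafka.
    # All endpoints, usernames, and file paths below are placeholder assumptions.
    from confluent_kafka import Producer

    scram_producer = Producer({
        "bootstrap.servers": "broker.example.com:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "SCRAM-SHA-512",
        "sasl.username": "svc-streaming",
        "sasl.password": "change-me",  # source from a secrets manager in practice
        "ssl.ca.location": "/etc/kafka/ca.pem",
    })

    mtls_producer = Producer({
        "bootstrap.servers": "broker.example.com:9094",
        "security.protocol": "SSL",
        "ssl.ca.location": "/etc/kafka/ca.pem",
        "ssl.certificate.location": "/etc/kafka/client.pem",  # client cert for TLS authN
        "ssl.key.location": "/etc/kafka/client.key",
    })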
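
Similarly, for the data-format point, here is a minimal sketch of serialising a record as Avro through Confluent Schema Registry, using confluent-kafka's schema_registry helpers; the schema, registry URL, and topic name are assumptions made for the example.

    # Avro serialisation sketch via Confluent Schema Registry (confluent-kafka).
    # Schema, registry URL, and topic name are illustrative assumptions.
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer
    from confluent_kafka.serialization import SerializationContext, MessageField

    schema_str = """
    {
      "type": "record",
      "name": "PolicyEvent",
      "fields": [
        {"name": "policy_id", "type": "string"},
        {"name": "status", "type": "string"}
      ]
    }
    """
    registry = SchemaRegistryClient({"url": "https://schema-registry.example.com"})
    serializer = AvroSerializer(registry, schema_str)

    # Produces Confluent wire-format bytes (magic byte + schema id + Avro body),
    # ready to pass as the value argument of Producer.produce().
    payload = serializer(
        {"policy_id": "policy-123", "status": "active"},
        SerializationContext("policy-events", MessageField.VALUE),
    )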
