Data Engineer-Delivery
Verisk
Escazú, San José Province, Costa Rica
5 days ago

Company Description

We help the world see new possibilities and inspire change for better tomorrows. Our analytic solutions bridge content, data, and analytics to help businesses, people, and society become stronger, more resilient, and more sustainable.

Job Description

In this role you will be responsible for expanding and optimizing our data pipeline architecture. The Data Engineer will support Product Delivery, Engineering, Analytics, and our data scientists on various initiatives, and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects.

ESSENTIAL FUNCTIONS:

  • Help architect and maintain our code base for ETL pipelines, large batch processing, and streaming systems
  • Assemble large, complex data sets that meet functional/non-functional business requirements
  • Create and maintain toolsets to help manage our enterprise data pipelines
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Ensure our code meets quality standards and architectural requirements
  • Help build out the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies
  • Write stories and associated acceptance criteria for agile/scrum workflows
  • Help keep our data separated and secure across multiple data centers and AWS regions
Qualifications

  • Experience working under agile/scrum and with the major pieces of that framework (epics, stories, sprint planning, and retros)
  • Intermediate/advanced knowledge of Python
  • Experience with and knowledge of large batch data processing
  • Experience with SQL and NoSQL databases, and an understanding of key concepts (foreign keys, indices, etc.)
  • Experience with big data technologies, preferably PySpark/Spark, Airflow, or Pandas
  • Experience seeing a system through the full end-to-end SDLC to production
  • Basic understanding of bash scripting and Linux operating systems
  • Understanding of microservice architecture and, preferably, familiarity with the Twelve-Factor App methodology
  • Experience with CI/CD tooling, such as Jenkins and GitLab
  • 1-2 years of experience developing and deploying solutions on AWS, and familiarity with the AWS ecosystem of services
  • Experience with Docker or Kubernetes a plus
  • Experience with stream-processing systems a plus
  • Experience with BDD/TDD a plus
