Data Engineer III
Catalina
San José, San José Province
1 day ago

Our Team

The EDM team defines and builds the core capabilities that Catalina's Products and Data Solutions leverage in market. We build capabilities once and re-use them across our solution suites.

The team provides the data used to extract insight and drive our market models, delivering value to our clients.

Catalina is seeking a Data Engineer III to help us build our new Data platform by developing cutting edge, data-driven solutions for retailer and CPG customers.

This is a great opportunity for a passionate data engineer with experience in digital marketing technology and a desire to launch innovative ad tech solutions.

The data engineer will participate in developing, unit testing, documenting, and deploying data-driven solutions. The successful candidate possesses strategic thinking skills, a passion for product development, and a proven background developing new products and scaling existing ones.

This position requires an engaging, innovative and collaborative approach. The candidate must demonstrate the ability to effectively work with other leaders across multiple functional areas.

The Platform

The EDM team is looking to create a new data platform to ingest millions of records from our vast retailer network. This platform will leverage DaaS principles to allow for self-service, scalable and easily accessible environments.

The team needs capable Data Engineers to design cutting edge solutions using PaaS offerings as well as open source technologies.

This opportunity will allow engineers to participate in tool selection, design review, code development (ingestion, transformation, and consumption layers), and implementation into on-prem and/or cloud environments.

This data will then be leveraged for business insights and shopper behavioral analysis.

Responsibilities

  • Develop data solutions using the following tools and technologies: HDFS / Azure Data Lake, Hive, Databricks, Spark (Scala), the Azure tool suite (Data Factory, Event Hubs, Azure DevOps), and other open-source technologies, delivering stable, high-quality code within deadlines and following established processes.
  • Apply knowledge of cloud data warehouse solutions such as Snowflake and Synapse.
  • Maintain in-depth knowledge of data ecosystem and trends; be a subject matter expert and thought leader.
  • Participate in peer review code sessions to ensure quality of code.
  • Actively participate in SCRUM ceremonies ensuring the velocity of the team continues to improve and work becomes more streamlined.
  • Actively participate in performance tuning to maximize resources.
  • Track and resolve data issues, demonstrating creative problem-solving skills.
  • Clearly communicate with management on proposed solutions / challenges.
  • Document solutions following company standards and clearly communicate designs.
  • Support production environment and previously deployed solutions.
  • Recommend new tools to improve the current stack and solve data needs efficiently.
  • Develop / unit test and deploy data solutions (high complexity).
  • Present solutions to product owners / managers.
  • Participate in talent acquisition tasks such as resume review and interviews.
  • Develop new standards such as Data Recipes, Checklists, Deployment Standards, etc.
Qualifications

  • 5-9+ years of experience building data solutions with some of these technologies: HDFS / Azure Data Lake, Hive, Databricks, Spark (Scala), the Azure tool suite (Data Factory, Event Hubs, Azure DevOps).
  • 5-9+ years of experience with Linux/Unix systems and scripting.
  • 5-9+ years of experience with RDBMS and DWH solutions (Netezza, Snowflake, Synapse).
  • Experience in work automation.
  • Working knowledge of Agile software engineering processes.
  • Able to develop / unit test and deploy complex data solutions.
  • Able to mentor fellow co-workers to expand overall team skills.
  • Positive attitude towards challenges.
  • Advanced communication skills to present solutions clearly to the team and users.
  • Process mapping experience.
  • Experience working with Azure Data Factory and other scheduling tools.
  • Experience with programming languages (Python, Scala).
  • Community developer presence (GitHub, Apache, open-source projects, etc.).