Data Engineer - Databricks
TradeStation
Heredia
13 hours ago

TradeStation is an online brokerage firm seeking to level the playing field for self-directed investors and traders, empowering them to claim their individual financial edge.

At TradeStation, we're continuously pushing the boundaries of what's possible, encouraging out-of-the-box thinking and a relentless search for innovation.

We offer a collaborative work environment, competitive salaries, comprehensive benefits, and a generous PTO policy. The Enterprise Analytics team is building unique AI & ML-powered solutions that augment our equity and crypto trading platform, empowering institutional and retail investors to gain real-time insights that multiply trading gains while managing risk and improving compliance.

We are looking for a Databricks Engineer with a passion for working with big and fast data. You will work in a dynamic, highly challenging, and constantly evolving environment.

You will help evolve our data lakehouse as part of the Enterprise Analytics team and act as a subject matter expert for workspace administration and PySpark development.

If you are an enthusiastic learner who can think creatively, ask questions, and research answers, you can bring your passion and skills to solve interesting problems and have an immediate impact. Join us!

This position can be fully remote - work from home!

ESSENTIAL JOB FUNCTIONS:

  • Write Spark code to process, transform, and ensure quality across many datasets (see the sketch after this list)
  • Build and maintain Databricks workspace infrastructure on AWS
  • Apply test-driven development to big data processing
  • Build CI/CD pipelines with GitLab and the Databricks CLI
  • Consume data from streaming as well as batch sources
  • Build the necessary guardrails to keep services operational and secure
  • Build templates and tools to accelerate development
  • Work in a DevOps environment, where development teams own both development and operational responsibilities
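
The first, third, and sixth bullets above describe day-to-day Spark work. As a minimal, hedged sketch of that kind of job, the PySpark snippet below reads a batch dataset, applies a transformation, and enforces a simple quality guardrail; the table names (raw_trades, trades_clean) and the 5% rejection threshold are illustrative assumptions, not details from this posting.

# A minimal PySpark sketch, assuming hypothetical Delta tables
# "raw_trades" and "trades_clean" and a 5% rejection threshold;
# none of these names come from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-quality-sketch").getOrCreate()

# Batch read from an assumed raw table; cache so the counts below
# do not recompute the read.
raw = spark.read.table("raw_trades").cache()

# Transform: normalize symbols and filter rows that fail basic checks.
clean = (
    raw.withColumn("symbol", F.upper(F.col("symbol")))
       .filter(F.col("symbol").isNotNull() & (F.col("price") > 0))
)

# Crude quality guardrail: abort the job if too many rows are rejected.
total, kept = raw.count(), clean.count()
if total > 0 and (total - kept) / total > 0.05:
    raise ValueError(f"Quality gate failed: {total - kept} of {total} rows rejected")

# Persist the curated result; overwrite keeps the sketch idempotent.
clean.write.mode("overwrite").saveAsTable("trades_clean")

On Databricks, a guardrail like this could also be expressed as Delta Live Tables expectations; a plain assertion keeps the sketch portable.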
KNOWLEDGE, SKILLS & CORE TECHNOLOGIES:

Required:

  • Code development for data workflows on Spark (Python, Scala, and/or Java)
  • Big data infrastructure on Apache Spark (e.g. Delta, Databricks, data lakes, data warehouses, data lakehouses)
  • Bash scripts and command-line interfaces (CLIs) for Databricks and AWS
  • Agile DevOps environment utilizing CI/CD tools (e.g. GitLab CI, Azure DevOps, Jenkins)
  • Understanding of the Agile SDLC and change management
  • Good oral and written communication skills
Preferred:

  • Brokerage/trading domain knowledge and experience
  • AWS Certified Solutions Architect Associate
  • Knowledge and experience of current DevOps trends applied to data as a product (e.g. continuous learning, medallion architecture; see the sketch after this list)
  • Configuration management and deployment automation tools (e.g. CI/CD pipelines, Octopus, Ansible, Puppet, Chef)
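
For the medallion-architecture item above, the snippet below sketches a single bronze-to-silver hop with Spark Structured Streaming on Delta; the tables bronze_orders and silver_orders, the columns, and the checkpoint path are hypothetical, chosen only to make the sketch self-contained.

# Hedged sketch of one medallion hop (bronze -> silver); the tables,
# columns, and checkpoint path are hypothetical, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw events landed as-is from a streaming source.
bronze = spark.readStream.table("bronze_orders")

# Silver: typed, deduplicated, quality-filtered records. The watermark
# bounds the deduplication state to roughly one hour of event time.
silver = (
    bronze.withColumn("order_ts", F.to_timestamp("order_ts"))
          .withWatermark("order_ts", "1 hour")
          .dropDuplicates(["order_id", "order_ts"])
          .filter(F.col("quantity") > 0)
)

# Incremental write with checkpointing so the stream can resume.
(silver.writeStream
       .option("checkpointLocation", "/tmp/checkpoints/silver_orders")
       .toTable("silver_orders"))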
EDUCATION & EXPERIENCE:

  • Bachelor’s Degree in Computer Science / Engineering or equivalent work experience