Changing the world through digital experiences is what Adobe’s all about. We give everyone from emerging artists to global brands everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen.
We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity.
We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!
Adobe's clouds enable customers to create and manage digital content, such as assets, composites, 3D, and documents, and to deliver digital experiences and transformations.
In Creative Cloud, creative professionals and novice users alike need to manage the lifecycle of their digital assets, libraries, and the variety of creative content and documents they work with every day, from brushes and colors to images, photos, videos, 3D, and beyond.
In Experience Cloud, it is all about optimizing digital experiences and digital transformations for enterprises, where digital content rules and mobile plays a pivotal role. In Document Cloud, it is all about a paperless world, where our offerings provide a way to author content and transfer it seamlessly across users and entities.
Adobe also provides the stock image marketplace, Adobe Stock, and the creative community, Behance, both of which rely on deep machine learning to power content quality, search, discovery, organization, contributor moderation, and more, enabling faster content velocity.
We are building a new centralized Sensei Content Platform to power machine learning, dataset management, and AI across Adobe's product lines: the world's best creative tools, leading cloud services for managing digital assets and digital experiences, and leading marketplaces such as Adobe Stock and Behance.
This platform will cater to thousands of applied researchers, millions of users, and billions of content pieces. Become part of this growing team at Adobe and make a phenomenal impact in computer vision, user understanding, language understanding, and digital experience optimization.
The objective is to make our machine learning offerings a world-class, leading-edge, differentiating product in the Adobe Cloud ecosystem.
We match the pace, innovation, and excitement of a startup, backed by the resources and infrastructure of Adobe!
How can you participate? We are looking for a large-scale distributed systems engineer with a good blend of science and engineering skills.
You are an experienced engineer with a proven track record of solving critical engineering problems through strong analytical, quantitative, and engineering skills.
You are a hands-on person with strong technical and communication skills who delivers innovative technical solutions promptly.
This is an opportunity to make a huge impact in a fast-paced, startup-like environment in a great company. Join us!
Architect and design cloud infrastructure with effective computing and storage resources and an efficient networking topology that meets enterprise security compliance requirements.
Build and maintain an effective toolkit for infrastructure provisioning and management on public clouds such as AWS, Azure, or GCP.
Design and develop SDKs and backend services for emerging data capabilities.
Collaborate with clients, architects, and product management, and work within an engineering-focused, iterative team to establish product requirements.
Explore and research new and emerging big data technologies and bring them to the Sensei platform.
Work with QA and onboarding engineers to understand customer pain points and help resolve their issues.
Write and review technical documents, including requirements and design documents, for existing and continuously evolving features of the Sensei platform.
What you need to succeed
MS/PhD in Computer Science or a related field.
5+ years of industry experience building and maintaining big data pipelines and/or analytical or reporting systems at scale.
Experience working with Apache Hadoop, Spark, and related technologies such as Pig, Hive, and Oozie.
Hands-on experience with Python, Java, and/or C++.
Experience with Docker, containerization, and AWS.
Experience with microservices and REST (Representational State Transfer) APIs.
Experience with RDBMS (Relational Database Management Systems) and NoSQL databases.
Understanding of state-of-the-art big data techniques, machine learning, deep learning, and computer vision.
Strong interpersonal and communication skills.