Overview
As a Data Engineer II, you will work in a global environment and make significant individual contributions to enterprise application integration efforts. You will own end-to-end data integration. Candidates should be able to develop an understanding of the business domain and correlate data with real-world business challenges.

About the Team
The Application Architecture & Integration Services team, part of the IT Organization, is responsible for democratizing data to enable an adaptive enterprise.
The team comprises talented, cross-functional engineers who build an application network and a single source of truth, with secure, reusable integrations and APIs.
We help Akamai sustain a data-driven culture. Our platform comprises MuleSoft, Oracle Data Integrator (ODI), Exadata, and other open-source technologies and DevOps services.

Responsibilities
Design, develop, and support ETL mappings using Oracle Data Integrator to meet company-wide integration needs
Gather requirements for new integrations and provide optimized designs
Write optimized SQL for data extraction, performance tuning, and integration
Reconcile and debug data flows by following data lineage
Build a self-service data engineering and analytics platform
Work in multiple programming languages, such as Python and Java
Adapt to new technologies and apply the best-fit technology for each use case
Act as a change agent and facilitator to get things done

Basic Qualifications
MTech / ME / BTech / BE in Computer Science or a related field
Proficient understanding of ODI, SQL, ETL design, data modelling, and modern integration architectures
Ability to automate manual tasks and improve the efficiency of code and the platform
3+ years' experience developing ETL, with ODI as the preferred tool
3+ years' experience developing SQL code
2+ years' experience with Unix and cron scripting
Understanding of Oracle DB and ODI internals
Excellent verbal and written communication skills

Desired Qualifications
Streaming ETL using Kafka or a related tool
Programming experience in an object-oriented language such as Java or Python
Ability to scale up to a Big Data environment