Initiate and execute complex operational workstreams to collect, cleanse, organize and store Experian’s supplier data within our information environment with the highest integrity and efficiency possible.
Verify input / output counts, monitor data processing jobs across platforms, and audit the results against historical trends.
Own on-time input and output delivery and manage your own time to meet deadlines, including taking the initiative to monitor work after hours.
Research, troubleshoot and resolve data or processing issues according to supplier, product or third-party vendor specifications. This includes advising internal teams and stakeholders on necessary solution changes or fixes, as well as performing the work to implement them.
Document processes and procedures, and maintain policies and procedures to ensure conformance with all corporate and BIS-local requirements.
Help develop and enforce standards that ensure product-ready data integrity in business applications.
Help manage and support the acquisition and integration of new data content into BIS’ data architecture, data model, business processes and business model.
Set up, build, implement and test data-to-file processing flows, including ad hoc, new-build and more complex flows involving ETL components.
Evaluate and identify where system enhancements are required, performing impact analysis and preparing Level-of-Effort forecasts.
Technical Skills:
Between 2 and 5 years of experience in the information technology industry with a focus on data management, data cleansing or similar activities and responsibilities.
Bachelor’s Degree in a relevant field or equivalent experience.
Data storage, integration, and ETL design and build skills using tools such as IBM InfoSphere, Informatica, Pentaho, Talend, DB2, Hadoop.
Planning and collaboration skills using tools such as Atlassian Confluence, Trello, Jira, Cisco Webex.
Automation skills using tools such as Jenkins, Chef, Puppet, Kubernetes / Docker, JSP.
Basic development / scripting skills using languages such as Java, Python, AngularJS, Anaconda, R.
Testing skills using tools such as Selenium with Robot Framework, IBM Rational Tester, Tricentis Tosca, Unified Functional Test (UFT).
Source code management skills using tools such as ClearCase, Endevor, Atlassian Bitbucket, GitHub.
Cross-platform job / application monitoring skills using tools such as Splunk, Dynatrace, CA-APM, IBM-APM.
Data manipulation and visualization skills using tools such as Tableau, Alteryx, SAS, Talend, Pentaho, R, Python, SQL, MySQL, NoSQL, JSON, Hive.
Feedback and issue management skills using tools such as Jira, Confluence, ServiceNow.
Microsoft Office general applications, including Visio.
Knowledge of managed file transfer, enterprise file sync-and-share protocols, and encryption methods in on-premises, cloud and big data environments is beneficial.