Experian IT Services is looking to fill multiple Big Data Administrator positions to implement and support big data platforms.
As part of this team, you will work with cross-functional teams & help leverage innovative technologies such as Hadoop, Spark, Cassandra, etc., to build next-generation products.
As a Big Data Administrator you will set up, configure, & maintain infrastructure globally. You will also be responsible for deployments, performance tuning, capacity planning, problem resolution, change management, and security within the big data platforms. You will also help define standards & best practices, build & manage highly available, scalable systems, and deploy appropriate monitoring and alerting.
You are expected to work proactively to improve the evolving environment to meet business requirements.
If you have the skills and a can-do attitude, we would love to talk to you!
Work with Architecture & BU teams to build, install, configure, and maintain big data platforms based on architecture specifications & business requirements.
Perform operational duties such as Spark / Hadoop / Cassandra cluster upgrades, patching, monitoring jobs, capacity management, etc.
Interface with infrastructure, network, security and operations teams to resolve problems with application / Big Data systems.
Proactively evaluate evolving technologies and recommend solutions to business problems. Provide production support on a 24x7 rotation.
2 years of experience with Hadoop and NoSQL technologies (Cloudera Hadoop, Spark, Cassandra, Solr, etc.); production implementation experience desired.
Experience with the application software development life cycle; able to influence application projects and provide guidance as needed.
Must have strong working knowledge of Big Data components such as YARN, HDFS, HBase, Spark, etc.
Experience automating deployments and monitoring / alerting tasks
Knowledge of programming & scripting languages (Java, C, Python, R, Perl, Hive, MapReduce, etc.) a plus
Experience with cloud infrastructure such as AWS, OpenShift, etc.
Working knowledge of YARN, HBase, Hive, Spark, Flume, Kafka, etc.
Strong problem-solving and creative thinking skills
Ability to multi-task and manage multiple assignments simultaneously
Effective oral and written communication skills
Experience working with geographically distributed teams
Bachelor's or Master's degree in Computer Science, or equivalent experience