Role: Hadoop Architect
Location: Charlotte
Work Authorization: EAD, GC, USC
· Develops, enhances, debugs, supports, maintains, and tests software applications that support business units or supporting functions. These application program solutions may involve diverse development platforms, software, hardware, technologies, and tools. Participates in the design, development, and implementation of complex applications using new technologies. May provide technical direction and system architecture for individual initiatives. Serves as a fully seasoned/proficient technical resource. May collaborate with external programmers to coordinate delivery of software applications. Routine accountability is for technical knowledge and capabilities. Should be able to work independently under minimal supervision, with general guidance from more seasoned consultants. The candidate is expected to liaise with business analysts and other technology delivery managers.
· Solid understanding of OOP languages; must have working experience in C++, Core Java, and J2EE
· Good knowledge of Hadoop cluster architecture
· Hands-on experience with Hadoop and the Hadoop ecosystem required; proven experience within the Cloudera Hadoop ecosystem (MR1, MR2, HDFS, YARN, Hive, HBase, Sqoop, Pig, Hue, etc.)
· Design and implement Apache Spark-based real-time stream-processing data pipelines involving complex data processing
· Hands-on experience developing applications using Big Data, Kafka, Cassandra, Apache Storm, Apache Spark, and related areas
· Implement complex real-time data processing algorithms in an optimized and efficient manner using Scala/Java
· Knowledge of at least one scripting language, such as Python, Unix shell scripting, or Perl, is essential for this position.
· Excellent analytical and problem-solving skills; willingness to take ownership and resolve technical challenges
· Experience performing proofs of concept for new technologies
· Must have experience working with external vendors/partners such as Cloudera, DataStax, etc.
· Strong communication and documentation skills, technology awareness, and the ability to interact with technology leaders are a must
· Good knowledge of Agile methodology and the Scrum process.
· Bachelor’s degree in Science or Engineering
· 9+ years of industry experience
· Minimum of 5-7 years of Big Data experience