• Apache Hadoop Developer

    Location: US-AL-Redstone Arsenal, Huntsville
    Posted Date: 2 months ago (4/4/2018 8:39 AM)
  • Overview

    Clearance Level:   Active DoD Secret or Interim Clearance

    Job Title:        Apache Hadoop Developer

    Location:  Redstone Arsenal, Huntsville, Alabama

    US Army Materiel Command Logistics Support Activity (LOGSA)



    Employment Type:   Full-Time


    Job Description:

    In this exciting role you will be an Apache Hadoop Developer working with Hive, devising solutions to complex customer problems using company- or customer-furnished computer systems and equipment or commercial off-the-shelf packages. You will collaborate with technical staff to understand requirements, develop new solutions, and resolve software problems, and you will use your expertise to design, develop, code, test, and debug software.


    Additional Info:

    • JBM is an equal opportunity employer. All applicants will be considered for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
    • Physical work location is client/government site (Redstone Arsenal) in Huntsville, Alabama.
    • Subject to a government security investigation and must meet eligibility requirements for access to classified information.



    Responsibilities:

    • Review software requirements – 10%
    • Develop and write code – 80%
    • Test software code – 5%
    • Participate in meetings and conferences – 5%


    Required Technical and Professional Expertise:

    • Hands-on experience with Apache NiFi and/or Hive software, OR HDPCD (Hortonworks Data Platform Certified Developer) and/or HDPCA (Hortonworks Data Platform Certified Administrator) certification.
    • Sound knowledge of relational databases and SQL.
    • Experience with large SQL-based systems such as Teradata or Oracle, and with Unix/Linux shell scripting, is a plus.
    • Familiarity with industry best practices and how to drive efficiency while maintaining a robust service offering.

    Preferred Technical and Professional Experience:

    • Experience in designing, reviewing, implementing and optimizing data transformation processes in Apache Hadoop.
    • Ability to consolidate, validate and cleanse data from a vast range of sources – from applications and databases to files and Web services.
    • Capable of extracting data from existing databases, flat files, web sources, or APIs.
    • Experience designing and implementing fast and efficient data acquisition using Big Data processing techniques and tools.
    • Java development experience.
    • Backend NiFi ETL transformation in Java experience.
    • Experience migrating data from Oracle to the Hadoop Distributed File System (HDFS).
    • Front-end Hive or Beeline SQL development experience.
    • Experience converting existing Oracle stored procedures into JDBC queries.
    • IBM DataStage or Informatica PowerCenter experience.
    • Experience in Apache Hadoop platforms like Hortonworks, Cloudera or MapR.


