Barbaricum is seeking an experienced Data Engineer to support a Department of Defense (DoD) Office of Inspector General (OIG) customer. The Data Engineer will join a team focused on developing data efficiencies that strengthen the OIG's ability to discharge its oversight role of identifying waste, fraud, and abuse across the DoD enterprise. This individual will apply expertise in data engineering tools, with an emphasis on Hadoop and R, to lead the architecture of a new framework and solution that efficiently and effectively tracks large amounts of data relevant to the OIG's oversight function, with particular implications for supply chain security across DoD.

Responsibilities

  • Lead the creation, troubleshooting, and deployment of analytical processes that span multiple software platforms, including Cloudera Hadoop, R, Spark, Tableau, and SAS

  • Investigate and resolve collaboration system degradation and/or failures, and develop techniques to resolve hardware, software, and networking problems in system components

  • Maintain elevated rights on the DoD OIG network

  • Analyze collaboration system performance and benchmark data, and tune system configuration for maximum performance

  • Assist with regular training sessions and maintain open office hours for teams to directly ask for analytics software support

Qualifications

  • Active DoD Secret Clearance

  • Bachelor's degree in Computer Science, Engineering, or a related field and 10+ years of relevant experience

  • Demonstrated proficiency in data management systems and statistical packages such as Cloudera Hadoop and/or R

  • Direct experience creating sustainable, automated processes for data analysis

  • Expert at understanding and creating high-level architectural specifications

  • Advanced technical expertise in programmatically manipulating data

  • Must be able to acquire and maintain one of the following certifications: CompTIA Security+, SCNP, or SSCP

Desired Qualifications

  • Experience with Hive, Pig, Sqoop, and Flume

  • Experience with Natural Language Processing (NLP) in R, Spark, or SAS