Location: Ft. Meade, Maryland, United States (Full-Time)

CANDIDATES ARE REQUIRED TO HAVE AN ACTIVE TS/SCI FULL SCOPE WITH POLYGRAPH TO BE CONSIDERED FOR THE POSITION.

Required Qualifications:

  • Shall have at least eight (8) years' experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. 
  • Shall have demonstrated experience working with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, Bigtable, etc. 
  • Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. 
  • Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS). 
  • Shall have demonstrated work experience with serialization formats such as JSON and/or BSON. 
  • Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system. 
  • Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products. 
  • Shall have at least three (3) years' experience in software integration and software testing, to include developing and implementing test plans and test scripts. 
  • Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project. 
  • Experience developing and deploying: data-driven analytics; event-driven analytics; sets of analytics orchestrated through rules engines.
  • Experience developing and deploying: analytics that include foreign-language processing; analytic processes that incorporate/integrate multimedia technologies, including speech, text, image, and video exploitation.
  • Experience with analytics that function on massive data sets, for example more than a billion rows or larger than 10 petabytes; analytics that employ semantic relationships (e.g., inference engines).
  • Experience with structured and unstructured data sets; analytics that identify latent patterns between elements of massive data sets, for example more than a billion rows or larger than 10 petabytes.
  • Experience with analytics that employ techniques commonly associated with artificial intelligence, for example genetic algorithms. 
  • Shall have at least six (6) years of experience developing software with high-level languages such as Java, C, or C++. 
  • Shall have demonstrated work experience developing RESTful services. 
  • Shall have at least five (5) years' experience developing software for Windows (2000, 2003, XP, Vista) or UNIX/Linux (Red Hat versions 3-5) operating systems. 
  • Experience designing and developing automated analytic software, techniques, and algorithms.
  • Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
  • Experience developing and deploying analytics that discover and exploit social networks.
  • Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces (APIs), and other technical specifications. 
  • Experience developing and deploying analytics within a heterogeneous schema environment.
  • Experience with linguistics (grammar, morphology, concepts).
  • Understanding of big-data cloud scalability (Amazon, Google, Facebook).
  • Must have a current Hadoop/Cloud Developer Certification.

NOTE: A degree in Communications, Computer Science, Mathematics, Accounting, Information Systems, Program Management, or a similar discipline will be considered a degree in a technical field.