
Splunk Data Analytic Subject Matter Expert

2024-146149
Other / Technical
Public Trust

Location:

Woodlawn, MD

Telecommute Options:

Flexible for occasional telework – must be local to work location
Join Our Team
About Peraton

Peraton is a next-generation national security company that drives missions of consequence spanning the globe and extending to the farthest reaches of the galaxy. As the world’s leading mission capability integrator and transformative enterprise IT provider, we deliver trusted, highly differentiated solutions and technologies to protect our nation and allies. Peraton operates at the critical nexus between traditional and nontraditional threats across all domains: land, sea, space, air, and cyberspace. The company serves as a valued partner to essential government agencies and supports every branch of the U.S. armed forces. Each day, our employees do the can’t be done by solving the most daunting challenges facing our customers. Visit peraton.com to learn how we’re keeping people around the world safe and secure.

Responsibilities

Peraton's Global Health and Financial Solutions sector is seeking a Splunk Data Analytic Subject Matter Expert to join our team of qualified, diverse individuals. This position is located in Woodlawn, MD. The candidate must live locally so they can work a hybrid schedule and come into the office when needed.

What you'll do:

 

The Splunk Data Analytic Subject Matter Expert (SME) will optimize data flow using aggregation, filtering, and related techniques. The SME will be involved in the analysis of unstructured and semi-structured data, including latent semantic indexing (LSI), entity identification and tagging, complex event processing (CEP), and the application of analysis algorithms on distributed, clustered, and cloud-based high-performance infrastructures. The SME will exercise creativity in applying non-traditional approaches to large-scale analysis of unstructured data in support of high-value use cases visualized through multi-dimensional interfaces, and will handle processing and indexing requests against high-volume data collections and high-velocity data streams.

Duties and Responsibilities:

 

  • Create a consolidated, searchable data set of aggregated sensor data sources that conforms to the common information model.
  • Develop the capability to aggregate all sensor data results into two main categories: tangible assets (hardware, software, and data) and information systems (groups of assets with a business purpose).
  • Develop the capability to tag new data so that it fits the Re-Usable data asset model and can be ingested by the Xacta.IO and CDM dashboards.
  • Create a way to translate key-value pairs from any sensor tool into the format required for consumption.
  • Transform validated data into the format needed for ingestion by Xacta.IO and the CDM Elastic file.
  • Create data pipelines and connections between data source(s) and the Re-Usable data asset model.
  • Create a connection between Splunk and the Re-Usable data asset model.
  • Establish a Xacta.IO data pipeline connection with the Re-Usable data asset model.
  • Establish a CDM Elastic data pipeline connection with the Re-Usable data asset model.
  • Develop an integrator between Splunk, Xacta.IO, and CDM Elastic.
  • Build out data warehouses and data models:
    • Tag data
    • Build out data pipelines in Splunk
    • Establish data pipeline connections
    • Develop integrators/integrations (between Splunk, DB Connect, and Xacta)
    • Aggregate various types of data
    • Create key-value pairs
    • ETL coding
    • Build out dashboards
  • Configure notable event actions, action menus, and Adaptive Responses.
  • Provide recommendations for data onboarding and data ingestion normalization.
  • Apply strong knowledge of security risk procedures, security patterns, authentication technologies, and security attack pathologies.
  • Develop, evaluate, and document specific metrics for management purposes.
  • Create dashboards to monitor traffic volumes, response times, errors, and warnings across various data centers.
  • Monitor web portals, log files, and databases.
  • Design and develop Splunk solutions for routine use.
  • Solve complex integration challenges and debug complex configuration issues.
  • Consult with stakeholders to establish, maintain, and refresh their strategic direction in cloud adoption.
  • Become knowledgeable on the technical requirements of the federal government's CDM program and understand your role in CDM activities.
  • Engage in a wide range of security issues, including architectures, firewalls, electronic data traffic, and network access.
  • Design, manage, and maintain enterprise SIEM infrastructure to improve data ingestion processes, including architectural work on data pipelines to ensure optimal data flow.
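Several of the duties above center on translating sensor-tool key-value pairs into a normalized format for downstream ingestion. As a rough illustration only (the field names, mapping, and target schema below are assumptions for the sketch, not details from this posting), such a translation step might look like:

```python
import json

# Hypothetical mapping from a sensor tool's field names to normalized,
# CIM-style field names; in practice the mapping would come from the
# target data asset model, not be hard-coded.
FIELD_MAP = {
    "src": "src_ip",
    "dst": "dest_ip",
    "sev": "severity",
}

def parse_kv(raw: str) -> dict:
    """Parse a space-delimited key=value event line into a dict."""
    pairs = {}
    for token in raw.split():
        if "=" in token:
            key, _, value = token.partition("=")
            pairs[key] = value
    return pairs

def to_cim(pairs: dict) -> str:
    """Rename fields per FIELD_MAP and emit JSON for downstream ingestion."""
    normalized = {FIELD_MAP.get(k, k): v for k, v in pairs.items()}
    return json.dumps(normalized, sort_keys=True)

event = "src=10.0.0.5 dst=10.0.0.9 sev=high action=blocked"
print(to_cim(parse_kv(event)))
```

In a real pipeline the field mapping would live in configuration rather than code, so new sensor tools can be onboarded without redeployment.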

Qualifications

Basic Qualifications:

  • Bachelor’s degree and 8 years of experience, Master's degree and 6 years of experience, or 12 years of experience in lieu of a degree.
  • 4 years of customer-focused Splunk data pipelining and SIEM engineering experience
  • 4 years of experience in a senior Splunk role, working in a clustered Splunk environment supporting SOC or NOC operations
  • A minimum of 4 years of experience with:
    • In-depth knowledge of designing, upgrading, maintaining, and implementing network devices in a large-scale enterprise
    • Direct experience with Splunk Engineering and data integration
    • Prior SIEM data modeling experience on a similar platform at scale (>50 servers)
    • Scripting and development skills in Python/Perl with deep comprehension of regular expressions
    • Coordination and communication with other remotely deployed team members
    • Developing documentation with processes and procedures
    • Proposing and implementing automation features in a large enterprise environment
  • 3 years of experience with Linux and SQL/ODBC interfaces
  • 2 years of experience with data transport and transformation APIs and technologies such as JSON, XML, XSLT, JDBC, SOAP and REST.
  • Hold an active Splunk Core certification of at least Splunk Architect
  • A minimum of 3 years of experience developing and tailoring reporting from network security tools.
  • Must be able to obtain and maintain a US Public Trust clearance.
  • US Citizenship is required

Preferred Qualifications:

  • Experience with the Splunk Common Information Model (CIM) and enterprise analytics.
  • Strong problem-solving abilities with an analytical and qualitative eye for reasoning under pressure.
  • Self-starter with the ability to independently prioritize and complete multiple tasks with little to no supervision.
  • Knowledge of Cloud Services such as AWS, Azure, Office365.
  • Ability to script in one or more of the following languages: Python, Bash, Visual Basic, or PowerShell.
  • Experience in automating Splunk Deployments and orchestration within a Cloud environment.

Target Salary Range

$112,000 - $179,000. This represents the typical salary range for this position based on experience and other factors.
EEO

An Equal Opportunity Employer including Disability/Veteran.

Benefits

At Peraton, our benefits are designed to help keep you at your best beyond the work you do with us daily. We’re fully committed to the growth of our employees. From fully comprehensive medical plans to tuition reimbursement, tuition assistance, and fertility treatment, we are there to support you all the way.

  • Paid Time-Off and Holidays
  • Retirement
  • Life & Disability Insurance
  • Career Development
  • Tuition Assistance and Student Loan Financing
  • Paid Parental Leave
  • Additional Benefits
  • Medical, Dental, & Vision Care