Elasticsearch Data Engineer (Secret) - Federal Staffing Solutions Inc.
Fort George G Meade, MD 20755
About the Job
Qualifications:
- Education/Experience: 8+ years of experience and a B.S. in Engineering, Computer Science, Mathematics, Statistics, Physics, Electrical Engineering, Computer Engineering, Data Science, or Data Analytics. Additional experience may be accepted in lieu of a degree.
- Data Management & Collaboration: Proficiency with Data Management platforms and strong communication skills for effective collaboration with virtual teams of data engineers and DevOps engineers.
- Software Development Lifecycle: Experience following a software development lifecycle, with the ability to develop and maintain production-quality code.
- Security Clearance: Ability to obtain interim Secret DoD Security clearance before the start date.
Preferred Qualifications:
- Data Automation: Experience automating data cleansing, formatting, staging, and transformation processes.
- Text Mining & ELK Stack: Proficiency with text mining tools, summarization, search (ELK Stack), entity extraction, training set generation, and anomaly detection.
- CI/CD & Containerization: Familiarity with CI/CD techniques for developing and releasing software through containerized pipelines.
- BI Tools & Search Analytics: Knowledge of BI tools (e.g., Kibana, Splunk) and experience with developing search and analytics applications.
- Big Data Technologies: Experience with Elasticsearch, Logstash, Kibana, Kafka, ksql, NiFi, Apache Spark, ServiceNow.
- Elastic Engineer Certification: Certified Elastic Engineer with experience developing Logstash and ingest pipelines (see the ingest pipeline sketch after this list).
- Streaming ETL: Experience developing in Confluent ksql and kstreams for data ETL purposes.
- Kubernetes Expertise: Familiarity with Kubernetes and deployment of containers.
- Agile Processes: Experience with Agile methodologies and related tools.
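As a concrete illustration of the "ingest pipelines" item above, the following is a minimal sketch using the official elasticsearch Python client (7.x-style body= arguments). The endpoint, pipeline name, index name, and processors are illustrative assumptions, not requirements from this posting.

```python
"""Minimal sketch: register an Elasticsearch ingest pipeline and index a
document through it. Assumes the official elasticsearch Python client and
a reachable cluster; names below are illustrative only."""
from elasticsearch import Elasticsearch

es = Elasticsearch(["https://localhost:9200"])  # hypothetical endpoint

# Define a pipeline that parses a raw log line and stamps the ingest time.
es.ingest.put_pipeline(
    id="satcom-metadata-enrich",  # hypothetical pipeline name
    body={
        "description": "Parse raw log line and add an ingest timestamp",
        "processors": [
            {"grok": {
                "field": "message",
                "patterns": ["%{TIMESTAMP_ISO8601:event_time} %{GREEDYDATA:payload}"],
            }},
            {"set": {"field": "ingested_at", "value": "{{_ingest.timestamp}}"}},
        ],
    },
)

# Route a document through the pipeline at index time.
es.index(
    index="satcom-metadata",  # hypothetical index name
    pipeline="satcom-metadata-enrich",
    body={"message": "2024-05-01T12:00:00Z link degraded on transponder 7"},
)
```

A Logstash pipeline covers similar ground when heavier parsing or multiple outputs are needed; the node-local ingest pipeline shown here keeps the transform inside the cluster.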
Essential Requirements: US Citizenship is required. An active Secret clearance is required.
Physical Demands: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job, with or without reasonable accommodation. While performing the duties of this job, the employee will regularly sit, walk, stand, and climb stairs and steps, and may need to walk a long distance from parking to the workstation. Occasionally, movement that requires twisting at the neck and/or trunk more than the average person, squatting/stooping/kneeling, reaching above the head, and forward motion will be required. The employee will continuously be required to repeat the same hand, arm, or finger motion many times; manual and finger dexterity are essential to this position. Specific vision abilities required by this job include close vision, distance vision, depth perception, and the ability to distinguish colors. The employee must be able to communicate through speech with clients and the public. Hearing requirements include conversation in both quiet and noisy environments. Lifting may require floor-to-waist, waist-to-shoulder, or shoulder-to-overhead movement of up to 20 pounds. This position demands tolerance for various levels of mental stress.
Job Duties:
- Data Analysis & Problem Solving: Analyze quantitative and qualitative data to solve stakeholder problems and improve business efficiency.
- Design, implement, and document solutions as repeatable processes.
- ETL Pipeline Development: Perform extraction, transformation, and load (ETL) tasks. Develop and integrate data sets from diverse environments to support use cases involving network, performance, application, and configuration data (see the ETL sketch after this list).
- Data Modeling & Management: Develop, test, and maintain both physical and logical data models. Ensure consistency, quality, accuracy, and security of data by managing relevant metadata in support of the project.
- Identify and resolve Elasticsearch issues, including slow queries and indexing problems (see the troubleshooting sketch after this list).
- Adherence to Governance & SecDevOps: Follow GMS Data Governance and SecDevOps policies to develop, test, deploy, and maintain data engineering pipelines.
- Cross-Team Collaboration: Work within a matrixed organization to collaborate with primary project leadership while maintaining standard practices with GMS core teams.
- Combine software and data engineering practices to strengthen enterprise data governance.
- System Architecture & Data Transformation: Apply knowledge of system architecture, network, and Centralized Logging (ELK) to support data transformation efforts.
- Data Analytics & Visualization: Secure, maintain, optimize, and document analytics and visualization solutions, including some design and build responsibilities.
- Agile Practices: Follow Agile scrum practices in daily operations.
- Elastic Cluster Management: Deploy and manage Elastic clusters on Kubernetes in both on-premise and cloud environments.
- Platform Expansion: Expand data platforms and analytics solutions using Elastic and Confluent platforms, focusing on SATCOM metadata for dashboard and reporting visualizations.
- Customer Visualization Support: Support customer-driven visualization requirements and collaborate on data integration and Kibana dashboard development.
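The ETL duty above is the most code-centric of these responsibilities. The snippet below is a minimal sketch of an extract-transform-load flow into Elasticsearch using the official elasticsearch Python client and its helpers.bulk utility; the endpoint, CSV path, field names, and index name are illustrative assumptions, not details from this posting.

```python
"""Minimal ETL sketch: extract rows from a CSV export, apply a light
transformation, and bulk-load them into Elasticsearch. Names and paths
are illustrative only."""
import csv
from datetime import datetime, timezone

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(["https://localhost:9200"])  # hypothetical endpoint


def extract_and_transform(path):
    """Yield bulk actions with normalized field names and typed values."""
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            yield {
                "_index": "network-performance",  # hypothetical index
                "_source": {
                    "node": row["node_name"].strip().lower(),
                    "latency_ms": float(row["latency"]),
                    "observed_at": row["timestamp"],
                    "loaded_at": datetime.now(timezone.utc).isoformat(),
                },
            }


# Load: stream the transformed documents to the cluster in batches.
ok, errors = helpers.bulk(es, extract_and_transform("perf_export.csv"),
                          raise_on_error=False)
print(f"indexed {ok} documents, {len(errors)} failures")
```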
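For the Elasticsearch troubleshooting duty (slow queries and indexing problems), a common starting point is the search profile API together with the index search slow log. The sketch below assumes the same Python client; the index name, query, and thresholds are illustrative only.

```python
"""Minimal troubleshooting sketch: profile a suspect query and enable the
search slow log on an index so recurring offenders are captured."""
from elasticsearch import Elasticsearch

es = Elasticsearch(["https://localhost:9200"])  # hypothetical endpoint

# Profile a query to see which components dominate execution time.
resp = es.search(
    index="satcom-metadata",  # hypothetical index
    body={
        "profile": True,
        "query": {"wildcard": {"payload": {"value": "*degraded*"}}},
    },
)
for shard in resp["profile"]["shards"]:
    for search in shard["searches"]:
        for q in search["query"]:
            print(q["type"], q["description"], q["time_in_nanos"])

# Enable the search slow log with example thresholds for ongoing capture.
es.indices.put_settings(
    index="satcom-metadata",
    body={
        "index.search.slowlog.threshold.query.warn": "2s",
        "index.search.slowlog.threshold.fetch.warn": "1s",
    },
)
```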