Cloud Data Engineer (Top Secret Clearance) - US Citizens Only
Summary
Title: Cloud Data Engineer (Top Secret Clearance) - US Citizens Only
ID: 5332
Location: USA
Department: Information Technology
Description

Only U.S. citizens with an active Top Secret clearance should apply.
FIDES is looking for a Data Engineer with an active Top Secret Clearance and prior cloud experience.

The Cloud Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure. The successful candidate will have a solid foundation in data engineering principles, substantial hands-on experience with AWS, Databricks, and Microsoft technologies, and a proven track record of implementing robust data solutions. Responsibilities for this role include:

  • Collaborate with cross-functional teams to understand data requirements and design appropriate solutions using AWS, Databricks, and Microsoft technologies.
  • Develop and maintain efficient ETL pipelines to ingest, process, and transform large volumes of data.
  • Implement data quality checks and monitoring processes to ensure data integrity and reliability.
  • Optimize and tune Databricks clusters for performance and scalability.
  • Work closely with data scientists and analysts to support their data needs and enable advanced analytics.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • 3-5 years of experience in data engineering and big data technologies.
  • Active Top Secret (TS) clearance with SCI eligibility.
  • Experience supporting the Department of Defense (DoD) or Intelligence Community (IC).
  • Experience with Databricks, Python, and SQL.
  • Strong knowledge of data warehousing concepts and best practices.
  • Hands-on experience deploying and maintaining data pipelines using Databricks or similar tools.
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.

Preferred Qualifications

  • Cloud certifications and other relevant certifications.
  • Experience with other big data technologies such as Apache Spark, Hadoop, or Kafka.
  • Knowledge of containerization technologies such as Docker and Kubernetes.
  • Experience with version control systems such as Git.