Data engineer jobs in Virginia

- 4,388 jobs
  • Mid-level DevOps Engineer (TS/SCI)

    Vantor

    Data engineer job in Herndon, VA

    Vantor is forging the new frontier of spatial intelligence, helping decision makers and operators navigate what's happening now and shape what's coming next. Vantor is a place for problem solvers, changemakers, and go-getters, where people are working together to help our customers see the world differently, and in doing so, be seen differently. Come be part of a mission, not just a job, where you can shape your own future, build the next big thing, and change the world.

    To be eligible for this position, you must be a U.S. Person, defined as a U.S. citizen, permanent resident, asylee, or refugee. Note on Cleared Roles: If this position requires an active U.S. Government security clearance, applicants who do not currently hold the required clearance will not be eligible for consideration. Employment for cleared roles is contingent upon verification of clearance status. Export Control/ITAR: Certain roles may be subject to U.S. export control laws, requiring U.S. person status as defined by 8 U.S.C. 1324b(a)(3). Please review the job details below.

    Vantor is seeking a Mid-Level DevOps Engineer located in Herndon, VA, to support the development, integration, and cybersecurity compliance of various intelligence capabilities in a development and, subsequently, production environment. You will be a member of a project team responsible for designing and maintaining multiple DevOps pipelines that integrate and deploy numerous components into the larger system technology stack. Deployment will be across multiple networks into private cloud infrastructure self-hosted for our government customer. As a Vantor team member, you will closely support our mission partners, and your work will have direct mission impact.
In this role, you'll work closely with other DevOps engineers, software developers, infrastructure technicians, and cybersecurity professionals, leveraging open-source technology to create and maintain the full stack of a High-Performance Computing (HPC) system that hosts a diverse range of applications served out to thousands of users.

    About Us: We are a multi-faceted technical team with expertise spanning the full stack of cloud computing technologies, including systems administration, DevOps, cybersecurity, software development, and systems engineering, that builds and maintains software applications backed by a self-managed cloud infrastructure with a true big-data footprint (over 10 petabytes). Our diverse background of experience in mission support and technology development and operations serves as a catalyst to solve unique and challenging intelligence problems in support of special operations analysts and their ongoing activities. Prototyping and frequent, iterative feedback are core to our delivery approach, anchored by a need to work quickly in support of our missions.

    Principal Responsibilities: Deploy and manage highly available applications to ensure system reliability and scalability. Implement and maintain HashiCorp Nomad clusters and Kubernetes clusters for workload orchestration. Execute DNS configuration, management, and performance tuning for enterprise-grade systems. Develop and implement Infrastructure-as-Code (IaC) solutions using tools like Terraform, Ansible, or similar. Build and manage multiple CI/CD pipelines with GitLab or equivalent tools to automate deployments and streamline development workflows for rapid development and integration. Perform system monitoring, logging, and troubleshooting to proactively identify and resolve issues. Automate security testing and monitoring within the DevOps workflows using ACAS and Trivy. Analyze cybersecurity scan findings and work with the cybersecurity team to identify false positive findings.
Assist the cybersecurity team in assembling the required Body of Evidence to submit for False Positive exceptions. Assist the cybersecurity team in assembling the required Body of Evidence to submit containerized and non-containerized software packages for enterprise software approval. Integrate static code analysis tools such as GitLab SAST, Fortify, or SonarQube and other security mechanisms into CI/CD pipelines. Build and maintain custom tools to automate cybersecurity analysis and correlation workflows as new cybersecurity compliance requirements emerge. Perform cybersecurity remediation on DevOps-managed Virtual Machines, to include OS patching, OS STIGing, and software package updates. Build, maintain, and monitor configuration management of release products. Troubleshoot and resolve network, automation pipeline, and infrastructure issues.

    Minimum Requirements: Must have a current/active TS/SCI and be willing and able to pass a CI polygraph. Must be able and willing to work in a SCIF environment in Herndon, Virginia, forty hours a week. 5+ years of industry experience as a DevOps engineer. Strong expertise in cloud environments, including deployment, optimization, and troubleshooting. Proven track record in building and operating cloud-native applications using tools like Kubernetes and Docker. Experience in managing, integrating, and utilizing PKI certificates for user and Non-Person-Entity authentication and authorization. In-depth experience with IaC tools such as Terraform, Ansible, or equivalent. Solid experience in creating and managing CI/CD build pipelines using GitLab or similar tools (e.g., Jenkins, Azure DevOps). Strong scripting and automation skills using Python, Bash, or equivalent languages. Excellent problem-solving skills with attention to detail and the ability to thrive in a fast-paced environment. Experience with security best practices in DevOps pipelines (e.g., Trivy, Grype, GitLab SAST, SonarQube, etc.).
Familiarity with monitoring tools like Prometheus, Grafana, or the ELK Stack. Strong knowledge of networking and load balancing technologies. Experience with source configuration management tools such as Git. CI/CD development experience with technologies like Bash, Jenkins, or GitLab. Experience with automated deployment technologies such as CloudFormation, Ansible, Puppet, or Chef. Cloud technologies deployment experience. Open-source application deployments and maintenance. Custom source application deployments and maintenance. Moderate Linux system administration experience (Red Hat, Rocky Linux, Alma Linux, or similar). Working knowledge of Linux and Windows operating systems, web services, and SQL databases. Experience working in an Agile environment.

    Desired Skills: Bachelor's degree in Computer Science, Information Systems, or a related discipline. Master's degree in Computer Science, Information Systems, or a related discipline. Security+ or comparable certification for privileged user access. Experience with distributed processing methods and tools, such as REST APIs, microservices, and IaaS/PaaS services. Experience developing and deploying web services. Experience implementing Docker STIGs. Experience with technical cybersecurity remediation in the context of a Continuous Monitoring (CONMON) program. RHCSA or LPIC1/LPIC2 certifications or equivalent. Certified Kubernetes Administrator certification. Docker Certified Associate. #LI-CJ1 #cjpost #LI-Onsite

    Pay Transparency: In support of pay transparency at Vantor, we disclose salary ranges on all U.S. job postings. The successful candidate's starting pay will fall within the salary range provided below and is determined based on job-related factors, including, but not limited to, the experience, qualifications, knowledge, skills, geographic work location, and market conditions.
Candidates with the minimum necessary experience, qualifications, knowledge, and skillsets for the position should not expect to receive the upper end of the pay range. ● The base pay for this position within the Washington, DC metropolitan area is: $113,000.00 - $188,000.00 annually. For all other states, we use geographic cost of labor as an input to develop market-driven ranges for our roles, and as such, each location where we hire may have a different range. Benefits: Vantor offers a competitive total rewards package that goes beyond the standard, including a robust 401(k) with company match, mental health resources, and unique perks like student loan repayment assistance, adoption reimbursement and pet insurance to support all aspects of your life. You can find more information on our benefits at: ****************************** The application window is three days from the date the job is posted and will remain posted until a qualified candidate has been identified for hire. If the job is reposted regardless of reason, it will remain posted three days from the date the job is reposted and will remain reposted until a qualified candidate has been identified for hire. The date of posting can be found on Vantor's Career page at the top of each job posting. To apply, submit your application via Vantor's Career page. EEO Policy: Vantor is an equal opportunity employer committed to an inclusive workplace. We believe in fostering an environment where all team members feel respected, valued, and encouraged to share their ideas. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender identity, sexual orientation, disability, protected veteran status, age, or any other characteristic protected by law.
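One of the responsibilities above is analyzing scanner output (e.g., Trivy findings) and separating known false positives for the cybersecurity team. A minimal sketch of that triage step in Python; the finding fields and the allowlist of accepted false positives are illustrative assumptions, not Vantor's actual tooling:

```python
# Sketch: partition vulnerability-scan findings into actionable items
# vs. known false positives tracked in an approved allowlist.

def triage_findings(findings, false_positive_ids):
    """Split scan findings into (actionable, suppressed) lists.

    findings: iterable of dicts with at least "id" and "severity" keys.
    false_positive_ids: set of vulnerability IDs already accepted as
    false positives (e.g., via an approved Body of Evidence).
    """
    actionable, suppressed = [], []
    for f in findings:
        (suppressed if f["id"] in false_positive_ids else actionable).append(f)
    # Surface critical/high items first for remediation.
    actionable.sort(key=lambda f: {"CRITICAL": 0, "HIGH": 1}.get(f["severity"], 2))
    return actionable, suppressed

scan = [
    {"id": "CVE-2024-0001", "severity": "HIGH"},
    {"id": "CVE-2024-0002", "severity": "LOW"},
    {"id": "CVE-2023-9999", "severity": "CRITICAL"},
]
act, sup = triage_findings(scan, {"CVE-2024-0002"})
```

In a real pipeline, `findings` would come from parsing the scanner's JSON report rather than an inline list.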
    $113k-188k yearly 6d ago
  • Cybersecurity Engineer III **

    SimVentions, Inc. (Glassdoor rating 4.6)

    Data engineer job in Virginia Beach, VA

    SimVentions, consistently voted one of Virginia's Best Places to Work, is looking for an experienced cybersecurity professional to join our team! As a Cybersecurity Engineer III, you will play a key role in advancing cybersecurity operations by performing in-depth system hardening, vulnerability assessment, and security compliance activities in accordance with DoD requirements. The ideal candidate will have a solid foundation in cybersecurity practices and proven experience supporting both Linux and Windows environments across DoD networks. You will work collaboratively with Blue Team, Red Team, and other cybersecurity professionals on overall cyber readiness defense and system accreditation efforts. ** Position is contingent upon award of contract, anticipated in December of 2025. ** Clearance: An ACTIVE Secret clearance (IT Level II Tier 5 / Special-Sensitive Position) is required for this position. Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information. US Citizenship is required to obtain a clearance.
Requirements: In-depth understanding of computer security, military system specifications, and DoD cybersecurity policies Strong ability to communicate clearly and succinctly in written and oral presentations Must possess one of the following DoD 8570.01-M IAT Level III baseline certifications: CASP+ CE CCNP Security CISA CISSP (Associate) CISSP GCED GCIH CCSP Responsibilities: Develop Assessment and Authorization (A&A) packages for various systems Develop and maintain security documentation such as: Authorization Boundary Diagram System Hardware/Software/Information Flow System Security Plan Privacy Impact Assessment e-Authentication Implementation Plan System Level Continuous Monitoring Plan Ports, Protocols and Services Registration Plan of Action and Milestones (POA&M) Conduct annual FISMA assessments Perform Continuous Monitoring of Authorized Systems Generate and update test plans; conduct testing of the system components using the Assured Compliance Assessment Solution (ACAS) tool, implement Security Technical Implementation Guides (STIG), and conduct Information Assurance Vulnerability Management (IAVM) reviews Perform automated ACAS scanning, STIG, SCAP checks (Evaluate STIG, Tenable Nessus, etc.) 
on various standalone and networked systems Analyze cybersecurity test scan results and develop/assist with documenting open findings in the Plan of Action and Milestones (POA&M) Analyze DISA Security Technical Implementation Guide test results and develop/assist with documenting open findings in the Plan of Action and Milestones Preferred Skills and Experience: A combined total of ten (10) years of full-time professional experience in all of the following functional areas: Computer security, military system specifications, and DoD cybersecurity policies National Cyber Range Complex (NCRC) Total Ship Computing Environment (TSCE) Program requirements and mission, ship install requirements, and protocols (preferred) Risk Management Framework (RMF), and the implementation of Cybersecurity and IA boundary defense techniques and various IA-enabled appliances. Examples of these appliances and applications are Firewalls, Intrusion Detection System (IDS), Intrusion Prevention System (IPS), Switch/Routers, Cross Domain Solutions (CDS), EMASS and, Endpoint Security Solution (ESS) Performing STIG implementation Performing vulnerability assessments with the ACAS tool Remediating vulnerability findings to include implementing vendor patches on both Linux and Windows Operating systems Education: Bachelor of Science in Information Systems, Bachelor of Science in Information Technology, Bachelor of Science in Computer Science, Bachelor of Science in Computer Engineering Compensation: Compensation at SimVentions is determined by a number of factors, including, but not limited to, the candidate's experience, education, training, security clearance, work location, skills, knowledge, and competencies, as well as alignment with our corporate compensation plan and contract specific requirements. The projected annual compensation range for this position is $90,000 - $140,000 (USD). 
This estimate reflects the standard salary range for this position and is just one component of the total compensation package that SimVentions offers. Benefits: At SimVentions, we're committed to supporting the total well-being of our employees and their families. Our benefit offerings include comprehensive health and welfare plans to serve a variety of needs. We offer: Medical, dental, vision, and prescription drug coverage Employee Stock Ownership Plan (ESOP) Competitive 401(k) programs Retirement and Financial Counselors Health Savings and Health Reimbursement Accounts Flexible Spending Accounts Life insurance, short- & long-term disability Continuing Education Assistance Paid Time Off, Paid Holidays, Paid Leave (e.g., Maternity, Paternity, Jury Duty, Bereavement, Military) Third Party Employee Assistance Program that offers emotional and lifestyle well-being services, to include free counseling Supplemental Benefit Program Why Work for SimVentions?: SimVentions is about more than just being a place to work with other growth-oriented, technically exceptional experts. It's also a fun place to work. Our family-friendly atmosphere encourages our employee-owners to imagine, create, explore, discover, and do great things together. Support Our Warfighters SimVentions is a proud supporter of the U.S. military, and we take pride in our ability to provide relevant, game-changing solutions to our armed men and women around the world. Drive Customer Success We deliver innovative products and solutions that go beyond the expected. This means you can expect to work with a team that will allow you to grow, have a voice, and make an impact. Get Involved in Giving Back We believe a well-rounded company starts with well-rounded employees, which is why we offer diverse service opportunities for our team throughout the year.
Build Innovative Technology SimVentions takes pride in its innovative and cutting-edge technology, so you can be sure that whatever project you work on, you will be having a direct impact on our customer's success. Work with Brilliant People We don't just hire the smartest people; we seek experienced, creative individuals who are passionate about their work and thrive in our unique culture. Create Meaningful Solutions We are trusted partners with our customers and are provided challenging and meaningful requirements to help them solve. Employees who join SimVentions will enjoy additional perks like: Employee Ownership: Work with the best and help build YOUR company! Family focus: Work for a team that recognizes the importance of family time. Culture: Add to our culture of technical excellence and collaboration. Dress code: Business casual, we like to be comfortable while we work. Resources: Excellent facilities, tools, and training opportunities to grow in your field. Open communication: Work in an environment where your voice matters. Corporate Fellowship: Opportunities to participate in company sports teams and employee-led interest groups for personal and professional development. Employee Appreciation: Multiple corporate events throughout the year, including Holiday Events, Company Picnic, Imagineering Day, and more. Founding Partner of the FredNats Baseball team: Equitable distribution of tickets for every home game to be enjoyed by our employee-owners and their families from our private suite. Food: We have a lot of food around here! FTAC
    $90k-140k yearly 5d ago
  • Data Engineer - AI & Data Modernization

    Guidehouse (3.7 company rating)

    Data engineer job in Arlington, VA

    Job Family: Data Science Consulting Travel Required: Up to 25% Clearance Required: Ability to Obtain Public Trust What You Will Do: We are seeking an experienced Data Engineer to join our growing AI and Data practice, with a dedicated focus on the Federal Civilian Agencies (FCA) market within the Communities, Energy & Infrastructure (CEI) segment. This individual will be a hands-on technical contributor, responsible for designing and implementing scalable data pipelines and interactive dashboards that enable federal clients to achieve mission outcomes, operational efficiency, and digital transformation. This is a strategic delivery role for someone who thrives at the intersection of data engineering, cloud platforms, and public sector analytics. Client Leadership & Delivery Collaborate with FCA clients to understand data architecture and reporting needs. Lead the development of ETL pipelines and dashboard integrations using Databricks and Tableau. Ensure delivery excellence and measurable outcomes across data migration and visualization efforts. Solution Development & Innovation Design and implement scalable ETL/ELT pipelines using Spark, SQL, and Python. Develop and optimize Tableau dashboards aligned with federal reporting standards. Apply AI/ML tools to automate metadata extraction, clustering, and dashboard scripting. Practice & Team Leadership Work closely with data architects, analysts, and cloud engineers to deliver integrated solutions. Support documentation, testing, and deployment of data products. Mentor junior developers and contribute to reusable frameworks and accelerators. 
What You Will Need: US Citizenship is required Bachelor's degree is required Minimum TWO (2) years of experience in data engineering and dashboard development Proven experience with Databricks, Tableau, and cloud platforms (AWS, Azure) Strong proficiency in SQL, Python, and Spark Experience building ETL pipelines and integrating data sources into reporting platforms Familiarity with data governance, metadata, and compliance frameworks Excellent communication, facilitation, and stakeholder engagement skills What Would Be Nice To Have: AI/LLM Certifications Experience working with FCA clients such as DOT, GSA, USDA, or similar Familiarity with federal contracting and procurement processes What We Offer: Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace. Benefits include: Medical, Rx, Dental & Vision Insurance Personal and Family Sick Time & Company Paid Holidays Position may be eligible for a discretionary variable incentive bonus Parental Leave and Adoption Assistance 401(k) Retirement Plan Basic Life & Supplemental Life Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts Short-Term & Long-Term Disability Student Loan PayDown Tuition Reimbursement, Personal Development & Learning Opportunities Skills Development & Certifications Employee Referral Program Corporate Sponsored Events & Community Outreach Emergency Back-Up Childcare Program Mobility Stipend About Guidehouse Guidehouse is an Equal Opportunity Employer-Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco. 
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
    $73k-97k yearly est. 1d ago
  • Data Scientist - ML, Python

    Avance Consulting (4.4 company rating)

    Data engineer job in McLean, VA

    10+ years of experience required in Information Technology.
    • Python Programming: At least 5 years of hands-on experience with Python, particularly in frameworks like FastAPI, Django, and Flask, and experience using AI frameworks.
    • Access Control Expertise: Strong understanding of access control models such as Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC).
    • API and Connector Development: Experience in developing API connectors using Python for extracting and managing access control data from platforms like Azure, SharePoint, Java, .NET, WordPress, etc.
    • AI and Machine Learning: Hands-on experience integrating AI into applications for automating tasks such as access control reviews and identifying anomalies.
    • Cloud and Microsoft Technologies: Proficiency with Azure services and the Microsoft Graph API, and experience integrating Python applications with Azure for access control reviews and reporting.
    • Reporting and Visualization: Experience using reporting libraries in Python (Pandas, Matplotlib, Plotly, Dash) to build dashboards and reports related to security and access control metrics.
    • Communication Skills: Ability to collaborate with various stakeholders, explain complex technical solutions, and deliver high-quality solutions on time.
    • PlainID: Experience or familiarity with PlainID platforms for identity and access management.
    • Azure OpenAI: Familiarity with Azure OpenAI technologies and their application in access control and security workflows.
    • Power BI: Experience with Microsoft Power BI for data visualization and reporting.
    • Agile Methodologies: Experience working in Agile environments and familiarity with Scrum methodologies for delivering security solutions.
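The RBAC model named above can be sketched in a few lines of Python: users map to roles, roles map to permissions, and an access check walks that mapping. The role and permission names here are invented for illustration:

```python
# Sketch: Role-Based Access Control — users hold roles, roles grant permissions.

ROLE_PERMISSIONS = {
    "admin":   {"read", "write", "review_access"},
    "auditor": {"read", "review_access"},
    "viewer":  {"read"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"viewer", "auditor"},
}

def has_permission(user, permission):
    """True if any of the user's roles grants the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

ABAC extends the same check with attributes of the user, resource, and environment (e.g., time of day or data classification) instead of role membership alone.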
    $76k-111k yearly est. 5d ago
  • Data Engineer

    Pyramid Consulting, Inc. (4.1 company rating)

    Data engineer job in McLean, VA

    Immediate need for a talented Data Engineer. This is a 12-month contract opportunity with long-term potential, located in McLean, VA (Hybrid). Please review the job description below and contact me ASAP if you are interested. Job ID: 25-93504 Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

    Key Responsibilities: Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services. Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets. Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning. Develop backend and automation tools using Golang and/or Python as needed. Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch. Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges. Perform root-cause analysis and implement automation to prevent recurring issues. Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access. Ensure compliance with enterprise governance, data quality, and cloud security standards. Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.

    Key Requirements and Technology Experience: Skills: Python, Spark/PySpark, Golang, Java, AWS (Glue, EC2, Lambda); ability to write and troubleshoot complex SQL queries against Snowflake tables. Proficiency in Python with experience building scalable data pipelines or ETL processes. Strong hands-on experience with Spark/PySpark for distributed data processing.
Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning. Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM). Experience with Golang for scripting, backend services, or performance-critical processes. Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems. Familiarity with CI/CD workflows, Git, and automated testing. Our client is a leader in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration. Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
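The "complex SQL against Snowflake, including window functions" requirement can be illustrated with a short example. The query below runs against an in-memory SQLite database purely for portability; the window-function syntax shown (`ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)`) is standard SQL and works the same way in Snowflake. Table and column names are invented:

```python
import sqlite3

# Sketch: rank each customer's orders by amount using a window function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

rows = conn.execute("""
    SELECT customer, amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
# Each customer's largest order gets rnk = 1.
```

Filtering on `rnk = 1` (via a subquery or `QUALIFY` in Snowflake) is the usual way to pull "latest/largest per group" without a self-join.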
    $70-75 hourly 1d ago
  • Data Engineer

    The Ash Group

    Data engineer job in Falls Church, VA

    *** W2 Contract Only - No C2C - No 3rd Parties *** The Ash Group is hiring a new Programmer Analyst Principal (Data Engineer) for our client (a global leader providing advanced systems and support in defense, aerospace, and security) based in Falls Church, VA. In this role, you'll be designing, implementing, and optimizing large-scale data systems and ETL pipelines, with a strong focus on using Amazon Redshift and AWS services to ensure data quality and integrity for complex defense programs. Compensation, Benefits, and Role Info: Competitive pay rate of $65 per hour. Medical, dental, vision, direct primary care benefits, and, after six months of employment, a 4% matched 401(k) plan with immediate 100% vesting. Type: 12-month contract with potential extension or conversion. Location: On-site in Falls Church, VA. What You'll Be Doing: Design and implement large-scale ETL data pipelines using AWS Glue and Python/PySpark to ingest, transform, and load data from various sources. Build and maintain robust data warehouses, focusing on Amazon Redshift, including data modeling and governance. Write and optimize complex, highly performant SQL queries across large datasets (Redshift, Oracle, SQL Server). Collaborate with cross-functional teams (data scientists, analysts) to understand requirements and deliver end-to-end data solutions. Troubleshoot, optimize performance, and resolve data-related issues like pipeline failures and data quality bottlenecks. What We're Looking For: 8+ years of hands-on experience in data engineering, focusing on designing and implementing large-scale data systems. 5+ years of experience in building production-level ETL pipelines using AWS Glue and Python/PySpark. Deep proficiency in SQL, including query optimization, indexing, and performance tuning across data warehouses like Amazon Redshift. Strong understanding of database design principles, data modeling (star/snowflake schemas), and data governance.
Experience with data processing/orchestration frameworks such as Apache Airflow, Apache Kafka, or Fivetran. If you're a seasoned data engineering professional passionate about building scalable data solutions and driving innovation in cloud-based environments, we want to hear from you. This is an exciting opportunity to work on cutting-edge technologies, collaborate with cross-functional teams, and make a meaningful impact on data-driven decision-making. Apply now to be part of a forward-thinking organization where your expertise will shape the future of our data infrastructure. #DataEngineer #DataEngineering #AWSEngineer #Redshift #ETL #PySpark #DataPipeline #Contract
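The star-schema modeling the listing asks for boils down to splitting flat source records into dimension tables (descriptive attributes, keyed by surrogate IDs) and fact tables (measurements referencing those IDs). A minimal pure-Python sketch, with invented field names; in practice this transform would run inside an AWS Glue PySpark job:

```python
# Sketch: normalize flat source records into a star schema — one dimension
# table (products, with surrogate keys) and one fact table (sales).

def to_star_schema(records):
    dim_product, fact_sales = {}, []
    for r in records:
        key = r["product_name"]
        if key not in dim_product:
            dim_product[key] = len(dim_product) + 1  # assign surrogate key
        fact_sales.append({"product_id": dim_product[key], "qty": r["qty"]})
    return dim_product, fact_sales

dims, facts = to_star_schema([
    {"product_name": "widget", "qty": 3},
    {"product_name": "gadget", "qty": 1},
    {"product_name": "widget", "qty": 2},
])
```

Keeping facts narrow (IDs and measures only) is what lets a warehouse like Redshift scan and aggregate them efficiently while joins to small dimensions stay cheap.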
    $65 hourly 3d ago
  • Data Engineer

    Brooksource (4.1 company rating)

    Data engineer job in Richmond, VA

    Data Engineer - Distributed Energy Resources (DER) Richmond, VA - Hybrid (1 week on - 1 week off) 12-month contract (Multiple Year Project) $45-55/hr. depending on experience We are hiring a Data Integration Engineer to join one of our Fortune 500 utilities partners in the Richmond area! In this role, you will support our client's rapidly growing Distributed Energy Resources (DER) and Virtual Power Plant (VPP) initiatives. You will be responsible for integrating data across platforms such as Salesforce, GIS, SAP, Oracle, and Snowflake to build our client's centralized asset tracking system for thermostats, EV chargers, solar assets, home batteries, and more. In this role, you will map data, work with APIs, support Agile product squads, and help design system integrations that enable our client to manage customer energy assets and demand response programs at scale. This is a highly visible position on a brand-new product team with the chance to work on cutting-edge energy and utility modernization efforts. If you are interested, please apply! MINIMUM QUALIFICATIONS: 3-5 years of experience in system integration, data engineering, or data warehousing and Bachelor's degree in Computer Science, Engineering, or related technical discipline. Hands-on experience working with REST APIs and integrating enterprise systems. Strong understanding of data structures, data types, and data mapping. Familiarity with Snowflake or similar data warehousing platform. Experience connecting data across platforms and/or integrating data from a variety of sources, i.e. SAP, Oracle, etc. Ability to work independently and solve problems in a fast-paced Agile environment. Excellent communication skills with the ability to collaborate across IT, business, engineering, and product teams. 
RESPONSIBILITIES: Integrate and map data across Salesforce, GIS, Snowflake, SAP, Oracle, and other enterprise systems Link distributed energy asset data (EV chargers, thermostats, solar, home batteries, etc.) into a unified asset tracking database Support API-first integrations: consuming, analyzing, and working with RESTful services Participate in Agile ceremonies and work through user stories in Jira Collaborate with product owners, BAs, data analysts, architects, and engineers to translate requirements into actionable technical tasks Support architecture activities such as identifying data sources, formats, mappings, and integration patterns Help design and optimize integration workflows across new and existing platforms Work within newly formed Agile product squads focused on VPP/Asset Tracking and Customer Segmentation Troubleshoot integration issues and identify long-term solutions Contribute to building net-new systems and tools as the client expands DER offerings NICE TO HAVES: Experience with Salesforce. Experience working with GIS systems or spatial data. Understanding of customer enrollment systems. Jira experience. WHAT'S IN IT FOR YOU…? Joining our client provides you the opportunity to join a brand-new Agile product squad, work on high-impact energy modernization and DER initiatives, and gain exposure to new technologies and integration tools. This is a long-term contract with a strong likelihood of extension in a stable industry and company. Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.
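The core of the integration work this posting describes is mapping asset records from several source systems into one unified registry. A rough sketch of that pattern, with every field name and system payload invented for illustration (the client's actual Salesforce/GIS schemas are not public):

```python
# Hypothetical data-mapping sketch: normalize asset records from two
# source systems into one unified schema, then merge by asset ID.
# All field names here are invented for illustration.

def map_salesforce_asset(rec):
    """Map a Salesforce-style record to the unified asset schema."""
    return {
        "asset_id": rec["Id"],
        "asset_type": rec["Asset_Type__c"].lower(),
        "customer_id": rec["AccountId"],
        "source_system": "salesforce",
    }

def map_gis_asset(rec):
    """Map a GIS-style record, carrying location data along."""
    return {
        "asset_id": rec["feature_id"],
        "asset_type": rec["category"].lower(),
        "customer_id": rec.get("customer_ref"),
        "location": (rec["lat"], rec["lon"]),
        "source_system": "gis",
    }

def build_asset_registry(salesforce_recs, gis_recs):
    """Merge records from both sources, keyed by asset_id."""
    registry = {}
    for rec in salesforce_recs:
        mapped = map_salesforce_asset(rec)
        registry[mapped["asset_id"]] = mapped
    for rec in gis_recs:
        mapped = map_gis_asset(rec)
        existing = registry.setdefault(mapped["asset_id"], {})
        # Later sources enrich existing entries without overwriting them.
        for key, value in mapped.items():
            existing.setdefault(key, value)
    return registry
```

The same shape extends to SAP, Oracle, or Snowflake feeds: one mapper per source, one merge step that enriches rather than clobbers.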
    $45-55 hourly 4d ago
  • Principal Data Scientist with Gen AI

    Lorvenk Technologies

    Data engineer job in McLean, VA

Title: Principal Data Scientist with Gen AI Contract: W2 Exp: 10+ Duration: Long Term Interview Mode: In-Person interview Call Notes: Looking for a Principal Data Scientist with a strong focus on Generative AI (GenAI) and expertise in Machine Learning who has transitioned into GenAI. Need someone with solid experience in RAG, Python/Jupyter, other software tooling, using agents in workflows, and a strong understanding of data. Someone with advanced proficiency in Prompt Engineering, Large Language Models (LLMs), RAG, Graph RAG, MCP, A2A, multi-modal AI, Gen AI Patterns, Evaluation Frameworks, Guardrails, data curation, and AWS cloud deployments. Candidates who have built AI agents, MCP, A2A, and Graph RAG solutions and deployed Gen AI applications to production are highly preferred. Top Skills: Machine Learning & Deep Learning - Required GenAI - Required Python - Required RAG and/or Graph RAG - Required MCP (Model Context Protocol) and A2A (Agent-to-Agent) are highly preferred Job Description: We are seeking a highly experienced Principal Gen AI Scientist with a strong focus on Generative AI (GenAI) to lead the design and development of cutting-edge AI agents, agentic workflows, and Gen AI applications that solve complex business problems. This role requires advanced proficiency in Prompt Engineering, Large Language Models (LLMs), RAG, Graph RAG, MCP, A2A, multi-modal AI, Gen AI Patterns, Evaluation Frameworks, Guardrails, data curation, and AWS cloud deployments. You will serve as a hands-on Gen AI (data) scientist and critical thought leader, working alongside full stack developers, UX designers, product managers, and data engineers to shape and implement enterprise-grade Gen AI solutions. Key Responsibilities: * Architect and implement scalable AI agents, agentic workflows, and GenAI applications to address diverse and complex business use cases. 
* Develop, fine-tune, and optimize lightweight LLMs; lead the evaluation and adaptation of models such as Claude (Anthropic), Azure OpenAI, and open-source alternatives. * Design and deploy Retrieval-Augmented Generation (RAG) and Graph RAG systems using vector databases and knowledge bases. * Curate enterprise data using connectors integrated with AWS Bedrock's Knowledge Base/Elastic. * Implement solutions leveraging MCP (Model Context Protocol) and A2A (Agent-to-Agent) communication. * Build and maintain Jupyter-based notebooks using platforms like SageMaker and MLFlow/Kubeflow on Kubernetes (EKS). * Collaborate with cross-functional teams of UI and microservice engineers, designers, and data engineers to build full-stack Gen AI experiences. * Integrate GenAI solutions with enterprise platforms via API-based methods and standardized GenAI patterns. * Establish and enforce validation procedures with evaluation frameworks, bias mitigation, safety protocols, and guardrails for production-ready deployment. * Design and build robust ingestion pipelines that extract, chunk, enrich, and anonymize data from PDF, video, and audio sources for use in LLM-powered workflows, leveraging best practices such as semantic chunking and privacy controls. * Orchestrate multimodal pipelines using scalable frameworks (e.g., Apache Spark, PySpark) for automated ETL/ELT workflows appropriate for unstructured media. * Implement embedding pipelines: map media content to vector representations using embedding models, and integrate with vector stores (AWS Knowledge Base/Elastic/Mongo Atlas) to support RAG architectures. Required Qualifications: * 10+ years of experience in AI/ML, with 3+ years in applied GenAI or LLM-based solutions. * Deep expertise in prompt engineering, fine-tuning, RAG, Graph RAG, vector databases (e.g., AWS Knowledge Base / Elastic), and multi-modal models. * Proven experience with cloud-native AI development (AWS SageMaker, Bedrock, MLFlow on EKS). 
* Strong programming skills in Python and ML libraries (Transformers, LangChain, etc.). * Deep understanding of Gen AI system patterns, architectural best practices, and evaluation frameworks. * Demonstrated ability to work in cross-functional agile teams. * A GitHub code repository link is required for each candidate; please thoroughly vet candidates. Preferred Qualifications: * Published contributions or patents in AI/ML/LLM domains. * Hands-on experience with enterprise AI governance and ethical deployment frameworks. * Familiarity with CI/CD practices for MLOps and scalable inference APIs.
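The RAG systems this role centers on all share one retrieval step: score stored chunks against a query in embedding space and keep the best matches. A toy sketch of just that step, with hand-made vectors standing in for a real embedding model and vector store:

```python
# Minimal RAG retrieval sketch: rank document chunks by cosine
# similarity to a query vector. A production system would call an
# embedding model and a vector store; the vectors here are invented
# so the example stays self-contained.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, chunks, k=2):
    """Return the text of the k chunks most similar to the query."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]),
                    reverse=True)
    return [c["text"] for c in ranked[:k]]
```

The retrieved texts would then be packed into the LLM prompt; Graph RAG replaces the flat similarity search with traversal over a knowledge graph, but the ranking idea is the same.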
    $73k-102k yearly est. 4d ago
  • Data Engineer (Zero Trust)

    Kavaliro 4.2company rating

    Data engineer job in Fort Belvoir, VA

Kavaliro is seeking a Zero Trust Security Architect / Data Engineer to support a mission-critical program by integrating secure architecture principles, strengthening data security, and advancing Zero Trust initiatives across the enterprise. Key Responsibilities Develop and implement program protection planning, including IT supply chain security, anti-tampering methods, and risk management aligned to DoD Zero Trust Architecture. Apply secure system design tools, automated analysis methods, and architectural frameworks to build resilient, least-privilege, continuously monitored environments. Integrate Zero Trust Data Pillar capabilities: data labeling, tagging, classification, encryption at rest/in transit, access policy definition, monitoring, and auditing. Analyze and interpret data from multiple structured and unstructured sources to support decision-making and identify anomalies or vulnerabilities. Assess cybersecurity principles, threats, and vulnerabilities impacting enterprise data systems, including risks such as corruption, exfiltration, and denial-of-service. Support systems engineering activities, ensuring secure integration of technologies and alignment with Zero Trust operational objectives. Design and maintain secure network architectures that balance security controls, mission requirements, and operational tradeoffs. Generate queries, algorithms, and reports to evaluate data structures, identify patterns, and improve system integrity and performance. Ensure compliance with organizational cybersecurity requirements, particularly confidentiality, integrity, availability, authentication, and non-repudiation. Evaluate impacts of cybersecurity lapses and implement safeguards to protect mission-critical data systems. Structure, format, and present data effectively across tools, dashboards, and reporting platforms. Maintain knowledge of enterprise information security architecture and database systems to support secure data flow and system design. 
Requirements Active TS/SCI security clearance (required). Deep knowledge of Zero Trust principles (never trust, always verify; explicit authentication; least privilege; continuous monitoring). Experience with program protection planning, IT supply chain risk management, and anti-tampering techniques. Strong understanding of cybersecurity principles, CIA triad requirements, and data-focused threats (corruption, exfiltration, denial-of-service). Proficiency in secure system design, automated systems analysis tools, and systems engineering processes. Ability to work with structured and unstructured data, including developing queries, algorithms, and analytical reports. Knowledge of database systems, enterprise information security architecture, and data structuring/presentation techniques. Understanding of network design processes, security tradeoffs, and enterprise architecture integration. Strong ability to interpret data from multiple tools to support security decision-making. Familiarity with impacts of cybersecurity lapses on data systems and operational environments. Kavaliro is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.
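The Zero Trust Data Pillar capabilities named above (labeling, access policy, verification on every read) reduce to a simple pattern. A deliberately small, illustrative sketch, not taken from any DoD system, where records carry a classification label and each access is re-checked against it:

```python
# Illustrative Zero Trust data-labeling sketch (invented, not from the
# posting): tag records with a classification, then verify clearance
# on every access rather than trusting the caller.

CLEARANCE_ORDER = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

def label_record(record, classification):
    """Attach a classification label to a data record."""
    if classification not in CLEARANCE_ORDER:
        raise ValueError(f"unknown classification: {classification}")
    return {**record, "classification": classification}

def can_access(user_clearance, record):
    """Never trust, always verify: check the label on every read."""
    return (CLEARANCE_ORDER.index(user_clearance)
            >= CLEARANCE_ORDER.index(record["classification"]))
```

Real deployments would layer encryption, auditing, and continuous monitoring on top; the sketch only shows the labeling-plus-verification core.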
    $85k-119k yearly est. 4d ago
  • Data scientist

    Smart It Frame LLC

    Data engineer job in Reston, VA

Job title: Data Scientist Full-time About Smart IT Frame: At Smart IT Frame, we connect top talent with leading organizations across the USA. With over a decade of staffing excellence, we specialize in IT, healthcare, and professional roles, empowering both clients and candidates to grow together. Note: • In-person interview Must Have: • Data science • Python • SQL • MLOps • Risk Modelling 📩 Apply today or share profiles at ****************************
    $73k-102k yearly est. 5d ago
  • AWS Data Engineer

    Mindlance 4.6company rating

    Data engineer job in McLean, VA

    Responsibilities: Design, build, and maintain scalable data pipelines using AWS Glue and Databricks. Develop and optimize ETL/ELT processes using PySpark and Python. Collaborate with data scientists, analysts, and stakeholders to enable efficient data access and transformation. Implement and maintain data lake and warehouse solutions on AWS (S3, Glue Catalog, Redshift, Athena, etc.). Ensure data quality, consistency, and reliability across systems. Optimize performance of large-scale distributed data processing workflows. Develop automation scripts and frameworks for data ingestion, transformation, and validation. Follow best practices for data governance, security, and compliance. Required Skills & Experience: 5-8 years of hands-on experience in Data Engineering. Strong proficiency in Python and PySpark for data processing and transformation. Expertise in AWS services - particularly Glue, S3, Lambda, Redshift, and Athena. Hands-on experience with Databricks for building and managing data pipelines. Experience working with large-scale data systems and optimizing performance. Solid understanding of data modeling, data lake architecture, and ETL design principles. Strong problem-solving skills and ability to work independently in a fast-paced environment. “Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
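The transform-and-validate step at the heart of the ETL/ELT work described above can be shown without the AWS stack. A framework-free sketch of that pattern (the actual role would express this in Glue/PySpark; field names are invented):

```python
# Minimal ETL transform sketch: enforce a data-quality gate and
# normalize types before loading. The real pipeline would run this
# logic as a PySpark job on Glue or Databricks; this plain-Python
# version only illustrates the pattern.

def transform(rows):
    """Validate and normalize raw records before loading."""
    clean = []
    for row in rows:
        # Data-quality gate: skip rows missing required fields.
        if not row.get("id") or row.get("amount") is None:
            continue
        clean.append({
            "id": str(row["id"]).strip(),
            "amount": round(float(row["amount"]), 2),
            "region": (row.get("region") or "unknown").lower(),
        })
    return clean
```

In PySpark the same gate-and-normalize step becomes a `filter` followed by `withColumn` calls, which is what lets it scale across a cluster.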
    $85k-113k yearly est. 5d ago
  • Data Engineer

    Sibitalent Corp

    Data engineer job in Richmond, VA

Role - Data Engineer Work mode - Hybrid - This is a hybrid role (1 week onsite, 1 week remote, alternating) Duration - 12 months Must have skills - Talend, Oracle Exadata, Snowflake, Cloud Data Warehouse, Real-time Streaming, Kafka, Kinesis, ETL, Informatica, SQL, Stored Procedures, Table Design, SQL Query Optimization, ETL Performance Tuning, Data Loading. Required Proven experience with: • Experience with ETL tools - Talend (this is what the team uses) or Informatica • Experience with SQL, Stored Procedures, and Table Design • Experience in SQL query optimization and ETL data loading performance • Experience in Snowflake Cloud Data Warehouse strongly preferred • Experience in shell scripting is preferred • Experience in real-time streaming technologies is preferred Looking for someone with experience in ETL tools. Dominion uses Talend, but experience with other tools like Informatica is fine. Strong SQL skills are needed for data loading, transformations, and query optimization. Experience with Oracle (Exadata) is required. While Dominion uses Snowflake, AWS or GCP experience is also okay. They are open to candidates with 5+ years of experience, as long as they're eager to learn. Thanks and Regards, Nikhil Technical Recruiter Email : ********************* Web: ****************** 101, E, Park Blvd.-Suite 600, Plano, TX 75074, USA Note: SibiTalent Corp. is an equal opportunity staffing firm. We do not discriminate on the basis of race, caste, color, religion, gender, culture, visa status, or any other protected characteristic. All hiring decisions are made strictly based on qualifications, experience, and specific client requirements.
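Two of the skills this posting keeps returning to, batched data loading and query optimization, can be demonstrated end to end in a few lines. A sketch using the standard-library sqlite3 module purely as a stand-in for Oracle/Snowflake (table and column names invented):

```python
# Sketch of batched loading plus indexing for query performance.
# sqlite3 stands in for Oracle Exadata/Snowflake; the principles
# (bulk inserts, index on the filter column) carry over.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meter_readings (meter_id TEXT, kwh REAL)")

# Batched load: executemany sends rows in bulk instead of one
# round-trip per INSERT, the core of ETL loading performance.
readings = [("M1", 10.5), ("M2", 7.2), ("M1", 3.3)]
conn.executemany("INSERT INTO meter_readings VALUES (?, ?)", readings)

# Query optimization: an index on the filter column lets the
# optimizer seek instead of scanning the whole table.
conn.execute("CREATE INDEX idx_meter ON meter_readings (meter_id)")

total = conn.execute(
    "SELECT SUM(kwh) FROM meter_readings WHERE meter_id = ?", ("M1",)
).fetchone()[0]
```

On Exadata or Snowflake the equivalents are bulk/parallel loaders and partitioning or clustering keys, but the reasoning (minimize round trips, give the optimizer an access path) is the same.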
    $78k-106k yearly est. 4d ago
  • Business Data Management Architect

    Datastaff, Inc.

    Data engineer job in Richmond, VA

DataStaff, Inc is in need of a Business Data Management Architect for a long-term contract opportunity with one of our direct clients located in Richmond, VA. *This role is hybrid Responsibilities: Participate in an enterprise data management program. Coordinate with business architecture, data architecture, enterprise architecture staff, data stewards, and data custodians. Leverage expertise in data modeling and extensive data quality management to design and implement effective data management processes. Define and utilize taxonomies for enhanced data organization, classification, and retrieval, contributing to improved metadata management. Become familiar with the Business Capability Model and participate in developing and maturing an enterprise data model, enterprise data flows, and road maps. This position will require familiarity (or the development of familiarity) with the National Information Exchange Model, the Spatial Data Standards for Facilities, Infrastructure, and Environment, and other standards. Knowledge and Experience: Minimum requirement: Must have prior Virginia Department of Transportation (VDOT) experience directly related to this role. 
Demonstrated experience with enterprise data programs at a similarly sized organization (private or public) Proven experience in data modeling Demonstrated ability to bridge the gap between business architecture and National Information Exchange Model (NIEM) standards Strong understanding of data governance principles and best practices Proficiency in metadata management, including taxonomies, and enhancing data quality Experience in overseeing the complete data lifecycle within a complex organizational structure Strong written and verbal communication Required Skills: 8 Years - Extensive data modeling experience 8 Years - Advanced business data architecture experience 8 Years - Proficiency in metadata management, including taxonomies, and enhancing data quality 5 Years - Ability to bridge the gap between business architecture and National Information Exchange Model (NIEM) standards 5 Years - Ability to model data lifecycle within a complex organizational structure This opportunity is available on a corp-to-corp basis or as a W2 position with a competitive benefits package. DataStaff, Inc. offers medical, dental, and vision coverage options as well as paid vacation, sick, and holiday leave. As many of our opportunities are long-term, we also have a 401k program available for employees after 6 months.
    $89k-123k yearly est. 4d ago
  • Cloud Data Engineer- Databricks

    Infocepts 3.7company rating

    Data engineer job in McLean, VA

    Purpose: We are seeking a highly skilled Cloud Data Engineer with deep expertise in Databricks and modern cloud platforms such as AWS, Azure, or GCP. This role is ideal for professionals who are passionate about building next-generation data platforms, optimizing complex data workflows, and enabling advanced analytics and AI in cloud-native environments. You'll have the opportunity to work with Fortune-500 organizations in data and analytics, helping them unlock the full potential of their data through innovative, scalable solutions. Key Result Areas and Activities: Design and implement robust, scalable data engineering solutions. Build and optimize data pipelines using Databricks, including serverless capabilities, Unity Catalog, and Mosaic AI. Collaborate with analytics and AI teams to enable real-time and batch data workflows. Support and improve cloud-native data platforms (AWS, Azure, GCP). Ensure adherence to best practices in data modeling, warehousing, and governance. Contribute to automation of data workflows using CI/CD, DevOps, or DataOps practices. Implement and maintain workflow orchestration tools like Apache Airflow and dbt. Roles & Responsibilities Essential Skills 4+ years of experience in data engineering with a focus on scalable solutions. Strong hands-on experience with Databricks in a cloud environment. Proficiency in Spark and Python for data processing. Solid understanding of data modeling, data warehousing, and architecture principles. Experience working with at least one major cloud provider (AWS, Azure, or GCP). Familiarity with CI/CD pipelines and data workflow automation. Desirable Skills Direct experience with Unity Catalog and Mosaic AI within Databricks. Working knowledge of DevOps/DataOps principles in a data engineering context. Exposure to Apache Airflow, dbt, and modern data orchestration frameworks. Qualifications Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 
Relevant certifications in cloud platforms (AWS/Azure/GCP) or Databricks are a plus. Qualities: Able to consult, write, and present persuasively Able to work in a self-organized and cross-functional team Able to iterate based on new information, peer reviews, and feedback Able to work seamlessly with clients across multiple geographies Research focused mindset Excellent analytical, presentation, reporting, documentation and interactive skills "Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
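The orchestration tools this posting names (Apache Airflow, dbt) both reduce to running tasks in dependency order. A toy scheduler showing that core idea with a topological sort (task names invented; a real DAG would be declared in Airflow's own API):

```python
# Toy workflow-orchestration sketch: given task dependencies, compute
# a valid run order via depth-first topological sort, which is the
# idea underneath Airflow DAGs and dbt model graphs.

def run_order(deps):
    """deps maps task -> set of upstream tasks; returns a valid order."""
    order, done = [], set()

    def visit(task, seen):
        if task in done:
            return
        if task in seen:
            raise ValueError(f"cycle involving {task}")
        seen.add(task)
        for upstream in deps.get(task, set()):
            visit(upstream, seen)
        done.add(task)
        order.append(task)

    for task in deps:
        visit(task, set())
    return order
```

Airflow adds scheduling, retries, and operators on top, and dbt derives the graph from `ref()` calls, but both execute exactly this ordering.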
    $77k-105k yearly est. 3d ago
  • Forward Deployed Software Engineer I

    Vantor

    Data engineer job in Herndon, VA

Vantor is forging the new frontier of spatial intelligence, helping decision makers and operators navigate what's happening now and shape what's coming next. Vantor is a place for problem solvers, changemakers, and go-getters-where people are working together to help our customers see the world differently, and in doing so, be seen differently. Come be part of a mission, not just a job, where you can: Shape your own future, build the next big thing, and change the world. To be eligible for this position, you must be a U.S. Person, defined as a U.S. citizen, permanent resident, Asylee, or Refugee. Note on Cleared Roles: If this position requires an active U.S. Government security clearance, applicants who do not currently hold the required clearance will not be eligible for consideration. Employment for cleared roles is contingent upon verification of clearance status. Export Control/ITAR: Certain roles may be subject to U.S. export control laws, requiring U.S. person status as defined by 8 U.S.C. 1324b(a)(3). Please review the job details below. Vantor is seeking a mission-driven Forward Deployed Software Engineer to support new US Army Programs with novel 3D software solutions. This individual will act as a technical bridge between our engineering teams and end users, integrating our software alongside partner capabilities, engaging with customer environments, learning the systems inside and out, and shaping real-world solutions in direct collaboration with the customer. You'll be hands-on with Vantor's advanced 3D capabilities, integrating and optimizing our technology in new ways to meet the fast-paced, dynamic needs of warfighters prototyping new Army functions. This role demands technical adaptability, strong communication skills, and a proactive mindset to solve challenges in the field and provide immediate feedback to internal teams. 
Responsibilities: Deploy to CONUS customer locations to support US Army operational needs, training, and mission exercises. Rapidly learn and troubleshoot the Vantor tech stack, with emphasis on geospatial platforms, data integration, and rapidly enabling end-user workflows. Serve as a liaison between Vantor's product teams and Army end users, providing real-time feedback and shaping development roadmaps. Collaborate with cross-functional teams (Product, Engineering, PMO) to deliver customer-specific configurations and technical solutions. Support fielding, onboarding, and adoption of new capabilities. Translate complex customer needs into actionable technical requirements. Provide technical demos, documentation, and hands-on training. Maintain a high standard of cybersecurity, data integrity, and operational discipline in line with DoD requirements. Minimum Qualifications: Bachelor's degree in Computer Science, Engineering, or related technical field (or equivalent experience). Secret Clearance (Ability to obtain TS/SCI). 2+ Years Relevant Experience. Proficient in one or more languages: Python, JavaScript, Go, C++, or similar. Comfort with Linux-based systems, cloud architectures, and containerized deployments (Docker, Kubernetes, etc.). Experience supporting or interacting with DoD programs, ideally in a forward or fielded capacity. Strong communication skills with ability to build trust across technical and non-technical stakeholders. Ability to travel up to 25-50% and support occasional after-hours mission requirements. Preferred Qualifications: Prior experience supporting Army or joint tactical missions. Familiarity with GIS, 3D terrain, or mission command platforms. Understanding of cybersecurity standards (e.g., RMF, CMMC). Current or prior military service or operational support background. Pay Transparency: In support of pay transparency at Vantor, we disclose salary ranges on all U.S. job postings. 
The successful candidate's starting pay will fall within the salary range provided below and is determined based on job-related factors, including, but not limited to, the experience, qualifications, knowledge, skills, geographic work location, and market conditions. Candidates with the minimum necessary experience, qualifications, knowledge, and skillsets for the position should not expect to receive the upper end of the pay range. ● The base pay for this position within the Washington, DC metropolitan area is: $90,000.00 - $150,000.00 annually. For all other states, we use geographic cost of labor as an input to develop market-driven ranges for our roles, and as such, each location where we hire may have a different range. Benefits: Vantor offers a competitive total rewards package that goes beyond the standard, including a robust 401(k) with company match, mental health resources, and unique perks like student loan repayment assistance, adoption reimbursement and pet insurance to support all aspects of your life. You can find more information on our benefits at: ****************************** The application window is three days from the date the job is posted and will remain posted until a qualified candidate has been identified for hire. If the job is reposted regardless of reason, it will remain posted three days from the date the job is reposted and will remain reposted until a qualified candidate has been identified for hire. The date of posting can be found on Vantor's Career page at the top of each job posting. To apply, submit your application via Vantor's Career page. EEO Policy: Vantor is an equal opportunity employer committed to an inclusive workplace. We believe in fostering an environment where all team members feel respected, valued, and encouraged to share their ideas. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender identity, sexual orientation, disability, protected veteran status, age, or any other characteristic protected by law.
    $90k-150k yearly 2d ago
  • Senior Data Engineer

    Zillion Technologies, Inc. 3.9company rating

    Data engineer job in McLean, VA

The candidate must have 5+ years of hands-on experience working with PySpark/Python, microservices architecture, AWS EKS, SQL, Postgres, DB2, Snowflake, Behave or Cucumber frameworks, Pytest (unit testing), automation testing, and regression testing. Experience with tools such as Jenkins, SonarQube, and/or Fortify is preferred for this role. Experience in Angular and DevOps is a nice-to-have for this role. Must Have Qualifications: PySpark/Python based microservices, AWS EKS, Postgres SQL Database, Behave/Cucumber for automation, Pytest, Snowflake, Jenkins, SonarQube and Fortify. Responsibilities: Development of microservices based on Python, PySpark, AWS EKS, and AWS Postgres for a data-oriented modernization project. New System: Python and PySpark, AWS Postgres DB, Behave/Cucumber for automation, and Pytest Perform system, functional, and data analysis on the current system and create technical/functional requirement documents. Current System: Informatica, SAS, AutoSys, DB2 Write automated tests using Behave/Cucumber, based on the new microservices-based architecture Promote top code quality and solve issues related to performance tuning and scalability. Strong skills in DevOps and Docker/container-based deployments to AWS EKS using Jenkins, plus experience with SonarQube and Fortify. Able to communicate and engage with business teams, analyze current business requirements (BRS documents), and create necessary data mappings. Preferred: strong skills and experience in reporting application development and data analysis. Knowledge of Agile methodologies and technical documentation.
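The Pytest unit-testing style this role requires looks like the following in its simplest form: a small function under test plus a `test_*` function with bare asserts that Pytest discovers automatically (the function and data here are invented examples, not from the client's system):

```python
# Hedged sketch of Pytest-style unit testing around a small data
# transformation. Behave/Cucumber would express the same checks as
# Gherkin scenarios; Pytest uses plain functions and asserts.

def mask_account(account_number: str) -> str:
    """Keep only the last four digits, masking the rest."""
    digits = account_number.replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

def test_mask_account():
    # Pytest collects test_* functions and treats bare asserts
    # as test expectations, with rich failure diffs.
    assert mask_account("1234-5678-9012") == "********9012"
    assert mask_account("0042").endswith("0042")
```

Running `pytest` in the project directory discovers and executes such tests; in the CI setup described, Jenkins would run them on every commit before SonarQube and Fortify scans.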
    $77k-109k yearly est. 2d ago
  • Lead Principal Data Solutions Architect

    Inadev

    Data engineer job in Reston, VA

*****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZENS***** ***** TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA***** Formed in 2011, Inadev is focused on its founding principle: to build innovative customer-centric solutions incredibly fast, secure, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations comprise codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation. POSITION DESCRIPTION: Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be on Natural Language Processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems. PROGRAM DESCRIPTION: This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards. RESPONSIBILITIES: Leading system architecture decisions, ensuring technical alignment across teams, and advocating for best practices in cloud and data engineering. Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation. Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards. 
Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses. Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery. Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies. Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance. Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency. NON-TECHNICAL REQUIREMENTS: Must be a U.S. Citizen. Must be willing to work a HYBRID schedule (2-3 days) in Reston, VA and at client locations in the Northern Virginia/DC/MD area as required. Ability to pass a 7-year background check and obtain/maintain a U.S. Government clearance. Strong communication and presentation skills. Must be able to prioritize and self-start. Must be adaptable/flexible as priorities shift. Must be enthusiastic and have a passion for learning and constant improvement. Must be open to collaboration, feedback, and client asks. Must enjoy working with a vibrant team of outgoing personalities. MANDATORY REQUIREMENTS/SKILLS: Bachelor of Science degree in Computer Science, Engineering, or a related subject and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture. Experience within the Federal Government, specifically DHS, is preferred. Must possess demonstrable experience with the Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads. 
Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), Unity Catalog governance framework, and enterprise-level integration patterns using Databricks workflows and Auto Loader. Knowledge of and ability to organize technical execution of Agile Analytics using Databricks Repos, Jobs, and collaborative notebooks, proven by professional experience. Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping). Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments. Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management. Certification in one or more: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications. DESIRED REQUIREMENTS/SKILLS: Expertise in ETL tools. Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus). Proficiency in SQL, PL/SQL, and performance tuning for large datasets. Understanding of security frameworks and compliance standards in federal environments. PHYSICAL DEMANDS: Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.
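The medallion architecture named in these requirements moves data through Bronze (raw), Silver (cleaned), and Gold (aggregated) layers. A plain-Python sketch of that flow, with invented case-tracking records; on Databricks each layer would be a Delta table fed by a DLT pipeline:

```python
# Medallion (Bronze/Silver/Gold) flow reduced to plain Python for
# illustration: raw events land in bronze, are validated into silver,
# and aggregated into an analytics-ready gold summary. Field names
# are invented.

def to_silver(bronze_rows):
    """Clean the bronze layer: drop malformed rows, normalize fields."""
    return [
        {"case_id": r["case_id"].strip(), "status": r["status"].upper()}
        for r in bronze_rows
        if r.get("case_id") and r.get("status")
    ]

def to_gold(silver_rows):
    """Aggregate silver into a gold-layer summary by status."""
    counts = {}
    for r in silver_rows:
        counts[r["status"]] = counts.get(r["status"], 0) + 1
    return counts
```

Keeping the layers separate is what gives the lakehouse its auditability: bronze preserves the raw record, silver holds the governed clean copy (where Unity Catalog policies apply), and gold serves BI workloads.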
    $84k-115k yearly est. 5d ago
  • Lead Data Engineer

    Capital One 4.7company rating

    Data engineer job in Williamsburg, VA

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Lead Data Engineer, you'll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You'll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies. Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems. Utilize programming languages like Java, Scala, Python and open-source RDBMS and NoSQL databases and cloud-based data warehousing services such as Redshift and Snowflake. Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community. Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment. Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance. Basic Qualifications: Bachelor's Degree. At least 4 years of experience in application development (internship experience does not apply). At least 2 years of experience in big data technologies. At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud). Preferred Qualifications: 7+ years of experience in application development including Python, SQL, Scala, or Java. 4+
years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years of experience working on real-time data and streaming applications. 4+ years of experience with NoSQL implementations (Mongo, Cassandra). 4+ years of data warehousing experience (Redshift or Snowflake). 4+ years of experience with UNIX/Linux including basic commands and shell scripting. 2+ years of experience with Agile engineering practices. At this time, Capital One will not sponsor a new applicant for employment authorization for this position. The minimum and maximum full-time annual salaries for this role are listed below, by location. Please note that this salary information is solely for candidates hired to perform work within one of these locations, and refers to the amount Capital One is willing to pay at the time of this posting. Salaries for part-time roles will be prorated based upon the agreed-upon number of hours to be regularly worked. McLean, VA: $193,400 - $220,700 for Lead Data Engineer; Richmond, VA: $175,800 - $200,700 for Lead Data Engineer. Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter. This role is also eligible to earn performance-based incentive compensation, which may include cash bonus(es) and/or long-term incentives (LTI). Incentives could be discretionary or non-discretionary depending on the plan. Capital One offers a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being. Learn more at the Capital One Careers website. Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
This role is expected to accept applications for a minimum of 5 business days. No agencies please. Capital One is an equal opportunity employer committed to diversity and inclusion in the workplace. All qualified applicants will receive consideration for employment without regard to sex (including pregnancy, childbirth or related medical conditions), race, color, age, national origin, religion, disability, genetic information, marital status, sexual orientation, gender identity, gender reassignment, citizenship, immigration status, protected veteran status, or any other basis prohibited under applicable federal, state or local law. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at ************** or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to **********************. Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities.
Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
    $76k-98k yearly est. 1d ago
  • Senior Data Engineer

    Pyramid Consulting, Inc. 4.1company rating

    Data engineer job in McLean, VA

Immediate need for a talented Senior Data Engineer. This is a 6+ month contract opportunity with long-term potential and is located in McLean, VA (Remote). Please review the job description below and contact me ASAP if you are interested. Job ID: 25-84666. Pay Range: $64 - $68/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location). Key Responsibilities: Demonstrated ability in implementing data warehouse solutions using modern data platforms such as Client, Databricks or Redshift. Build data integration solutions between transaction systems and analytics platforms. Expand data integration solutions to ingest data from internal and external sources and to further transform it per business consumption needs. Develop tasks for a multitude of data patterns, e.g., real-time data integration, advanced analytics, machine learning, BI and reporting. Fundamental understanding of building data products through data enrichment and ML. Act as a team player and share knowledge with existing team members. Key Requirements and Technology Experience: Key skills: Python, AWS, Snowflake. Bachelor's degree in computer science or a related field. Minimum 5 years of experience in building data-driven solutions. At least 3 years of experience working with AWS services. Applicants must be authorized to work in the US without requiring employer sponsorship currently or in the future. U.S. FinTech does not offer H-1B sponsorship for this position. Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms and ETL/ELT tools is good to have. Strong scripting experience using Python and SQL. Working knowledge of foundational AWS compute, storage, networking and IAM. Understanding of Gen AI models, prompt engineering, RAG, fine-tuning and pre-tuning will be a plus. Solid scripting experience in AWS using Lambda functions.
Knowledge of CloudFormation templates preferred. Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Client. Experience in building data pipelines with a related understanding of data ingestion and transformation of structured, semi-structured and unstructured data across cloud services. Knowledge and understanding of data standards and principles to drive best practices around data management activities and solutions. Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, Kafka etc. Strong understanding of data security - authorization, authentication, encryption, and network security. Hands-on experience in using and extending machine learning frameworks and libraries, e.g., scikit-learn, PyTorch, TensorFlow, XGBoost etc. preferred. Experience with the AWS SageMaker family of services or similar tools to develop machine learning models preferred. Strong written and verbal communication skills to facilitate meetings and workshops to collect data, functional and technology requirements, document processes, data flows, gap analysis, and associated data to support data management/governance related efforts. Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures. Demonstrated ability to be self-directed with excellent organization, analytical and interpersonal skills, and consistently meet or exceed deadline deliverables. Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions. Our client is a leading financial industry firm, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration. Pyramid Consulting, Inc.
provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
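As a rough illustration of the "transformation of structured, semi-structured and unstructured data" this posting describes, the sketch below shows one hypothetical pipeline transform step in plain Python: flattening semi-structured JSON events into tabular rows. The event schema, field names, and the "free" default are all invented for illustration, not taken from any real system.

```python
import json

# Hypothetical transform step: flatten nested JSON events into flat rows
# suitable for loading into a warehouse table. Schema is invented.

raw_events = [
    '{"id": 1, "user": {"name": "ana", "plan": "pro"}, "ts": "2024-01-01"}',
    '{"id": 2, "user": {"name": "bo"}, "ts": "2024-01-02"}',  # missing "plan"
]

def flatten(event_json):
    """Parse one JSON event and emit a flat row, defaulting missing fields."""
    e = json.loads(event_json)
    user = e.get("user", {})
    return {
        "id": e["id"],
        "user_name": user.get("name"),
        "plan": user.get("plan", "free"),  # semi-structured: field may be absent
        "ts": e["ts"],
    }

rows = [flatten(e) for e in raw_events]
print(rows[1]["plan"])  # "free"
```

The point of the defaulting logic is that semi-structured sources do not guarantee every field on every record, so the transform, not the downstream consumer, decides how gaps are filled.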
    $64-68 hourly 3d ago
  • Cloud Data Architect

    Infocepts 3.7company rating

    Data engineer job in McLean, VA

Purpose: As a Cloud Data Architect, you'll be at the forefront of innovation - guiding clients and teams through the design and implementation of cutting-edge solutions using Databricks, modern data platforms, and cloud-native technologies. In this role, you won't just architect solutions - you'll help grow a thriving Analytics & Data Management practice, act as a trusted Databricks SME, and bring a business-first mindset to every challenge. You'll have the opportunity to lead delivery efforts, build transformative data solutions, and cultivate strategic relationships with Fortune 500 organizations. Key Result Areas and Activities: Architect and deliver scalable, cloud-native data solutions across various industries. Lead data strategy workshops and AI/ML readiness assessments. Develop solution blueprints leveraging Databricks (Lakehouse, Delta Lake, MLflow, Unity Catalog). Conduct architecture reviews and build proof-of-concept (PoC) prototypes on platforms like Databricks, AWS, Azure, and Snowflake. Engage with stakeholders to define and align future-state data strategies with business outcomes. Mentor and lead data engineering and architecture teams. Drive innovation and thought leadership across client engagements and internal practice areas. Promote FinOps practices, ensuring cost optimization within multi-cloud deployments. Support client relationship management and engagement expansion through consulting excellence. Roles & Responsibilities. Essential Skills: 10+ years of experience designing and delivering scalable data architecture and solutions. 5+ years in consulting, with demonstrated client-facing leadership. Expertise in the Databricks ecosystem including Delta Lake, Lakehouse, Unity Catalog, and MLflow. Strong hands-on knowledge of cloud platforms (Azure, AWS, Databricks, and Snowflake). Proficiency in Spark and Python for data engineering and processing tasks. Solid grasp of enterprise data architecture frameworks such as TOGAF and DAMA.
Demonstrated ability to lead and mentor teams, manage multiple projects, and drive delivery excellence. Excellent communication skills with proven ability to consult and influence executive stakeholders. Desirable Skills: Recognized thought leadership in emerging data and AI technologies. Experience with FinOps in multi-cloud environments, particularly with Databricks and AWS cost optimization. Familiarity with data governance and data quality best practices at the enterprise level. Knowledge of DevOps and MLOps pipelines in cloud environments. Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or related fields. Professional certifications in Databricks, AWS, Azure, or Snowflake preferred. TOGAF, DAMA, or other architecture framework certifications are a plus. Qualities: Self-motivated and focused on delivering outcomes for a fast-growing team and firm. Able to communicate persuasively through speaking, writing, and client presentations. Able to consult, write, and present persuasively. Able to work in a self-organized and cross-functional team. Able to iterate based on new information, peer reviews, and feedback. Able to work with teams and clients in different time zones. Research-focused mindset. "Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
    $87k-121k yearly est. 3d ago

Learn more about data engineer jobs


What are the top employers for data engineer in VA?

Top 10 Data Engineer companies in VA

  1. Capital One

  2. Booz Allen Hamilton

  3. Accenture

  4. CapTech

  5. Ernst & Young

  6. Leidos

  7. Guidehouse

  8. Amazon

  9. Infinitive

  10. USM Business Systems

