Data Engineer
Data engineer job in McLean, VA
Immediate need for a talented Data Engineer. This is a 12-month contract opportunity with long-term potential, located in McLean, VA (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-93504
Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services (see the sketch after this list).
Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning.
Develop backend and automation tools using Golang and/or Python as needed.
Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
Perform root-cause analysis and implement automation to prevent recurring issues.
Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
Ensure compliance with enterprise governance, data quality, and cloud security standards.
Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.
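For candidates who want a concrete picture of this work, here is a minimal sketch of the kind of PySpark pipeline and window-function logic described above; the S3 paths, table, and column names are illustrative assumptions, not details of the client's environment.

```python
# Illustrative only: a windowed PySpark transformation of the kind a
# Snowflake window-function query would also express. All paths and
# column names below are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-pipeline").getOrCreate()

# Read semi-structured input landed in S3 (hypothetical bucket).
txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Rank each customer's transactions by amount within each day and keep
# the top three: a classic row_number() window pattern.
w = Window.partitionBy("customer_id", "txn_date").orderBy(F.desc("amount"))
top_txns = (
    txns.withColumn("rank_in_day", F.row_number().over(w))
        .filter(F.col("rank_in_day") <= 3)
)

top_txns.write.mode("overwrite").parquet("s3://example-bucket/curated/top_txns/")
```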
Key Requirements and Technology Experience:
Skills: Python, Spark/PySpark, Golang, Java, AWS (Glue, EC2, Lambda); able to write complex SQL queries against Snowflake tables and troubleshoot issues.
Proficiency in Python with experience building scalable data pipelines or ETL processes.
Strong hands-on experience with Spark/PySpark for distributed data processing.
Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning.
Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
Experience with Golang for scripting, backend services, or performance-critical processes.
Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
Familiarity with CI/CD workflows, Git, and automated testing.
Our client is a leader in the banking and financial services industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Scientist with ML
Data engineer job in Reston, VA
Kavaliro is seeking a Data Scientist to provide highly technical and in-depth data engineering support.
MUST have: experience with Python and PyTorch; Flask (working knowledge at minimum, with the ability to pick it up quickly); familiarity with REST APIs (at minimum); a statistics background; and a basic understanding of NLP.
Desired skills include experience performing R&D with natural language processing, deploying CNNs, LLMs, or foundation models, deploying ML models on multimedia data, Linux system administration (or Bash), Android configuration, and embedded systems (Raspberry Pi).
Required Skills and Demonstrated Experience
Demonstrated experience in Python, JavaScript, and R.
Demonstrated experience employing machine learning and deep learning libraries such as pandas, scikit-learn, TensorFlow, and PyTorch.
Demonstrated experience with statistical inference, as well as building and understanding predictive models, using machine learning methods.
Demonstrated experience with large-scale text analytics.
Desired Skills
Demonstrated hands-on experience performing research or development with natural language processing and working with, deploying, and testing Convolutional Neural Networks (CNNs), large language models (LLMs), or foundation models.
Demonstrated experience developing and deploying testing and verification methodologies to evaluate algorithm performance and identify strategies for improvement or optimization.
Demonstrated experience deploying machine learning models on multimedia data, to include joint text, audio, video, hardware, and peripherals.
Demonstrated experience with Linux System Administration and associated scripting languages (Bash).
Demonstrated experience with Android configuration, software development, and interfacing.
Demonstrated experience in embedded systems (Raspberry Pi).
Develops and conducts independent testing and evaluation methods on research-grade algorithms in applicable fields.
Reports results and provides documentation and guidance on working with the research-grade algorithms.
Evaluates, integrates, and leverages internally hosted data science tools.
Customizes research-grade algorithms for memory and computational efficiency through quantization, layer trimming, or custom methods.
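As a purely illustrative aside, the sketch below shows one such optimization, PyTorch dynamic quantization; the toy model is an assumption standing in for a research-grade network.

```python
# Hypothetical sketch: shrinking a model with PyTorch dynamic
# quantization, one of the optimization approaches noted above.
import torch
import torch.nn as nn

model = nn.Sequential(      # stand-in for a research-grade network
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Convert Linear layers to int8 weights for memory/compute savings.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)   # same interface, smaller footprint
```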
Location:
Reston, Virginia
This position is onsite and there is no remote availability.
Clearance:
Active TS/SCI with Full Scope Polygraph
Applicants MUST hold U.S. citizenship for this position in accordance with government contract requirements.
Kavaliro provides Equal Employment Opportunities to all employees and applicants. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Kavaliro is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Kavaliro will take the steps to assure that people with disabilities are provided reasonable accommodations. Accordingly, if reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please respond to this posting to connect with a company representative.
Data Scientist
Data engineer job in Columbia, MD
Data Scientist - Transit Data Focus | Columbia, MD (on-site/hybrid) | Contract (6 months)
Data Scientist - Transit Data Focus
Employment type: Contract
Duration: 6 Months
Justification: To manage and analyze customer databases, AVA (automated voice announcement) data, and schedule data for predictive maintenance and service planning.
Experience Level: 3-5 years
Job Responsibilities:
Collect, process, and analyze transit-related datasets including customer databases, AVA (automated voice announcement) logs, real-time vehicle data, and schedule data.
Develop predictive models and data-driven insights to support maintenance forecasting, service planning, and operational optimization.
Design and implement data pipelines to integrate, clean, and transform large, heterogeneous transit data sources.
Perform statistical analysis and machine learning to identify patterns, trends, and anomalies relevant to transit service performance and reliability.
Collaborate with transit planners, maintenance teams, and IT staff to translate data insights into actionable business strategies.
Monitor data quality and integrity; implement data validation and cleansing processes.
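To make that last responsibility concrete, here is a minimal sketch of a validation and cleansing pass in pandas; the file name and column schema are assumptions invented for the example.

```python
# Hypothetical AVA log cleansing: file and column names are assumptions.
import pandas as pd

ava = pd.read_csv("ava_logs.csv", parse_dates=["announced_at"])

ava = ava.drop_duplicates(subset=["vehicle_id", "announced_at"])
before = len(ava)
ava = ava[ava["stop_id"].notna()]                      # cleanse missing stop IDs
ava = ava[ava["announced_at"].dt.hour.between(4, 23)]  # validate service-day window
print(f"removed {before - len(ava)} invalid rows; {len(ava)} remain")
```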
Technical Skills & Qualifications:
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Transportation Engineering, or a related quantitative field.
3-5 years of experience working as a data scientist or data analyst, preferably in a transit, transportation, or public sector environment.
Strong proficiency in Python or R for data analysis, statistical modeling, and machine learning.
Experience with SQL for database querying, manipulation, and data extraction.
Familiarity with transit data standards such as GTFS, AVL/CAD, APC (Automated Passenger Counters), and AVA systems.
Experience with data visualization tools such as Power BI or equivalent.
Cloud Data Engineer- Databricks
Data engineer job in McLean, VA
Purpose:
We are seeking a highly skilled Cloud Data Engineer with deep expertise in Databricks and modern cloud platforms such as AWS, Azure, or GCP. This role is ideal for professionals who are passionate about building next-generation data platforms, optimizing complex data workflows, and enabling advanced analytics and AI in cloud-native environments. You'll have the opportunity to work with Fortune 500 organizations in data and analytics, helping them unlock the full potential of their data through innovative, scalable solutions.
Key Result Areas and Activities:
Design and implement robust, scalable data engineering solutions.
Build and optimize data pipelines using Databricks, including serverless capabilities, Unity Catalog, and Mosaic AI.
Collaborate with analytics and AI teams to enable real-time and batch data workflows.
Support and improve cloud-native data platforms (AWS, Azure, GCP).
Ensure adherence to best practices in data modeling, warehousing, and governance.
Contribute to automation of data workflows using CI/CD, DevOps, or DataOps practices.
Implement and maintain workflow orchestration tools like Apache Airflow and dbt.
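As an illustration of that last item, a minimal Airflow 2.x DAG might look like the following; the DAG and task names are invented, and the Databricks/dbt calls are stubbed with plain Python tasks.

```python
# Illustrative Airflow 2.x DAG; names and schedule are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("trigger the Databricks/dbt transformation job")

with DAG(
    dag_id="daily_lakehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```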
Roles & Responsibilities
Essential Skills
4+ years of experience in data engineering with a focus on scalable solutions.
Strong hands-on experience with Databricks in a cloud environment.
Proficiency in Spark and Python for data processing.
Solid understanding of data modeling, data warehousing, and architecture principles.
Experience working with at least one major cloud provider (AWS, Azure, or GCP).
Familiarity with CI/CD pipelines and data workflow automation.
Desirable Skills
Direct experience with Unity Catalog and Mosaic AI within Databricks.
Working knowledge of DevOps/DataOps principles in a data engineering context.
Exposure to Apache Airflow, dbt, and modern data orchestration frameworks.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Relevant certifications in cloud platforms (AWS/Azure/GCP) or Databricks are a plus.
Qualities:
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work seamlessly with clients across multiple geographies
Research focused mindset
Excellent analytical, presentation, reporting, documentation and interactive skills
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
Data Architect
Data engineer job in Arlington, VA
• Functions as the primary technical architect for data warehousing projects to solve business intelligence challenges
• Possesses deep technical expertise in database design, ETL (OWB/ODI), reporting, and analytics
• Previous consulting experience utilizing an agile delivery methodology
Position Requirements
• Must have expertise as both a solutions architect and an AI architect.
• 3+ years of experience with Azure ETL processing
• 3+ years of experience utilizing data warehousing methodologies and processes
• Strong conceptual, analytical, and decision-making skills
• Knowledge of and experience with dimensional modeling
• Strong knowledge of Azure Databricks
• Proficiency in creating PL/SQL packages
• Full SDLC and Data Modeling experience
• Ability to create both logical and physical data models
• Ability to tune databases for maximum performance
• Experience in Data Preparation: Data Profiling, Data Cleansing, and Data Auditing
• Ability to work with Business Analysts to create functional specifications and data
• Manages QA functions
• Develops unit, system, and integration test plans and manages execution
• Ability to write technical and end-user system documentation
• Excellent written and oral communication skills
• Experience transforming logical business requirements into appropriate schemas and models
• Ability to analyze and evaluate moderate to highly complex information systems by being able to interpret such devices as Entity Relation Diagrams, data dictionaries, record layouts, and logic flow diagrams
Senior Data Engineer
Data engineer job in McLean, VA
The candidate must have 5+ years of hands-on experience working with PySpark/Python, microservices architecture, AWS EKS, SQL, Postgres, DB2, Snowflake, Behave or Cucumber frameworks, Pytest (unit testing), automation testing, and regression testing.
Experience with tools such as Jenkins, SonarQube AND/OR Fortify are preferred for this role.
Experience with Angular and DevOps is a nice-to-have for this role.
Must Have Qualifications: PySpark/Python-based microservices, AWS EKS, Postgres SQL database, Behave/Cucumber for automation, Pytest, Snowflake, Jenkins, SonarQube, and Fortify.
Responsibilities:
Development of microservices based on Python, PySpark, AWS EKS, AWS Postgres for a data-oriented modernization project.
New System: Python and PySpark, AWS Postgres DB, Behave/Cucumber for automation, and Pytest
Perform system, functional, and data analysis on the current system and create technical/functional requirement documents.
Current System: Informatica, SAS, AutoSys, DB2
Write automated tests using Behave/Cucumber, based on the new microservices-based architecture (see the sketch after this list)
Promote high code quality and resolve issues related to performance tuning and scalability.
Strong skills in DevOps, Docker/container-based deployments to AWS EKS using Jenkins and experience with SonarQube and Fortify.
Able to communicate and engage with business teams and analyze the current business requirements (BRS documents) and create necessary data mappings.
Strong skills and experience in reporting application development and data analysis preferred
Knowledge in Agile methodologies and technical documentation.
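For illustration only, a Behave step-definition file for such tests might look like the sketch below; the feature wording and the toy enrichment logic are assumptions, not the client's actual scenarios.

```python
# steps/transactions_steps.py - hypothetical Behave step definitions.
from behave import given, when, then

@given("a transactions table with {count:d} rows")
def step_given_table(context, count):
    context.rows = [{"id": i, "amount": 10.0} for i in range(count)]

@when("the enrichment job runs")
def step_run_job(context):
    # Stand-in for the real PySpark job: convert dollars to cents.
    context.result = [dict(r, amount_cents=int(r["amount"] * 100)) for r in context.rows]

@then("every row has an amount in cents")
def step_check(context):
    assert all(r["amount_cents"] == 1000 for r in context.result)
```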
Lead Principal Data Solutions Architect
Data engineer job in Reston, VA
*****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZENS*****
***** TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA*****
Formed in 2011, Inadev is focused on its founding principle to build innovative customer-centric solutions incredibly fast, secure, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations are comprised of codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation.
POSITION DESCRIPTION:
Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
PROGRAM DESCRIPTION:
This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards.
RESPONSIBILITIES:
Lead system architecture decisions, ensure technical alignment across teams, and advocate for best practices in cloud and data engineering.
Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation.
Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards.
Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
NON-TECHNICAL REQUIREMENTS:
Must be a U.S. Citizen.
Must be willing to work a HYBRID schedule (2-3 days) in Reston, VA and client locations in the Northern Virginia/DC/MD area as required.
Ability to pass a 7-year background check and obtain/maintain a U.S. Government Clearance.
Strong communication and presentation skills.
Must be able to prioritize and self-start.
Must be adaptable/flexible as priorities shift.
Must be enthusiastic and have passion for learning and constant improvement.
Must be open to collaboration, feedback and client asks.
Must enjoy working with a vibrant team of outgoing personalities.
MANDATORY REQUIREMENTS/SKILLS:
Bachelor of Science degree in Computer Science, Engineering or related subject and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture.
Experience within the Federal Government, specifically DHS, is preferred.
Must possess demonstrable experience with Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), Unity Catalog governance framework, and enterprise-level integration patterns using Databricks workflows and Auto Loader.
Knowledge of and ability to organize technical execution of Agile Analytics using Databricks Repos, Jobs, and collaborative notebooks, proven by professional experience.
Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping).
Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management (see the sketch after this list).
Certification in one or more: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications.
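Purely as illustration of the DLT style named above, a declarative pipeline fragment might read as follows; the paths, table names, and expectation are assumptions, and the code runs only inside a Databricks DLT pipeline (where `spark` is provided).

```python
# Hypothetical Delta Live Tables pipeline: Bronze ingest via Auto Loader,
# then a Silver table with a data-quality expectation.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw case events loaded with Auto Loader")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")      # `spark` is injected by DLT
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/case_events")              # hypothetical landing path
    )

@dlt.table(comment="Silver: validated events")
@dlt.expect_or_drop("valid_id", "case_id IS NOT NULL")  # drop rows failing the rule
def silver_events():
    return dlt.read_stream("bronze_events").withColumn(
        "ingested_at", F.current_timestamp()
    )
```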
DESIRED REQUIREMENTS/SKILLS:
Expertise in ETL tools.
Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus).
Proficiency in SQL, PL/SQL, and performance tuning for large datasets.
Understanding of security frameworks and compliance standards in federal environments.
PHYSICAL DEMANDS:
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.
DevOps Engineer
Data engineer job in McLean, VA
The candidate should be able to drive implementation and improvement of tools and technologies for enterprise adoption in accordance with operational and security standards.
Practice and promote a Site Reliability Engineering (SRE) culture to improve and operate cloud platform offerings to the enterprise while working toward innovation, automation, and operational excellence.
Automation experience is a must for this position.
Ability to provide 24x7 operational support on a periodic basis and involvement in issue resolution are a must.
Must Have Qualifications:
Must have 5+ years of hands-on experience with AWS CloudFormation and Terraform. Automation through shell scripting and Python is required (Ansible nice to have). 3+ years of experience with EKS and Kubernetes.
Technical expertise:
7+ years of overall information technology experience with an emphasis on integration and delivery of virtual/cloud platforms to enterprise applications.
At least 5 years of proven experience with AWS CloudFormation, Terraform, or similar tools.
3+ years of experience with engineering and supporting containerization technology (OpenShift, Kubernetes, AWS ECS/EKS, etc.) at scale.
Experience in Python, Ansible, and shell scripting to automate routine operational tasks (see the sketch after this list).
Experience with Tetrate, Rancher, and ArgoCD is highly preferred.
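To illustrate the routine-operations automation mentioned above, a small boto3 script might look like this sketch; the tag policy and volume filter are assumptions for the example.

```python
# Hypothetical ops task: tag any untagged, unattached EBS volumes.
import boto3

ec2 = boto3.client("ec2")

volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]

for vol in volumes:
    if not vol.get("Tags"):
        ec2.create_tags(
            Resources=[vol["VolumeId"]],
            Tags=[{"Key": "owner", "Value": "platform-team"}],  # assumed tag policy
        )
        print(f"tagged {vol['VolumeId']}")
```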
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ***********************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Aishwarya Chandra
Email: ****************************************
Job ID: 25-53450
Senior DevOps Engineer
Data engineer job in Columbia, MD
Veteran-Owned Firm Seeking a Senior DevOps Engineer with TS/SCI with a Full Scope Polygraph for an Onsite Role in Columbia, MD.
My name is Stephen Hrutka. I lead a Veteran-Owned management consulting firm in Washington, DC. We specialize in Technical and Cleared Recruiting for the Department of Defense (DoD), the Intelligence Community (IC), and other advanced defense agencies.
At HRUCKUS, we support fellow Veteran-Owned businesses by helping them recruit for positions across organizations such as the VA, SBA, HHS, DARPA, and other leading-edge R&D-focused defense agencies.
We seek to fill a Senior DevOps Engineer role in Columbia, MD.
The ideal candidate is a Columbia, MD resident with an active TS/SCI with a Full Scope Polygraph clearance, over 17 years of experience in DevOps, and expertise in Git, platform architecture, automation, and managing cloud infrastructure (AWS/PaaS) across both Linux and Windows environments. A minimum of five years' experience in Agile development and hands-on management of AWS/PaaS is essential.
If you're interested, I'll gladly provide more details about the role and discuss your qualifications further.
Thanks,
Stephen M Hrutka
Principal Consultant
HRUCKUS LLC
Executive Summary: HRUCKUS is looking for an experienced Senior DevOps Engineer to support a mission-critical program for the Warfighter within the National Security Sector. The scope includes platform design, building and maintaining secure release pipelines, automating deployments, and optimizing system infrastructure.
Position Description: The Senior DevOps Engineer will be responsible for designing, implementing, and maintaining the infrastructure and tools that enable our software development teams to build, test, and release software faster and more reliably. This role involves working with a variety of technologies, including cloud platforms, containerization, and configuration management tools. The ideal candidate will have a deep understanding of CI/CD principles and a passion for automation.
Position Responsibilities:
Support the development lifecycle, including platform design, deployment, and debugging.
Build and maintain a release pipeline for fast, secure delivery to production.
Automate deployments across environments using scripting languages and toolkits.
Configure sites/applications via tools like Puppet and Ansible, and maintain Confluence/Jira software.
Assist in designing/maintaining web service infrastructure and deployments.
Identify and implement process improvements through automation and streamlining.
Required Skills:
Must have an active TS/SCI with Full Scope Polygraph clearance.
Bachelor's degree with a minimum of 17 years of relevant experience, OR a Master's degree with 15 years of relevant experience.
5+ years in Agile development environments.
5+ years managing AWS cloud or virtualized servers in PaaS environments.
Fluent with Git and version control best practices.
Strong knowledge of Linux environments (RHEL 6/7/8, CentOS) and Windows system administration.
Expertise with caching technologies (e.g., Memcache, ActiveMQ, Redis, APC), MySQL, and Elasticsearch.
Proficiency in at least one programming language (Ruby, C/C++, Go, Python, or Java).
Familiarity with security practices, networking protocols, firewalls, and PCI compliance.
Desired Skills:
Experience integrating Jenkins/Bamboo, Docker, and Kubernetes for automated deployment.
Kibana a plus.
Details:
Job Title: Senior DevOps Engineer
Location: Columbia, MD 21045
Work Arrangement: Full Time, Onsite
Clearance Requirement: Active TS/SCI with a Full Scope Polygraph
Compensation: $148,000 to $269,000 per year.
Please note: Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
We offer competitive benefits, including: Paid Time Off, 11 Paid Holidays, 401K with a 6% company match and immediate vesting, Flexible Schedules, Discounted Stock Purchase Plans, Technical Upskilling, Education and Training Support, and Parental Paid Leave
AWS DevOps Engineer
Data engineer job in Reston, VA
About the Company
The Data Engineering and Advanced Analytics Enablement team is seeking an experienced AWS DevOps Engineer to lead the enablement of analytics tools on a cloud-native architecture.
About the Role
This role will design, build, and maintain infrastructure that supports next-generation analytics platforms, leveraging best practices in Infrastructure as Code (IaC), high availability, fault tolerance, and operational excellence for COTS deployments in AWS. Expertise in Amazon EKS, CI/CD automation, and scripting is essential.
Responsibilities
Drive technical requirements gathering and solution design sessions with engineers, architects, and product managers in an Agile environment.
Design, build, and deploy infrastructure for analytics COTS tools (e.g., Posit/RStudio, Anaconda Distribution) on AWS.
Develop and automate scripts for deploying and configuring software components, from large-scale COTS products to custom microservices.
Implement Infrastructure as Code using AWS services and tools such as Terraform, CloudFormation, CodePipeline, Lambda (R/Python), CloudWatch, Route53, S3, and more (see the sketch after this list).
Architect and manage Amazon EKS clusters for containerized workloads, ensuring scalability and security.
Continuously improve infrastructure provisioning automation, CI/CD pipelines, and operational excellence in the cloud.
Monitor and optimize system performance in AWS for reliability, cost efficiency, and resilience.
Conduct peer reviews and enhance existing infrastructure for scalability and fault tolerance.
Participate in on-call rotations for critical outages and incidents.
Maintain comprehensive technical documentation, including system diagrams and operational procedures.
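As one hedged illustration of scripting such IaC deployments, the fragment below drives a CloudFormation stack from Python with boto3; the stack and template names are invented for the example.

```python
# Hypothetical scripted deployment of a CloudFormation template.
import boto3

cfn = boto3.client("cloudformation")

with open("eks-cluster.yaml") as f:          # assumed template file
    template = f.read()

cfn.create_stack(
    StackName="analytics-eks-dev",           # assumed stack name
    TemplateBody=template,
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
cfn.get_waiter("stack_create_complete").wait(StackName="analytics-eks-dev")
print("stack created")
```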
Qualifications
7+ years of IT experience, including software development and DevOps functions.
5+ years of experience building and maintaining CI/CD tooling (GitLab Pipelines, Jenkins Pipelines, Bitbucket, GitHub) and creating/extending CI/CD environments via Terraform and CloudFormation.
3+ years of production experience with core AWS services (EKS, EC2, S3, RDS, API Gateway, ALB, ELB, Lambda, etc.).
3+ years of hands-on experience with Amazon EKS and container orchestration.
3+ years of Unix/Linux system administration experience.
Proficiency in Python (preferred) and R.
Strong automation scripting skills in Bash, Shell, Python, and familiarity with Java, JavaScript, Ansible, Perl.
Experience supporting web technologies and websites running Apache or NGINX.
Familiarity with open-source web service environments (Java, REST, SOAP).
Working experience with Confluence, Jira SaaS, SharePoint, and other collaboration tools.
Required Skills
Deep understanding of the AWS Well-Architected Framework.
Strong analytical, organizational, and problem-solving skills.
Excellent verbal and written communication abilities.
Effective teamwork, planning, and coordination skills.
Self-motivated, adaptable, and capable of meeting aggressive deadlines.
Ability to independently research and resolve technical challenges in complex IT environments.
DevOps Engineer
Data engineer job in Reston, VA
TechTrend is seeking a DevOps engineer to join our team of talented staff. Using a modern technology stack, AI, and an agile approach, you will have the opportunity to be part of a workforce that is transforming IT through technology modernization efforts. Be a member of an environment that thrives on innovation and the ideas of all members of the team. Our environment is highly collaborative and allows you the opportunity to continue building your skills through training and multiple diverse projects.
Duties and Responsibilities:
· Install and maintain CI/CD tools on Azure
· Patch and upgrade operating systems and applications to remediate vulnerabilities and enable new features.
· Troubleshoot performance issues and resolve tickets to meet service level agreements
· Assist in the design, development, deployment, and maintenance of cloud infrastructure automation solutions using Terraform and Ansible.
· Experience developing containerized applications using Docker and Kubernetes.
· Demonstrated experience with SonarQube is a plus.
· Work closely with both the development and operations groups; a good scripting or programming background is expected.
· Comfortable working independently to manage projects and tasks to completion.
· Good knowledge of Unix operating systems (Ubuntu) and Windows servers.
· Demonstrated ability to write in at least one scripting and/or programming language, such as Shell, Python, PowerShell, or equivalent.
· Experience in deploying and managing resources on Azure (preferred) and GCP.
Other Qualifications
· Bachelor's degree in an IT-related field
· 3+ years in DevOps. DevOps tools in PaaS and/or IaaS are a plus
· 2+ years of systems administration experience
· Azure certification (preferably in DevOps)
· Experience with one or more of these CI/CD tools: GitHub, Jenkins, SonarQube, Jira, Jfrog, Docker, Ansible, ArgoCD, Atlassian suite, etc.
· Familiarity with FedRAMP and CMMC to ensure the environment is compliant with NIST 800-53 Rev 5.
· Ability to diagram and develop documentation of the environment, CI/CD tools and other configuration items
Job Location
Hybrid (3 times a week) - Must live within commutable distance of Reston, VA
Competitive Benefits:
Medical, Dental & Vision coverage
Life Insurance
Short, Long Term Disability Insurance
PTO & Federal Holidays Off
401(k) Plan (Matching component included)
About TechTrend
TechTrend, Inc. is a veteran-friendly small business providing expert solutions, products, and services to the Federal government. Founded in 2003, we continue to evolve with capabilities in application development, artificial intelligence, DevSecOps, and cloud enablement. We are a Microsoft Gold Partner and leading provider of Azure cloud services, including partnerships with AWS and GCP. TechTrend is recognized as a trusted partner delivering knowledge and guidance for our clients' most critical and complex support and service needs. As a liaison for positive organizational change, we form relationships and build bridges while ensuring quality across functions, gaining buy-in from both leaders and end-users, and removing barriers to mission success. Our established processes ensure quality delivery of results by maximizing efficiency, productivity, and client satisfaction enterprise-wide.
UiPath Engineer
Data engineer job in McLean, VA
Need Only Local Candidates from Nearby Area.
Top Skills Must:
UiPath
Document Understanding
Python
Developer Role and Responsibilities
Your specific duties will be based on your experience as a UiPath developer. In this role, you will be responsible for designing and delivering UiPath solutions in accordance with WonderBotz standards and best practices. You will work closely with our enthusiastic team of both business and technical specialists. You will be part of a fast-growing and successful team that helps our clients get the maximum benefit.
Expected Activities:
• Support development of UiPath strategies, including assessing opportunities
• Under the supervision of more experienced developers, define, design, and develop automation on UiPath platforms for clients, including POCs, pilots, and production automation. More senior developers will be expected to work independently.
• Participate in workshops and interviews with business process SMEs to gather and confirm business process details and document process definitions. More senior developers will lead these workshops and interviews.
• Participate in design and configuration sessions and apply feedback to improve and enhance work products. More senior developers will lead these sessions.
• Work alongside newly trained developers to guide and mentor them.
Qualifications and Skills
• Have mastered, or have a strong desire to master, a leading RPA tool (UiPath a must; Blue Prism or Automation Anywhere also relevant), including advanced RPA vendor certification.
• At least one year of hands-on experience with at least one of the following languages or technologies (e.g., .NET, Java, VB, C#/C, HTML/CSS, Python, web services, mainframe, web applications, SQL, data integration tools, technical automation tools). More senior developers should have a minimum of 2 to 4 years of this hands-on experience.
• Reasonable proficiency in reading Microsoft Visio or another equivalent process flow-charting tool or workflow-based logic.
• Extra - Any prior work or academic experience with document management and processing tools (e.g., Kofax, ABBYY, Datacap), data integration tools (e.g., Informatica, Microsoft SSIS), technical automation tools (e.g., shell scripting, PHP), or business process management tools (e.g., Pega).
Pega CDH Senior Developer
Data engineer job in McLean, VA
A Fortune 50 financial services company is seeking a highly motivated Pega CDH Senior Developer to join their team in the McLean, Virginia area.
Responsibilities:
Design and develop CDH components, including strategies, treatments, predictions, and engagement policies
Configure Next-Best-Action Designer and associated decision flows
Integrate CDH with enterprise data sources and outbound/real-time channels
Participate in Agile ceremonies, estimation, and sprint delivery
Troubleshoot performance and configuration issues
Collaborate with Decisioning Architects, Data Science, and Marketing
Qualifications:
4+ years of Pega Development experience in enterprise environments
Strong experience with Pega CDH, NBA Designer, adaptive models, and real-time decisioning
Hands-on experience with Pega Strategy Designer and arbitration logic
Experience integrating systems using REST/SOAP APIs, Kafka, MQ, or similar tools
Strong understanding of customer profile data, segmentation logic, and propensity modeling
Ability to troubleshoot complex CDH and performance issues
Experience working in Agile teams and contributing to sprint planning and refinement
Strong documentation and communication skills
Desired Skills:
PCDC certification
Experience in the Financial Services and/or Regulated industries
Senior Java Software Engineer
Data engineer job in McLean, VA
Java developer
Note: Ex-Capital One preferred.
JD-
We are looking for a Java developer with Angular and AWS experience (backend Java, Node).
Senior Frontend Developer
Data engineer job in McLean, VA
🔷 Now Hiring: Senior Front-End Engineer (Angular) - Financial Services (W2 Contract) 🔷
🕒 Employment Type: W2 Contract (Through Sub-Vendor Only)
🗓 Interview: 1 Round | 60 mins | In-Person (Targeting week of Jan 12)
📌 Shortlisting Deadline: January 7
⚠️ IMPORTANT - READ BEFORE APPLYING
Please include the completed candidate template and responses to vetting questions at the top of the resume.
👉 Resumes without completed templates and candidate-provided vetting responses will NOT be considered.
MSP Owner: Jasmine Acuna
Work Authorization: Open to sponsored candidates via sub-vendors only (one layer deep). Candidates must be W2 employees of the sub-vendor.
🧩 Role Overview
Our financial services client is seeking a Senior Front-End Engineer with a strong Angular (v16+) focus to design, build, and maintain a shared UI component library. This role emphasizes UI standards, accessibility, performance, and collaboration with UX and backend teams.
🔧 Key Responsibilities
Design, develop, and maintain reusable Angular components (v16+) within a shared component library
Collaborate closely with UX/UI designers to implement complex design systems and user experiences
Apply advanced HTML/CSS techniques (Flexbox, Grid, theming, responsive design)
Ensure components meet WCAG accessibility standards
Package, test, version, and publish libraries to Artifactory
Implement and maintain CI/CD pipelines for build, test, and deployment automation
Write and maintain unit, integration, and end-to-end tests
Support consumers of the component library with integration and troubleshooting
Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives)
Collaborate with backend teams to ensure seamless API integration
Monitor security vulnerabilities and manage upgrades/migrations
Maintain technical documentation and demo applications
✅ Must Have Qualifications
5+ years of experience in front-end/UI development
Strong hands-on experience with Angular (latest 4 versions - v16+)
Proficient in JavaScript, HTML, CSS
Experience building UI frameworks or component libraries
Familiarity with Bitbucket, code reviews, and branching strategies
Experience working in Agile environments
Strong communication and collaboration skills
⭐ Preferred Skills
React experience (nice to have)
Backend exposure: Java, Spring Boot, SQL
Experience with accessibility (WCAG)
Strong documentation and mentoring capabilities
Analytical mindset with strong problem-solving skills
💼 Why Apply
Long-term contract through 2026
High-impact role within a large financial organization
Work on enterprise-scale UI standards and design systems
Competitive hourly rate with stability
Senior Frontend Developer
Data engineer job in McLean, VA
Front End Developer with Bloomreach
Duration: Long Term
Experience: 10+ Yrs
Employment Type: W2
We are seeking a skilled Software Engineer with strong experience in Angular and Bloomreach to design, develop, and maintain modern, scalable web applications. The ideal candidate will collaborate closely with UX designers, backend engineers, and business stakeholders to deliver high-quality, customer-centric digital experiences using Bloomreach CMS and Angular-based front-end architectures.
Key Responsibilities
Design, develop, and maintain responsive web applications using Angular.
Implement and customize Bloomreach CMS (Content / Experience Manager) components and templates.
Integrate Angular applications with Bloomreach APIs and backend services.
Collaborate with product owners, designers, and backend teams to translate business requirements into technical solutions.
Ensure high performance, security, and scalability of applications.
Write clean, reusable, and maintainable code following best practices.
Perform unit testing, integration testing, and support CI/CD pipelines.
Troubleshoot, debug, and resolve application issues in development and production environments.
Participate in code reviews and contribute to continuous improvement initiatives.
Required Skills & Qualifications
Strong experience with Angular (latest versions), TypeScript, HTML5, and CSS3.
Hands-on experience with Bloomreach CMS / Bloomreach Experience Manager.
Experience integrating RESTful APIs and third-party services.
Solid understanding of component-based architecture and state management.
Familiarity with responsive design and cross-browser compatibility.
Experience with version control systems such as Git.
Knowledge of Agile/Scrum development methodologies.
Preferred Skills
Experience with Bloomreach Personalization, Search, or Content APIs.
Exposure to backend technologies (Java, Node.js, or Spring Boot).
Knowledge of CI/CD tools (Jenkins, GitHub Actions, Azure DevOps, etc.).
Understanding of SEO, performance optimization, and accessibility standards.
Experience working in e-commerce or content-driven platforms.
AWS DevSecOps Engineer
Data engineer job in Ellicott City, MD
VITG is seeking a DevSecOps Engineer responsible for automating security integration throughout the CI/CD pipeline and the AWS cloud environment. This role will "shift security left" by taking the lead on implementing security-as-code tools, managing their usage, ensuring their proper configuration and compliance, and proactively embedding security policy into the development process. Our ideal candidate is passionate about being part of a "change," and working in a dynamic and highly collaborative environment focused on speed, stability, and security.
The DevSecOps Engineer provides hands-on expertise to integrate and maintain the security posture for corporate systems that support Federal programs, ensuring a successful program Authority To Operate (ATO). You will be responsible for developing, monitoring, and maintaining systems and procedures to safeguard internal information systems, networks, and CI/CD pipelines through automation.
Applicant Requirements:
US citizen or must be authorized to work in the United States
Must have lived in the USA for three of the last five years
Must be able to obtain a US federal government badge and be eligible for a Public Trust clearance
Must be able to pass a background check, including a drug test
Job Responsibilities:
Develop, implement, and maintain security automation throughout the entire SDLC, integrating security into the CI/CD pipelines using Jenkins/GitHub and Infrastructure-as-Code (IaC) principles.
Run and manage security scans with tools such as Snyk (SAST/SCA) and establish automated tracking and enforcement mechanisms for vulnerability remediation.
Integrate and manage security workloads running on AWS containers and ensure container image scanning and runtime security policies are enforced.
Design, manage, and maintain source code for AWS infrastructure in GitHub and manage automated pipelines, ensuring security checks and gates are embedded in every deployment.
Maintain security information on JIRA/Confluence and actively participate in agile DevSecOps practices, promoting a "Secure-by-Design" culture.
Provide hands-on support for developing, coordinating, implementing, and enforcing information systems security policies, standards, and methodologies as code.
Maintain operational security posture for Enterprise Salesforce FISMA system by ensuring security is baked into configuration and deployment practices.
Implement security tools, security tool usage, and policy-as-code to ensure configurations remain compliant and configured properly, all while ensuring a successful program ATO.
Automate vulnerability/risk assessment analysis to support continuous monitoring and authorization (see the sketch after this list).
Manage changes to the system and assess the security impact of those changes through automated compliance checks.
Assist with the management of security aspects of the information system and perform day-to-day security operations of the system.
Evaluate security solutions to ensure they meet security requirements for processing classified information.
Perform vulnerability/risk assessment analysis to support certification and accreditation.
Prepare and review documentation to include System Security Plans (SSPs), Risk Assessment Reports, Certification and Accreditation (C&A) packages, and System Requirements Traceability Matrices (SRTMs).
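As a hedged sketch of that automation, the snippet below pulls active high-severity findings from AWS Security Hub with boto3; the JIRA sync this role would perform is left as a comment, and the filter values are assumptions.

```python
# Hypothetical vulnerability-tracking pull from AWS Security Hub.
import boto3

sh = boto3.client("securityhub")

findings = sh.get_findings(
    Filters={
        "SeverityLabel": [{"Value": "HIGH", "Comparison": "EQUALS"}],
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
    },
    MaxResults=50,
)["Findings"]

for finding in findings:
    # In this role these would be synced to JIRA for remediation tracking.
    print(finding["Title"], finding["Resources"][0]["Id"])
```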
Qualifications & Skills:
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related discipline
Minimum of 6 years of related experience in Information Technology, including 4 years in the DevSecOps or Application Security (AppSec) space.
Demonstrated hands-on experience in cloud environments such as AWS Commercial and GovCloud, specifically with security automation, logging, and monitoring services (e.g., GuardDuty, Security Hub, CloudTrail).
Expertise in CI/CD pipeline management and the integration of security tools for Static Application Security Testing (SAST), Dynamic Application Security Testing (DAST), and Software Composition Analysis (SCA).
Required: Strong hands-on experience with AWS, Snyk, GitHub, JIRA, and Confluence to implement and manage the end-to-end DevSecOps toolchain.
Demonstrated work experience with Infrastructure-as-Code (IaC) security (e.g., using Checkov or Terrascan on Terraform/CloudFormation).
(Preferred) Experience with Salesforce Platform and tool ecosystem
(Preferred) Salesforce or any other platform tool - Configuration/Setup of External Client Applications and Secure Communications (TLS)
(Preferred) AppOmni - Have used it and can manage issues, perform new org additions and configurations.
Strong background in the certification and accreditation process (ATO) and the ability to automate compliance checks against frameworks like FISMA, NIST, and FedRAMP.
Possesses working knowledge of business security practices, current security automation tools, and policy-as-code implementation.
Demonstrated working knowledge of vulnerability assessment and penetration testing processes, focusing on how to automate these checks.
Experience with Government Agency Security Assessment Process in support of maintaining and/or establishing an ATO and the appropriate boundary.
Experience with, understanding of, and adherence to guidelines such as FISMA, NIST, HIPAA, and IRS Pub 1075 (Preferred)
Preferred Certifications:
AWS DevOps or SysOps certification (or equivalent) required
Industry certifications such as CISSP, CEH, or GIAC preferred
Job Type: Full Time
Salary: BOE
Benefits:
401(k) with employer contribution
Medical/Dental/Vision insurance (option for full coverage for employee)
Life, ST/LT insurance
Professional development opportunities
Schedule:
8-hour shift
May include minimal after hours support depending on deployment schedule
Work Type:
Hybrid remote in Ellicott City, MD 21043
1 to 2 days in office weekly
Senior Software Engineer -- KUMDC5680656
Data engineer job in McLean, VA
Required Technical Skills
(Required)
Strong design and development skills in two or more of the following technologies and tools: Java (3-5 years), Cucumber (3-5 years), JBehave or other BDD testing frameworks
At least 8 years of test automation framework design
Strong experience in testing web services (REST APIs) (3-5 years) (see the sketch after this list)
Proven experience developing test scripts, test cases, and test data
Ability to write SQL queries against relational databases
3+ years of experience in developing scenario based performance testing using JMeter
Experience testing full stack and integration testing with 3rd parties
End-to-end system integration testing experience for software platforms
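For illustration, a REST API test of the kind listed above could be written with pytest and requests as below; the endpoint URL and response fields are assumptions, not the actual service under test.

```python
# Hypothetical REST API test; run with `pytest`.
import requests

BASE_URL = "https://api.example.com"   # assumed service under test

def test_get_account_returns_expected_schema():
    resp = requests.get(f"{BASE_URL}/accounts/42", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert {"id", "status"} <= body.keys()   # schema spot-check
    assert body["id"] == 42
```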
(Desired)
Hands-on experience with Python
Development experience in AWS cloud technology
Experience in TDD, continuous integration, code review practice is strongly desired
Experience with Apigee or other API gateways is a plus
Experience with DevOps concepts and tools (e.g., CI/CD, Jenkins, Git)
At least 2 years working on an Agile team with a solid understanding of Agile/Lean practices.
Understanding of microservices architecture
Experience with load and performance testing
Strong documentation skills
Senior Data Engineer
Data engineer job in McLean, VA
Immediate need for a talented Senior Data Engineer. This is a 6+ month contract opportunity with long-term potential, located in McLean, VA (remote). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-84666
Pay Range: $64 - $68/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Demonstrated ability in implementing data warehouse solutions using modern data platforms such as Client, Databricks, or Redshift.
Build data integration solutions between transaction systems and analytics platforms.
Expand data integration solutions to ingest data from internal and external sources and to further transform as per the business consumption needs.
Develop tasks for a multitude of data patterns, e.g., real-time data integration, advanced analytics, machine learning, BI and reporting.
Fundamental understanding of building data products through data enrichment and ML.
Act as a team player and share knowledge with the existing team members.
Key Requirements and Technology Experience:
Key skills: Python, AWS, Snowflake.
Bachelor's degree in computer science or a related field.
Minimum 5 years of experience in building data driven solutions.
At least 3 years of experience working with AWS services.
Applicants must be authorized to work in the US without requiring employer sponsorship currently or in the future. U.S. FinTech does not offer H-1B sponsorship for this position.
Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is good to have.
Strong scripting experience using Python and SQL.
Working knowledge of foundational AWS compute, storage, networking and IAM.
Understanding of Gen AI models, prompt engineering, RAG, fine-tuning, and pre-training will be a plus.
Solid scripting experience in AWS using Lambda functions (see the sketch after this list).
Knowledge of CloudFormation templates preferred.
Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Client.
Experience in building data pipelines with related understanding of data ingestion, transformation of structured, semi-structured and unstructured data across cloud services.
Knowledge and understanding of data standards and principles to drive best practices around data management activities and solutions.
Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, Kafka etc.
Strong understanding of data security - authorization, authentication, encryption, and network security.
Hands-on experience in using and extending machine learning frameworks and libraries, e.g., scikit-learn, PyTorch, TensorFlow, XGBoost, etc., preferred.
Experience with AWS SageMaker family of services or similar tools to develop machine learning models preferred.
Strong written and verbal communication skills to facilitate meetings and workshops to collect data, functional and technology requirements, document processes, data flows, gap analysis, and associated data to support data management/governance related efforts.
Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures.
Demonstrated ability to be self-directed with excellent organization, analytical and interpersonal skills, and consistently meet or exceed deadline deliverables.
Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions.
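To give a flavor of the Lambda scripting called out above, here is a minimal sketch of a handler that reacts to S3 drops and queues each object for ingestion; the queue URL and event wiring are hypothetical assumptions.

```python
# Hypothetical S3-triggered Lambda that enqueues new objects for ingestion.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/ingest"  # assumed

def handler(event, context):
    """Triggered by S3 put events; enqueues each new object key."""
    records = event.get("Records", [])
    for record in records:
        payload = {
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))
    return {"queued": len(records)}
```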
Our client is a leader in the financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Cloud Data Architect
Data engineer job in McLean, VA
Purpose:
As a Cloud Data Architect, you'll be at the forefront of innovation - guiding clients and teams through the design and implementation of cutting-edge solutions using Databricks, modern data platforms, and cloud-native technologies. In this role, you won't just architect solutions - you'll help grow a thriving Analytics & Data Management practice, act as a trusted Databricks SME, and bring a business-first mindset to every challenge. You'll have the opportunity to lead delivery efforts, build transformative data solutions, and cultivate strategic relationships with Fortune 500 organizations.
Key Result Areas and Activities:
Architect and deliver scalable, cloud-native data solutions across various industries.
Lead data strategy workshops and AI/ML readiness assessments.
Develop solution blueprints leveraging Databricks (Lakehouse, Delta Lake, MLflow, Unity Catalog); the MLflow piece is sketched after this list.
Conduct architecture reviews and build proof-of-concept (PoC) prototypes on platforms like Databricks, AWS, Azure, and Snowflake.
Engage with stakeholders to define and align future-state data strategies with business outcomes.
Mentor and lead data engineering and architecture teams.
Drive innovation and thought leadership across client engagements and internal practice areas.
Promote FinOps practices, ensuring cost optimization within multi-cloud deployments.
Support client relationship management and engagement expansion through consulting excellence.
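As a hedged illustration of the MLflow component named above, the fragment below logs a toy training run so it can be reproduced and compared; the model, dataset, and run name are assumptions invented for the example.

```python
# Hypothetical MLflow tracking of a toy scikit-learn run.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

with mlflow.start_run(run_name="poc-baseline"):        # assumed run name
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")           # logs the model artifact
```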
Roles & Responsibilities
Essential Skills:
10+ years of experience designing and delivering scalable data architecture and solutions.
5+ years in consulting, with demonstrated client-facing leadership.
Expertise in Databricks ecosystem including Delta Lake, Lakehouse, Unity Catalog, and MLflow.
Strong hands-on knowledge of cloud platforms (Azure, AWS, Databricks, and Snowflake).
Proficiency in Spark and Python for data engineering and processing tasks.
Solid grasp of enterprise data architecture frameworks such as TOGAF and DAMA.
Demonstrated ability to lead and mentor teams, manage multiple projects, and drive delivery excellence.
Excellent communication skills with proven ability to consult and influence executive stakeholders.
Desirable Skills
Recognized thought leadership in emerging data and AI technologies.
Experience with FinOps in multi-cloud environments, particularly with Databricks and AWS cost optimization.
Familiarity with data governance and data quality best practices at the enterprise level.
Knowledge of DevOps and MLOps pipelines in cloud environments.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or related fields.
Professional certifications in Databricks, AWS, Azure, or Snowflake preferred.
TOGAF, DAMA, or other architecture framework certifications are a plus.
Qualities:
Self-motivated and focused on delivering outcomes for a fast-growing team and firm
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work with teams and clients in different time zones
Research focused mindset
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."