Assoc Engineer
Data engineer job in Washington, DC
Who We Are: We're powering a cleaner, brighter future. Exelon is leading the energy transformation, and we're calling all problem solvers, innovators, community builders and change makers. Work with us to deliver solutions that make our diverse cities and communities stronger, healthier and more resilient.
We're powered by purpose-driven people like you who believe in being inclusive and creative, and value safety, innovation, integrity and community service. We are a Fortune 200 company, 19,000 colleagues strong serving more than 10 million customers at six energy companies -- Atlantic City Electric (ACE), Baltimore Gas and Electric (BGE), Commonwealth Edison (ComEd), Delmarva Power & Light (DPL), PECO Energy Company (PECO), and Potomac Electric Power Company (Pepco).
In our relentless pursuit of excellence, we elevate diverse voices, fresh perspectives and bold thinking. And since we know transforming the future of energy is hard work, we provide competitive compensation, incentives, excellent benefits and the opportunity to build a rewarding career.
Are you in?
Primary Purpose:
Assists experienced engineers in developing studies, plans, criteria, specifications, calculations, evaluations, design documents, performance assessments, integrated systems analysis, cost estimates, and budgets associated with the planning, design, licensing, construction, operation, and maintenance of Exelon's electric generation, transmission, distribution, gas, and telecommunication facilities/systems. Provides analytical support, consultation, and recommendations to the Company, other business units, and/or customers as a result of studying company- or customer-owned systems, processes, outages, equipment, vehicles, or facilities to advance business needs and efficiencies. Develops recommendations to improve planning, design, installation, and maintenance processes. Collects and compiles financial data for budget and actual costs of projects. Position may be required to work extended hours, including 24x7 coverage during storms or other energy delivery emergencies.
Note: This is a hybrid position (in-office with remote flexibility). Employees are required to be in the office at least three days per week (Tuesday, Wednesday, and Thursday). This position must sit out of our Philadelphia, PA; Kennett Square, PA; Newark, DE; or Washington, DC office. This position is not eligible for relocation assistance.
Primary Duties:
Assists experienced engineers in performing well-defined engineering assignments in specialized areas requiring engineering expertise, exercising independent discretion (e.g., collect data, perform complex analysis, interpret results, draw conclusions, and clearly present a recommendation to management).
Provides support to engineering and operating groups to analyze specific design, installation, and maintenance activities. Assists in the performance of complex engineering assignments (e.g., analyze and interpret the results of complex power flows, perform complex electrical tests, and analyze non-specific and ambiguous results).
Assists with technical consultation, plan development, and recommendations for customer and company systems and processes (e.g., verify and validate studies, blueprints, or designs against accepted engineering principles and practices).
Evaluates effectiveness of current technical systems and processes. Participates on teams (e.g., design high-voltage transmission and distribution circuits, meeting all engineering standards and criteria).
Job Scope:
The Modeling & Scenario Planning team develops and maintains the system models required to run studies on the transmission system and ensure adherence to reliability criteria. This includes, but is not limited to, running thermal and voltage steady-state power flow studies and maintaining modeling data (connectivity, impedances, ratings, contingency information, etc.). As part of that task, the team analyzes risk on the transmission system both in the short term and in future years, which provides critical information to anticipate as-yet-unidentified issues and prioritize existing risks.
The successful candidate will perform Transmission System Modeling and Power Flow Studies/Analysis of the Exelon transmission system. These efforts will include:
Developing models and performing studies using available tools such as PSS/E and TARA
In coordination with PJM, ensuring that Exelon complies with applicable NERC Standards and PJM Operations/Planning Manuals
Leveraging experience performing power flow analysis and validation of study results
Demonstrating an understanding of OpCo Transmission System(s), including Substation and Protection Design and Technical Standards
Utilizing experience with Transmission modeling characteristics such as Ratings and Impedances to ensure proper results
Providing Innovative Solutions for enhancing Planning Techniques and integrating New Technologies
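As a toy illustration of the power flow studies listed above, the sketch below runs a DC power flow with an N-1 thermal screen in Python/NumPy. It is a hand-rolled approximation, not how production studies are done (those use PSS/E or TARA with full AC models), and every bus and branch value in it is hypothetical.

```python
# Minimal DC power-flow sketch with an N-1 thermal screen.
# Illustrative only: all bus/branch data below are hypothetical.
import numpy as np

# branches: (from_bus, to_bus, reactance_pu, rating_MW)
branches = [(0, 1, 0.1, 150.0), (0, 2, 0.2, 100.0), (1, 2, 0.15, 120.0)]
injections = np.array([180.0, -80.0, -100.0])  # MW, sums to 0; bus 0 is slack

def dc_flows(active):
    """Solve B'theta = P for the active branch set; return MW line flows."""
    n = len(injections)
    B = np.zeros((n, n))
    for f, t, x, _ in active:
        b = 1.0 / x
        B[f, f] += b; B[t, t] += b
        B[f, t] -= b; B[t, f] -= b
    # Fix the slack angle at 0 by solving only the non-slack buses.
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])
    return [(f, t, (theta[f] - theta[t]) / x) for f, t, x, _ in active]

# Base case, then take each branch out of service in turn (N-1).
for out in [None] + list(range(len(branches))):
    active = [br for i, br in enumerate(branches) if i != out]
    label = "base case" if out is None else f"outage of branch {out}"
    for (f, t, mw), (_, _, _, rating) in zip(dc_flows(active), active):
        flag = "  <-- overload" if abs(mw) > rating else ""
        print(f"{label}: {f}->{t} {mw:7.1f} MW (rating {rating:.0f}){flag}")
```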
Minimum Qualifications:
Bachelor of Science degree in Engineering
Ability to analyze and interpret complex electrical and mechanical systems.
Knowledge and ability to apply problem solving approaches and engineering theory.
Knowledge of engineering designs, principles and practices.
Zero to two years of professional engineering experience
Limited knowledge of and experience with the regulations, guides, standards, codes, methods, and practices necessary to perform routine assignments for a specific discipline, installation, or service.
Preferred Qualifications:
Written and oral communication/presentation skills, report generation & technical writing skills
Interpersonal skills & the ability to collaborate with peers
Ability to confer with customers and identify customer needs
High level understanding of power systems and/or transmission system power flow
Benefits:
Annual salary will vary based on a candidate's skills, qualifications, experience, and other factors: $71,200.00/Yr. - $97,900.00/Yr.
Annual Bonus for eligible positions: 7%
401(k) match and annual company contribution
Medical, dental and vision insurance
Life and disability insurance
Generous paid time off options, including vacation, sick time, floating and fixed holidays, maternity leave and bonding/primary caregiver leave or parental leave
Employee Assistance Program and resources for mental and emotional support
Wellbeing programs such as tuition reimbursement, adoption and surrogacy assistance and fitness reimbursement
Referral bonus program
And much more
Note: Exelon-sponsored compensation and benefit programs may vary or not apply based on length of service, job grade, job classification or represented status. Eligibility will be determined by the written plan or program documents.
Senior CNO Developer
Data engineer job in Annapolis, MD
MANTECH seeks a motivated, career and customer-oriented Senior CNO Developer to join our team in Annapolis Junction, Maryland.
We're looking for a Senior Capability Developer to join our elite team. In this role, you'll apply your deep technical expertise to analyze, reverse-engineer, and develop mission-critical capabilities that directly support national security objectives. You will be a key player in a fast-paced environment, tackling unique challenges at the intersection of hardware, software, and embedded systems.
Responsibilities include but are not limited to:
Develop custom software tools and applications using Python, C, and Assembly, focusing on embedded and resource-constrained systems.
Conduct rigorous code reviews to ensure the quality, security, and performance of developed software.
Reverse engineer complex hardware and software systems to understand their inner workings and identify potential vulnerabilities.
Perform in-depth vulnerability research to discover and analyze weaknesses in a variety of targets.
Collaborate with a team of skilled engineers to design and implement innovative solutions to challenging technical problems.
Minimum Qualifications:
Bachelor's degree and 12 years of experience; or, a high school diploma with 16 years of experience; or, an Associate's degree with 14 years of experience. A Master's degree may substitute for 2 years of experience, and a PhD may substitute for 4 years of experience.
Must have 7 years of position-relevant work experience
Proficiency in programming and application development.
Strong scripting skills, particularly in Python, C, and Assembly.
Deep expertise in managing, configuring, and troubleshooting Linux.
Experience in embedded systems.
Experience in reverse engineering and vulnerability research of hardware and software.
Experience in code review.
Preferred Qualifications:
Experience in CNO (Computer Network Operations) Development.
Experience in virtualization.
Knowledge of IoT (Internet of Things) devices.
Experience with Linux Kernel development and sockets.
Knowledge of integrating security tools into the CI/CD (Continuous Integration/Continuous Delivery) pipeline.
Networking skills.
Clearance Requirements:
Must have a current/active Top Secret/SCI clearance.
Physical Requirements:
The person in this position must be able to remain in a stationary position 50% of the time, and occasionally move about inside the office to access file cabinets and office machinery, or to communicate with co-workers, management, and customers via email, phone, and/or virtual communication, which may involve delivering presentations.
Data Scientist
Data engineer job in Columbia, MD
Data Scientist - Transit Data Focus | Columbia, MD (on-site/hybrid) | Contract (6 months)
Data Scientist - Transit Data Focus
Employment type: Contract
Duration: 6 Months
Justification: To manage and analyze customer databases, AVA (automated voice announcement), and schedule data for predictive maintenance and service planning.
Experience Level: 3-5 years
Job Responsibilities:
Collect, process, and analyze transit-related datasets including customer databases, AVA (automated voice announcement) logs, real-time vehicle data, and schedule data.
Develop predictive models and data-driven insights to support maintenance forecasting, service planning, and operational optimization.
Design and implement data pipelines to integrate, clean, and transform large, heterogeneous transit data sources.
Perform statistical analysis and machine learning to identify patterns, trends, and anomalies relevant to transit service performance and reliability.
Collaborate with transit planners, maintenance teams, and IT staff to translate data insights into actionable business strategies.
Monitor data quality and integrity; implement data validation and cleansing processes.
Technical Skills & Qualifications:
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Transportation Engineering, or a related quantitative field.
3-5 years of experience working as a data scientist or data analyst, preferably in a transit, transportation, or public sector environment.
Strong proficiency in Python or R for data analysis, statistical modeling, and machine learning.
Experience with SQL for database querying, manipulation, and data extraction.
Familiarity with transit data standards such as GTFS, AVL/CAD, APC (Automated Passenger Counters), and AVA systems.
Experience with data visualization tools such as Power BI or equivalent.
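As an aside on the GTFS standard named in the qualifications above, the pandas sketch below computes scheduled headways per stop from a feed's stop_times.txt. The feed path is hypothetical; the file and column names come from the public GTFS spec.

```python
# Toy sketch: scheduled headways per stop from a GTFS feed.
import pandas as pd

stop_times = pd.read_csv("gtfs_feed/stop_times.txt")  # hypothetical path
stop_times = stop_times.dropna(subset=["departure_time"])

# GTFS allows times past 24:00:00 for overnight trips, so parse manually.
def to_seconds(hhmmss):
    h, m, s = map(int, hhmmss.split(":"))
    return h * 3600 + m * 60 + s

stop_times["dep_s"] = stop_times["departure_time"].map(to_seconds)

# Headway = gap between consecutive scheduled departures at the same stop.
stop_times = stop_times.sort_values(["stop_id", "dep_s"])
stop_times["headway_min"] = stop_times.groupby("stop_id")["dep_s"].diff() / 60.0

# Flag stops whose median scheduled headway exceeds 30 minutes.
summary = stop_times.groupby("stop_id")["headway_min"].median()
print(summary[summary > 30].sort_values(ascending=False).head())
```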
Engineer
Data engineer job in Washington, DC
Who We Are: We're powering a cleaner, brighter future. Exelon is leading the energy transformation, and we're calling all problem solvers, innovators, community builders and change makers. Work with us to deliver solutions that make our diverse cities and communities stronger, healthier and more resilient.
We're powered by purpose-driven people like you who believe in being inclusive and creative, and value safety, innovation, integrity and community service. We are a Fortune 200 company, 19,000 colleagues strong serving more than 10 million customers at six energy companies -- Atlantic City Electric (ACE), Baltimore Gas and Electric (BGE), Commonwealth Edison (ComEd), Delmarva Power & Light (DPL), PECO Energy Company (PECO), and Potomac Electric Power Company (Pepco).
In our relentless pursuit of excellence, we elevate diverse voices, fresh perspectives and bold thinking. And since we know transforming the future of energy is hard work, we provide competitive compensation, incentives, excellent benefits and the opportunity to build a rewarding career.
Are you in?
Primary Purpose:
Develops studies, plans, criteria, specifications, calculations, evaluations, design documents, performance assessments, integrated systems analysis, cost estimates, and budgets associated with the planning, design, licensing, construction, operation, and maintenance of Exelon's electric generation, transmission, distribution, gas, and telecommunication facilities/systems under the guidance of an experienced engineer. Provides consultation and recommendations to the Company, other business units, and/or customers as a result of studying company- or customer-owned systems, processes, equipment, vehicles, or facilities under an experienced engineer. Reviews financial data from budget and actual costs of projects under the guidance of an experienced engineer. Position may be required to work extended hours for coverage during storms or other energy delivery emergencies.
Note: This is a hybrid position (in-office with remote flexibility). Employees are required to be in the office at least three days per week (Tuesday, Wednesday, and Thursday). This position must sit out of our Philadelphia, PA; Kennett Square, PA; Washington, DC; or Newark, DE office. This position is not eligible for relocation assistance.
Primary Duties:
Performs engineering assignments while exercising independent discretion under the guidance of an experienced engineer (e.g., collect data, perform complex analysis, interpret results, draw conclusions, and clearly present a recommendation to management).
Performs engineering tasks associated with large projects or a number of small projects (e.g., analyze and interpret the results of complex power flows, perform complex engineering tests, and analyze non-specific and ambiguous results).
May direct the engineering tasks associated with a large project or a number of small projects (e.g., verify and validate studies, blueprints, or designs against accepted engineering principles and practices; design high-voltage transmission and distribution circuits, meeting all engineering standards and criteria).
Participates on teams and may lead teams.
Job Scope:
The Modeling & Scenario Planning team develops and maintains the system models required to run studies on the transmission system and ensure adherence to reliability criteria. This includes, but is not limited to, running thermal and voltage steady-state power flow studies and maintaining modeling data (connectivity, impedances, ratings, contingency information, etc.). As part of that task, the team analyzes risk on the transmission system both in the short term and in future years, which provides critical information to anticipate as-yet-unidentified issues and prioritize existing risks.
The successful candidate will perform Transmission System Modeling and Power Flow Studies/Analysis of the Exelon transmission system. These efforts will include:
Developing models and performing studies using available tools such as PSS/E and TARA
In coordination with PJM, ensuring that Exelon complies with applicable NERC Standards and PJM Operations/Planning Manuals
Leveraging experience performing power flow analysis and validation of study results
Demonstrating an understanding of OpCo Transmission System(s), including Substation and Protection Design and Technical Standards
Utilizing experience with Transmission modeling characteristics such as Ratings and Impedances to ensure proper results
Providing Innovative Solutions for enhancing Planning Techniques and integrating New Technologies
Minimum Qualifications:
Bachelor of Science degree in Engineering
2 - 4 years of professional engineering experience
Ability to analyze and interpret complex electrical and mechanical systems.
Knowledge and ability to apply problem solving approaches and engineering theory.
Knowledge of engineering designs, principles and practices.
General knowledge of and experience with regulations, guides, standards, codes, methods, and practices necessary to perform assignments for a specific discipline, various installations, or services.
Preferred Qualifications:
Engineer in Training License
Strong written and oral communication/presentation skills, report generation & technical writing skills
Interpersonal skills & the ability to collaborate with peers and managers
Time, project management and multi-tasking skills
Ability to analyze industry wide trends and implement enhancements
A working knowledge of analysis software packages such as CYMDIST, PSS/E, TARA, Python, PSCAD, MATLAB, etc., to perform and analyze load flow modeling, contingency studies, and transfer analysis
Benefits:
Annual salary will vary based on a candidate's skills, qualifications, experience, and other factors: $83,200.00/Yr. - $114,400.00/Yr.
Annual Bonus for eligible positions: 10%
401(k) match and annual company contribution
Medical, dental and vision insurance
Life and disability insurance
Generous paid time off options, including vacation, sick time, floating and fixed holidays, maternity leave and bonding/primary caregiver leave or parental leave
Employee Assistance Program and resources for mental and emotional support
Wellbeing programs such as tuition reimbursement, adoption and surrogacy assistance and fitness reimbursement
Referral bonus program
And much more
Note: Exelon-sponsored compensation and benefit programs may vary or not apply based on length of service, job grade, job classification or represented status. Eligibility will be determined by the written plan or program documents.
Azure Data Modeler
Data engineer job in Washington, DC
Azure Data Modeler - Budget Transformation Project
Our client is embarking on a major budget transformation initiative and is seeking an experienced Azure Data Modeler to support data architecture, modeling, and migration activities. This role will play a critical part in designing and optimizing data structures as the organization transitions to SAP. Experience with SAP is preferred, but strong ERP data experience in any platform is also valuable.
Responsibilities
Design, develop, and optimize data models within the Microsoft Azure environment.
Support data architecture needs across the budget transformation program.
Partner with cross-functional stakeholders to enable the transition to SAP (or other ERP systems).
Participate in data migration planning, execution, and validation efforts.
Work collaboratively within SAFe Agile teams and support sprint activities.
Provide off-hours support as needed for critical tasks and migration windows.
Engage onsite in Washington, DC up to three days per week.
Required Qualifications
Strong hands-on expertise in data architecture and data model design.
Proven experience working with Microsoft Azure (core requirement).
Ability to work flexibly, including occasional off-hours support.
Ability to be onsite in Washington, DC as needed (up to 3 days/week).
Preferred Qualifications
Experience with SAP ECC or exposure to SAP implementations.
Experience with other major ERP systems (Oracle, Workday, etc.).
SAFe Agile certification.
Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support.
Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Data Engineer
Data engineer job in McLean, VA
Immediate need for a talented Data Engineer. This is a 12-month contract opportunity with long-term potential, located in McLean, VA (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-93504
Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services.
Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning.
Develop backend and automation tools using Golang and/or Python as needed.
Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
Perform root-cause analysis and implement automation to prevent recurring issues.
Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
Ensure compliance with enterprise governance, data quality, and cloud security standards.
Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.
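For illustration only, a minimal PySpark sketch of the pipeline shape described above: read raw JSON from S3, keep the latest record per key with a window function, and write curated, partitioned output. The bucket paths and column names are hypothetical, not the client's.

```python
# Sketch: raw S3 JSON -> windowed dedupe -> partitioned Parquet.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://raw-bucket/orders/")  # hypothetical path

# Keep the latest record per order_id (same idea as a SQL window function).
w = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
latest = (
    raw.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

(latest.write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://curated-bucket/orders/"))  # hypothetical path
```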
Key Requirements and Technology Experience:
Skills - Data Engineer: Python, Spark/PySpark, Java, Golang, AWS (Glue, EC2, Lambda); able to write complex SQL queries against Snowflake tables and troubleshoot issues.
Proficiency in Python with experience building scalable data pipelines or ETL processes.
Strong hands-on experience with Spark/PySpark for distributed data processing.
Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning.
Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
Experience with Golang for scripting, backend services, or performance-critical processes.
Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
Familiarity with CI/CD workflows, Git, and automated testing.
Our client is a leading Banking and Financial Industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Architect
Data engineer job in Arlington, VA
• Functions as the primary technical architect for data warehousing projects to solve business intelligence challenges
• Possesses deep technical expertise in database design, ETL (OWB/ODI), reporting, and analytics
• Previous consulting experience utilizing an agile delivery methodology
Position Requirements
• Must have expertise as both a solutions architect and an AI architect.
• 3+ years of experience with Azure ETL processing
• 3+ years of experience utilizing data warehousing methodologies and processes
• Strong conceptual, analytical, and decision-making skills
• Knowledge of and experience with dimensional modeling
• Strong knowledge of Azure Databricks
• Proficiency in creating PL/SQL packages
• Full SDLC and Data Modeling experience
• Ability to create both logical and physical data models
• Ability to tune databases for maximum performance
• Experience in Data Preparation: Data Profiling, Data Cleansing, and Data Auditing
• Ability to work with Business Analysts to create functional specifications and data
• Manages QA functions
• Develops unit, system, and integration test plans and manages execution
• Ability to write technical and end-user system documentation
• Excellent written and oral communication skills
• Experience transforming logical business requirements into appropriate schemas and models
• Ability to analyze and evaluate moderate to highly complex information systems by being able to interpret such devices as Entity Relation Diagrams, data dictionaries, record layouts, and logic flow diagrams
Lead Data Engineer
Data engineer job in Reston, VA
Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus we will accomplish great things together!
Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.
Are you a data enthusiast with a natural ability for analytics? We're looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
What We Offer:
A dynamic environment where your skills will make a direct impact. The opportunity to work with cutting-edge technologies and innovative projects. A collaborative team that values your passion and focus.
We are looking for Engineers who can
Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks.
Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views.
Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
Implement data synchronization processes to ensure consistency and accuracy of data across different systems.
Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
Work on Snowflake modeling - roles, databases, schemas, ETL tools with cloud-driven skills
Work on SQL performance measuring, query tuning, and database tuning
Handle SQL language and cloud-based technologies
Set up the RBAC model at the infra and data level.
Work on data masking/encryption/tokenization and data wrangling/data pipeline orchestration (tasks).
Set up AWS S3/EC2; configure external stages and SQS/SNS.
Perform data integration, e.g., MSK Kafka Connect and other partners like Delta Lake (Databricks).
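For readers unfamiliar with the Snowflake features named above, the sketch below wires up a Stream and a Task through the snowflake-connector-python package. It is a hypothetical example: all object names, the warehouse, and the credentials are placeholders.

```python
# Sketch of Snowflake Streams + Tasks created via snowflake-connector-python.
import snowflake.connector

STATEMENTS = [
    # A stream records row-level changes on the raw table (CDC).
    "CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders",
    # A task periodically drains the stream into the curated table.
    """
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = COMPUTE_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, status, updated_at
      FROM raw_orders_stream
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders_task RESUME",
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholders
    database="ANALYTICS", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```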
Key Skills & Qualifications:
ETL - Experience with ETL processes for data integration.
SQL - Strong SQL skills for querying and data manipulation
Python - Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations
Unix - Competent in Unix for file operations, searches, and regular expressions
AWS - Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
Database Modeling - Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms.
Snowflake - Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures.
Airflow - Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines
Bachelor's degree in Computer Science, or a related field is preferred. Relevant work experience may be considered in lieu of a degree.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team.
We work closely with
Data Wrangling
ETL
Talend
Jasper
Java
Python
Unix
AWS
Data Warehousing
Data Modeling
Database Migration
RBAC model
Data migration
Our Process
Schedule a 15 min Video Call with someone from our Team
4 Proctored GQ Tests (< 2 hours)
30-45 min Final Video Interview
Receive Job Offer
If you are interested in reaching out to us, please apply and our team will contact you within the hour.
Senior Data Engineer
Data engineer job in Washington, DC
Job Title: Senior Data Engineer
Only USC/GC candidates will be considered
Contract
The Senior Data Engineer will join a newly established Data Services team within a mission-driven nonprofit organization that delivers critical services to government customers. As the organization expands its use of data to drive decision-making across multiple business units, the Data Services team is building a modern data lakehouse platform to provide clean, timely, and accurate data. This role will play a foundational part in designing and operationalizing enterprise data pipelines and infrastructure that empower stakeholders with reliable insights.
Summary
The Senior Data Engineer supports the development and maintenance of a system-wide analytics platform that enables secure, scalable access to enterprise data. This role owns end-to-end engineering efforts across ingestion, transformation, orchestration, and delivery using Azure and Microsoft Fabric technologies. The engineer will develop and optimize ETL / data pipelines, implement medallion architecture patterns, and ensure enterprise data assets are structured, integrated, and governed to meet the needs of diverse business units and external parties.
Key Responsibilities
Data Pipeline Design & Development
• Design, develop, and implement end-to-end data ingestion and processing pipelines using Azure and Microsoft Fabric tools.
• Transform raw bronze data into silver (cleaned) and gold (curated, analytics-ready) layers following the Medallion architecture.
• Develop code and tooling to process and transform data into enterprise data models.
• Implement existing ETL frameworks, patterns, and standards used across the enterprise.
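As a hedged sketch of the bronze-to-silver step described above, the PySpark/Delta snippet below shows one plausible shape for such a transformation. Table and column names are hypothetical; this is not the organization's actual code.

```python
# Illustrative bronze -> silver step in the medallion pattern,
# using PySpark with Delta tables (as in Microsoft Fabric or Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.customer_events")  # raw, as-ingested

silver = (
    bronze
    .dropDuplicates(["event_id"])                        # de-dupe
    .filter(F.col("customer_id").isNotNull())            # basic validity
    .withColumn("event_ts", F.to_timestamp("event_ts"))  # typed columns
    .withColumn("_ingested_at", F.current_timestamp())   # lineage column
)

# Silver is cleaned and conformed; gold would aggregate for analytics.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.customer_events")
```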
Orchestration, Automation & Operations
• Schedule, orchestrate, and monitor automated and semi-automated data pipelines to ensure reliability and quality.
• Build automated workflows supporting ingestion and transformation observability.
• Ensure technical correctness, timeliness, and high-quality delivery of data services.
Data Modeling, Integration & Governance
• Serve as a subject matter expert on enterprise data sources, structures, and definitions.
• Build and maintain data relationships, mappings, and linkages within the Enterprise Data Warehouse (EDW).
• Perform integration of data assets to support analytics and operational needs across multiple mission-driven departments.
• Create and manage large-scale data warehouses and lakehouse components to ensure efficient data access and retrieval.
Collaboration & Communication
• Partner with analysts, business units, and data consumers to support exploration and decision-making.
• Communicate clearly and effectively with technical and non-technical stakeholders, including senior leadership and customers.
• Champion continuous improvement, accountability, and evidence-based decision-making within the Data Services team.
Qualifications
• Minimum five years of experience working with cloud data platforms such as Azure, Snowflake, AWS Redshift, or Databricks.
• Minimum six years of experience in SQL-based data processing.
• Minimum five years of application development experience using Python.
• At least two years of experience developing ETL pipelines within Microsoft Fabric.
• Strong working knowledge of data warehousing and data lake concepts, including medallion or similar architectural patterns.
• Demonstrated ability to deliver high-quality work on schedule and uphold team accountability standards.
• Proven track record of clear and continuous communication across technical and business audiences.
• Commitment to process improvement and leveraging information to enhance organizational performance.
Technical Skills
• Python: Intermediate to advanced proficiency for data processing, automation, and pipeline development.
• SQL: Intermediate to advanced proficiency for transformations, modeling, and performance optimization.
• Experience with Azure Data Factory, Microsoft Fabric Data Engineering, Delta Lake, or similar technologies.
• Familiarity with orchestration frameworks, metadata management, and modern ETL/ELT patterns.
Senior Data Engineer - Data Intelligence
Data engineer job in Baltimore, MD
Hybrid Role
Job Title: Data Intelligence - Engineer, Data Sr
Project Type: Contract
Duration of the project: 12 Months
We are looking for candidates with 5+ years' experience in Ab Initio administration. (Internal note: please do not send developers.)
Must have 2+ years' experience with AWS, working with Ab Initio in the AWS Cloud.
Must have solid experience building, installing and configuring Ab Initio.
Must have AWS EKS containerization experience; the role will be involved in moving Linux instances to AWS EKS.
Ab Initio Lead Infrastructure
This is an Ab Initio Administrator position, not a developer position. The Senior Ab Initio ETL Administrator is responsible for the tasks involved in administering the ETL tool (Ab Initio) as well as migrating Ab Initio infrastructure to the cloud. The candidate will support the implementation of a data integration/data warehouse for the data products on-prem and in the AWS Cloud, including EKS containerization for Ab Initio.
6-8 years' experience
At least 6 years of experience with all tasks involved in administration of the ETL tool (Ab Initio)
Experience managing migration or infrastructure-build projects without supervision
At least 6 years of experience and advanced knowledge of Ab Initio Graphical Development Environment (GDE), Metadata Hub, and Operational Console
Experience with Ab Initio, AWS EKS, S3, DynamoDB, MongoDB, PostgreSQL, RDS, DB2
Experience creating big data (ETL) pipelines from on-premises to data factories, data lakes, and cloud storage such as EBS or S3
DevOps (CI/CD pipeline) experience; Jenkins preferred
Advanced knowledge of UNIX and SQL
Experience managing Metadata Hub (MDH) and Operational Console, and troubleshooting environmental issues that affect these components
Experience with scripting and automation, such as designing and developing automated ETL processes and architecture, and unit testing of ETL code
Experience working on break-fix and continuous development items, including review and inspection of production changes
Perform code reviews of ETL code developed by the development team and provide guidance to resolve issues.
Service-Oriented Architecture (SOA) knowledge, and demonstrated knowledge of best practices for testing environments and processes
Demonstrated experience working in an Enterprise environment with cross-team interaction, collaboration, and policies
Strong testing skills
Excellent problem-solving skills
Strong analytical skills
Excellent verbal and written communications skills
Familiar with structured programming techniques
Must be able to perform assigned tasks with minimum supervision
Strong documentation skills
Experience working in an Agile environment is a plus
Software:
Applies and implements best practices for data auditing, scalability, reliability, and application performance.
AWS certification is a plus
Extensive UNIX/AIX or Linux and scripting experience
Extensive SDLC experience with some development or Systems programming experience
Ability to analyze and troubleshoot mid-tier/infrastructure issues.
Very strong verbal and written communication skills (Critical)
Ability to facilitate technical requirements gathering and design sessions
Collaborate and interpret business and technical needs
Excellent attention to detail and quality work products (Critical)
Strong customer service skills with internal and external customers (Critical)
Must be able to perform assigned tasks with minimum supervision (Critical)
Strong analytical and documentation skills
Excellent time management ability. (Critical)
Skills Preferred
Experience with DevOps or IaaS
AIX or Linux
LDAP
EIAM (Identity Access Management)
Ab Initio Admin and Architect
Junior Data Scientist (TS/SCI)
Data engineer job in Springfield, VA
We are seeking a junior-level Data Science professional with a strong academic foundation and early hands-on experience to join our team as an Exploitation Specialist. The ideal candidate will hold a bachelor's degree in a data science-related field and bring internship or project experience that demonstrates curiosity, initiative, and a willingness to learn from senior team members. This role is a great opportunity for someone eager to grow their technical skill set while supporting a high-impact mission.
Required Qualifications
Active TS/SCI clearance with the willingness to obtain a CI polygraph
Ability to work onsite in Northern Virginia, 40 hours per week (telework options are extremely limited)
Proficiency with Python and SQL
Preferred Qualifications
Familiarity with GEOINT collection and related NGA/NRO systems
Experience with additional programming languages such as R, JavaScript, HTML, and CSS
Understanding of object-oriented programming
Experience using visualization tools such as Grafana, Tableau, or Kibana
Ability to quickly learn new technologies, adapt to evolving mission requirements, and support the development/testing of new analytic methodologies
Cloud Data Engineer- Databricks
Data engineer job in McLean, VA
Purpose:
We are seeking a highly skilled Cloud Data Engineer with deep expertise in Databricks and modern cloud platforms such as AWS, Azure, or GCP. This role is ideal for professionals who are passionate about building next-generation data platforms, optimizing complex data workflows, and enabling advanced analytics and AI in cloud-native environments. You'll have the opportunity to work with Fortune-500 organizations in data and analytics, helping them unlock the full potential of their data through innovative, scalable solutions.
Key Result Areas and Activities:
Design and implement robust, scalable data engineering solutions.
Build and optimize data pipelines using Databricks, including serverless capabilities, Unity Catalog, and Mosaic AI.
Collaborate with analytics and AI teams to enable real-time and batch data workflows.
Support and improve cloud-native data platforms (AWS, Azure, GCP).
Ensure adherence to best practices in data modeling, warehousing, and governance.
Contribute to automation of data workflows using CI/CD, DevOps, or DataOps practices.
Implement and maintain workflow orchestration tools like Apache Airflow and dbt.
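To make the orchestration item above concrete, here is a minimal Airflow DAG sketch using the TaskFlow API. The DAG id, schedule, and task bodies are invented for illustration only.

```python
# Minimal Airflow DAG sketch (TaskFlow API); all names are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def nightly_ingest():

    @task
    def extract() -> list[dict]:
        # In a real pipeline this would pull from an API or object store.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [{**r, "value": r["value"] * 2} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")  # stand-in for a warehouse write

    load(transform(extract()))

nightly_ingest()
```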
Roles & Responsibilities
Essential Skills
4+ years of experience in data engineering with a focus on scalable solutions.
Strong hands-on experience with Databricks in a cloud environment.
Proficiency in Spark and Python for data processing.
Solid understanding of data modeling, data warehousing, and architecture principles.
Experience working with at least one major cloud provider (AWS, Azure, or GCP).
Familiarity with CI/CD pipelines and data workflow automation.
Desirable Skills
Direct experience with Unity Catalog and Mosaic AI within Databricks.
Working knowledge of DevOps/DataOps principles in a data engineering context.
Exposure to Apache Airflow, dbt, and modern data orchestration frameworks.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Relevant certifications in cloud platforms (AWS/Azure/GCP) or Databricks are a plus.
Qualities:
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work seamlessly with clients across multiple geographies
Research focused mindset
Excellent analytical, presentation, reporting, documentation and interactive skills
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
Data Architect
Data engineer job in Washington, DC
Job Title: Developer Premium I
Duration: 7 Months with long term extension
Hybrid Onsite: 4 days per week from Day 1, with a full transition to 100% onsite anticipated soon
Job Requirement:
Strong expertise in data architecture & data model design.
MS Azure (core requirement)
Experience with SAP ECC preferred
SAFe Agile certification is a plus
Ability to work flexibly, including off-hours, to support critical IT tasks & migration activities.
Educational Qualifications and Experience:
Bachelor's degree in Computer Science, Information Systems or in a related area of expertise.
Required number of years of proven experience in the specific technology/toolset as per Experience Matrix below for each Level.
Essential Job Functions:
Take functional specs and produce high quality technical specs
Take technical specs and produce completed and well tested programs which meet user satisfaction and acceptance, and precisely reflect the requirements - business logic, performance, and usability requirements
Conduct/attend requirements definition meetings with end-users and document system/business requirements
Conduct Peer Review on Code and Test Cases, prepared by other team members, to assess quality and compliance with coding standards
As required for the role, perform end-user demos of proposed solution and finished product, provide end user training and provide support for user acceptance testing
As required for the role, troubleshoot production support issues and find appropriate solutions within defined SLA to ensure minimal disruption to business operations
Ensure that Bank policies, procedures, and standards are factored into project design and development
As required for the role, install new release, and participate in upgrade activities
As required for the role, perform integration between systems that are on-prem, on the cloud, and with third-party vendors
As required for the role, collaborate with different teams within the organization for infrastructure, integration, database administration support
Adhere to project schedules and report progress regularly
Prepare weekly status reports and participate in status meetings and highlight issues and constraints that would impact timely delivery of work program items
Find the appropriate tools to implement the project
Maintain knowledge of current industry standards and practices
As needed, interact and collaborate with Enterprise Architects (EA), Office of Information Security (OIS) to obtain approvals and accreditations
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
Lead Principal Data Solutions Architect
Data engineer job in Reston, VA
*****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZENS*****
*****TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA*****
Formed in 2011, Inadev is focused on its founding principle to build innovative customer-centric solutions incredibly fast, secure, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations are comprised of codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation.
POSITION DESCRIPTION:
Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be on natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
PROGRAM DESCRIPTION:
This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards.
RESPONSIBILITIES:
Leading system architecture decisions, ensuring technical alignment across teams, and advocating for best practices in cloud and data engineering.
Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation
Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards.
Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
NON-TECHNICAL REQUIREMENTS:
Must be a U.S. Citizen.
Must be willing to work a HYBRID schedule (2-3 days) in Reston, VA & client locations in the Northern Virginia/DC/MD area as required.
Ability to pass a 7-year background check and obtain/maintain a U.S. Government Clearance
Strong communication and presentation skills.
Must be able to prioritize and self-start.
Must be adaptable/flexible as priorities shift.
Must be enthusiastic and have passion for learning and constant improvement.
Must be open to collaboration, feedback and client asks.
Must enjoy working with a vibrant team of outgoing personalities.
MANDATORY REQUIREMENTS/SKILLS:
Bachelor of Science degree in Computer Science, Engineering or related subject and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture.
Experience within the Federal Government, specifically DHS is preferred.
Must possess demonstrable experience with Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), Unity Catalog governance framework, and enterprise-level integration patterns using Databricks workflows and Auto Loader.
Knowledge of and ability to organize technical execution of Agile Analytics using Databricks Repos, Jobs, and collaborative notebooks, proven by professional experience.
Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping).
Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management.
Certification in one or more: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications.
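For context on Delta Live Tables (DLT) as referenced above, a minimal illustrative pipeline might look like the sketch below. It runs only inside a Databricks DLT pipeline (where `spark` is provided), and the source path, table names, and data-quality expectation are hypothetical.

```python
# Hedged sketch of a Delta Live Tables (DLT) declarative pipeline.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw case records as ingested via Auto Loader.")
def bronze_cases():
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader
             .option("cloudFiles.format", "json")
             .load("/mnt/landing/cases/")              # hypothetical path
    )

@dlt.table(comment="Silver: validated, typed case records.")
@dlt.expect_or_drop("valid_case_id", "case_id IS NOT NULL")  # data quality rule
def silver_cases():
    return (
        dlt.read_stream("bronze_cases")
           .withColumn("received_ts", F.to_timestamp("received_ts"))
    )
```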
DESIRED REQUIREMENTS/SKILLS:
Expertise in ETL tools.
Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus).
Proficiency in SQL, PL/SQL, and performance tuning for large datasets.
Understanding of security frameworks and compliance standards in federal environments.
PHYSICAL DEMANDS:
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions
Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.
SharePoint Engineer
Data engineer job in Washington, DC
BlueWater Federal is looking for a SharePoint Engineer to support the Department of Energy in Washington, DC.
As the SharePoint Engineer, you will be responsible for designing, implementing, and maintaining SharePoint environments and solutions. This includes configuring sites, libraries, workflows, and web parts, ensuring system security, and supporting business processes through automation and integration.
Responsibilities
• Install, configure, and maintain SharePoint servers (on-premises and/or SharePoint Online).
• Monitor system performance, troubleshoot issues, and apply patches or updates.
• Manage permissions, security settings, and compliance requirements.
• Design and deploy SharePoint solutions, including custom workflows, forms, and web parts.
• Migrate data and content from legacy systems to SharePoint using scripts or third-party tools.
• Customize SharePoint sites to meet organizational needs.
• Collaborate with IT teams and IA.
• Provide technical support to end-users and site owners and create documentation
• Ensure adherence to security standards and organizational policies.
• Maintain knowledge of SharePoint best practices and emerging technologies.
Qualifications
• Bachelor's degree
• 10+ years of experience with SharePoint administration with a deep understanding of SharePoint Architecture, features and best practices.
• Must have an active Top Secret clearance with the ability to obtain a Q and SCI clearance
• Proficiency in PowerShell scripting for automation.
• Experience with migrating SharePoint versions on-premises or online (preferably using ShareGate)
• Experience with SharePoint components (Search, Taxonomy, Managed Metadata).
• Patching SharePoint server to meet organization security standards.
• Experience with HTML, CSS, JavaScript, REST API, and SQL is preferred
BlueWater Federal Solutions is proud to be an Equal Opportunity Employer. All qualified candidates will be considered without regard to race, color, religion, national origin, age, disability, sexual orientation, gender identity, status as a protected veteran, or any other characteristic protected by law. BlueWater Federal Solutions is a VEVRAA federal contractor and we request priority referral of veterans.
Senior Software Engineer
Data engineer job in Springfield, VA
Job Title: Senior Software Engineer
Security Clearance: Active TS/SCI (or SCI eligibility)
Omni Federal is a mid-size business focused on modern application development, cloud and data analytics for the Federal government. Our past performance is a mix of commercial and federal business that allows us to leverage the latest commercial technologies and processes and adapt them to the Federal government. Omni Federal designs, builds and operates data-rich applications leveraging advanced data modeling, machine learning and data visualization techniques to empower our customers to make better data-driven decisions.
We are seeking a strong Software Engineer to support an NGA project in Springfield, VA. This is an exciting Modernization initiative where the NGA is embracing modern software development practices and using them to solve challenging missions & provide various capabilities for the NGA. This includes a modern technology stack, rapid prototyping in support of intelligence analysis products and capabilities, and culture of innovation. Candidates must be passionate, energized and excited to work on modern architectures and solve challenging problems for our clients.
Required Skills:
BS or equivalent in Computer Science, Engineering, Mathematics, Information Systems or equivalent technical degree.
10+ years of experience in software engineering/development, or a related area that demonstrates the ability to successfully perform the duties associated with this work.
Experience in Java or Python enterprise application development
Experience building high performance applications in React.js
Web services architecture, design, and development
Experience in PostgreSQL database design
Experience working in AWS and utilizing specific AWS tooling (S3)
Senior Developer
Data engineer job in McLean, VA
The candidate must have experience with both Java and Python, in addition to PowerShell and AI/ML tools.
Must-Have Qualifications: Python and Java development, a strong understanding of design, and the ability to explain concepts to and educate developers.
Key Responsibilities:
Design and implement developer workspaces using physical, virtualized, or browser-based solutions.
Develop tools primarily in Python and Java to enhance developer workflows.
Advocate for and implement CI/CD improvements through new tooling and commonly available libraries.
Create patterns to manage desktop provisioning and software package management using SCCM, VDI, or similar technologies.
Lead initiatives to integrate Generative AI capabilities into Developer workflows, enhancing the value proposition for customers.
Partner with end-user collaboration suites to create seamless developer experiences.
Ensure all solutions meet audit, risk, and governance requirements.
Evangelize best practices and solutions within the developer community.
Java Software Engineer
Data engineer job in Fort Meade, MD
Clearance Required: Top Secret/SCI with Polygraph
Seeking an experienced Software Engineer who will support mission-critical systems for a Federal customer at Fort Meade, Maryland. This entry-level role focuses on developing and maintaining Java-based applications in a secure environment. The engineer will work closely with senior developers and system engineers to deliver reliable and scalable software solutions.
Key Responsibilities
· Design, develop, test, and maintain Java applications.
· Participate in software development lifecycle activities including requirements analysis and design.
· Write clean, efficient, and well-documented code.
· Collaborate with cross-functional teams to troubleshoot and resolve issues.
· Support integration and deployment of software in secure environments.
Qualifications
Education: Bachelor's degree in Computer Science, Software Engineering, or related field.
Experience: 4+ years of experience in software development, preferably in Java.
Preference: Java certification or relevant coursework preferred.
Skills:
· Proficiency in Java programming language.
· Understanding of object-oriented design and development.
· Familiarity with development tools such as Eclipse or IntelliJ.
· Basic knowledge of secure coding practices and federal IT standards.
Software Integrator
Data engineer job in Manassas, VA
Software Integrator - 100% On Site in Manassas, VA
Client is seeking to hire a Software Integrator to support the Acoustics Rapid COTS Insertion (ARCI) program.
Education:
Bachelor's degree in Computer/Electrical Engineering or Computer Science degree from an accredited university.
2+ years of experience.
Job Responsibilities:
Participate in software development lifecycle including software design, development, integration, test, and support for new and existing software products.
Designing, implementing, testing and debugging complex software applications
Support continuous integration/continuous delivery (CI/CD) agile-like development
Basic Qualifications:
Bachelor's degree in Computer/Electrical Engineering or Computer Science degree from an accredited university or equivalent related experience.
Experience with Linux Operating Systems
2+ years of related C, C++, and/or JAVA experience
Experience with inter-process communications and real time systems
Experience with configuration management software (e.g., Subversion and/or Git)
Senior Software Engineer -- KUMDC5680656
Data engineer job in McLean, VA
Required Technical Skills
(Required)
Strong design and development skills in two or more of the following technologies and tools: Java (3-5 years), Cucumber (3-5 years), JBehave or other BDD testing frameworks
At least 8 years of test automation framework design
Strong experience in testing web services (REST APIs) (3-5 years)
Proven experience developing test scripts, test cases, and test data
The ability to write queries in SQL or other relational databases
3+ years of experience in developing scenario based performance testing using JMeter
Experience testing full stack and integration testing with 3rd parties
End-to-end system integration testing experience for software platforms
(Desired)
Hands-on experience with Python
Development experience in AWS Cloud technology
Experience in TDD, continuous integration, code review practice is strongly desired
Experience with Apigee or other API gateways is a plus
Experience with DevOps concepts and tools (e.g., CI/CD, Jenkins, Git)
At least 2 years working on an Agile team with a solid understanding of Agile/Lean practices.
Understanding of a micro service Architecture
Experience with load and performance testing
Strong documentation skills