
Data engineer jobs in Fairland, MD

- 6,773 jobs
Filter by title: All, Data Engineer, Data Scientist, Data Architect, Requirements Engineer, Software Systems Engineer, Senior Software Engineer, Data Modeler, Lead Data Architect, Lead Data Technician
  • Cyber Software Engineer - SIGINT Systems

Lockheed Martin (4.8 company rating)

    Data engineer job in Severn, MD

    What We're Doing: Lockheed Martin, Cyber & Intelligence invites you to step up to one of today's most daunting challenges: the use of advanced electronics to undermine our way of life. As a cyber security professional at Lockheed Martin, you'll protect the networks that our citizens and the world depend upon each minute: financial assets, healthcare information, critical infrastructure, hazardous materials, and the uninterrupted flow of energy that keeps modern life moving. Here, you'll work with cybersecurity experts on the forefront of threat protection and proactive prevention. In this fast-paced, real-world environment, you'll draw on all your education and experience, as well as the resources of Lockheed Martin, to keep the threats at bay.

    Who We Are: Are you driven by the excitement of harnessing the latest advancements in artificial intelligence, machine learning, and data analytics to revolutionize the way we approach complex challenges? Do you find satisfaction in developing innovative solutions that leverage the power of technology to stay ahead of the curve? If so, join Lockheed Martin's team, where we're pioneering the modernization of technology and pushing the boundaries of what's possible. Our team is dedicated to pioneering the latest advancements, and we're looking for someone who shares our commitment to excellence and innovation.

    Who You Are: Here, you will provide life-cycle services to advance the mission in support of Cybersecurity and SIGINT midpoint collection.
    • You will modernize and sustain capabilities, including providing new features and enhancements to Front End Solutions systems. This will allow products to work at scale, processing target communications across multiple types of midpoint accesses.
    • You will propel the customer into the next phase of product suite modernization by leveraging advancements in technologies such as containerization, cloud capabilities, dataflows, and Artificial Intelligence/Machine Learning (AI/ML) capabilities.

    The Work: We are proactively recruiting for a future need, with the expectation of a 2025 or early 2026 start date. This requisition is being used as a placeholder for your interest in one of these roles, each of which calls for bringing a modernization mindset to current technology. As the roles open up over time, we will invite you to review the best-matching one(s). We are seeking professionals interested in a variety of responsibilities, including:
    • Software development
    • Software engineering
    • Systems engineering
    • Cyber research (Capabilities Analyst, etc.)
    • System security (ISSO/ISSE/ISSM)
    • Technical writing
    • Test engineering
    • Network engineering
    • User experience design
    • Data science
    • Systems administration (Linux)
    For a complete list that changes weekly, search the Lockheed Martin jobs website for #RMSRB2025. Come join a company whose breadth and depth of programs and technologies will never leave you bored or looking for your next assignment, and stop having to look for a new job with every rumor and whim of contract changes.

    Why Join Us: Your Health, Your Wealth, Your Life. Our flexible schedules, competitive pay, and comprehensive benefits enable you to live a healthy, fulfilling life at and outside of work. Learn more about Lockheed Martin's competitive and comprehensive benefits package. We support our employees, so they can support our mission. Plus, you may be eligible for up to a $25K sign-on bonus as an external hire!

    #RMSC6ISR #OneLMHotJobs #RMSRB2025

    Basic Qualifications:
    • Current DoD Top Secret SCI with Polygraph
    Seeking multiple levels of Software Engineers:
    • SWE0: A High School Diploma or GED plus four (4) years of general software engineering experience, OR a Bachelor's degree in Computer Science or a related discipline from an accredited college or university.
    • SWE1: A High School Diploma or GED plus eleven (11) years of general software engineering experience, OR a Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus seven (7) years of software engineering experience.
    • SWE2: A High School Diploma or GED plus eighteen (18) years of general software engineering experience, OR a Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus fourteen (14) years of software engineering experience.
    • SWE3: A High School Diploma or GED plus twenty-four (24) years of general software engineering experience, OR a Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus twenty (20) years of software engineering experience.

    Required Skills and Experience:
    • Analyze user requirements to derive software design and performance requirements
    • Debug existing software and correct defects
    • Provide recommendations for improving documentation and software development process standards
    • Design and code new software or modify existing software to add new features
    • Integrate existing software into new or modified systems or operating environments
    • Develop simple data queries for existing or proposed databases or data repositories
    • Write or review software and system documentation

    Desired Skills:
    • Familiarity with RF algorithm development

    Standard Job Description: Designs, develops, documents, tests, and maintains full-spectrum cyber solutions. Develops and automates secure systems to support cyber offensive, defensive, and full-spectrum cyber operations. Conducts vulnerability research, reverse engineering, and penetration testing (red/blue teams); develops and integrates low-level firmware; and/or develops specialized cyber software solutions and tools based on mission requirements. Applies system knowledge of subject matter (hardware, software, networks, cloud, etc.) to conduct research to evaluate potential vulnerabilities and develop new capabilities to exploit and/or mitigate vulnerabilities. *USE OF THIS CLASSIFICATION REQUIRES AUTHORIZATION FROM THE BUSINESS AREA CYBER DIRECTOR OR DELEGATE*

    Typical Minimums: Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 14 years or more of professional experience; or 12 years of professional experience with a related Master's degree. Considered an expert/authority in the discipline.

    Clearance Level: TS/SCI w/Poly SP

    Other Important Information You Should Know
    Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.
    Ability to Work Remotely: Onsite Full-time. The work associated with this position will be performed onsite at a designated Lockheed Martin facility.
    Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from a standard 40-hour, five-day work week to condensed schedules that provide employees with additional time away from the office, in addition to our Paid Time Off benefits.
    Schedule for this Position: 9x80, every other Friday off.

    Pay Rate: The annual base salary range for this position in California, Massachusetts, and New York (excluding most major metropolitan areas), Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, Vermont, Washington, or Washington DC is $150,800 - $265,880. For states not referenced above, the salary range for this position will reflect the candidate's final work location. Please note that the salary information is a general guideline only. Lockheed Martin considers factors such as (but not limited to) the scope and responsibilities of the position, the candidate's work experience, education/training, and key skills, as well as market and business considerations, when extending an offer.

    Benefits offered: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k) match, Flexible Spending Accounts, EAP, Education Assistance, Parental Leave, Paid Time Off, and Holidays.
    (Washington state applicants only) Non-represented full-time employees accrue at least 10 hours per month of Paid Time Off (PTO) to be used for incidental absences and other reasons, and receive at least 90 hours for holidays. Represented full-time employees accrue 6.67 hours of vacation per month, accrue up to 52 hours of sick leave annually, and receive at least 96 hours for holidays. PTO, vacation, sick leave, and holiday hours are prorated based on start date during the calendar year. This position is incentive plan eligible.

    Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics. The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.

    Join us at Lockheed Martin, where your mission is ours. Our customers tackle the hardest missions: those that demand extraordinary amounts of courage, resilience, and precision. They're dangerous. Critical. Sometimes they even provide an opportunity to change the world and save lives. Those are the missions we care about. As a leading technology innovation company, Lockheed Martin's vast team works with partners around the world to bring proven performance to our customers' toughest challenges. Lockheed Martin has employees based in many states throughout the U.S. and internationally, with business locations in many nations and territories.

    Experience Level: Experienced Professional. Business Unit: RMS. Relocation Available: Possible. Career Area: Systems Engineering: Software. Type: Task Order/IDIQ. Shift: First.
    $76k-100k yearly est. 22h ago
  • Assoc Engineer

Exelon (4.8 company rating)

    Data engineer job in Washington, DC

    Who We Are: We're powering a cleaner, brighter future. Exelon is leading the energy transformation, and we're calling all problem solvers, innovators, community builders, and change makers. Work with us to deliver solutions that make our diverse cities and communities stronger, healthier, and more resilient. We're powered by purpose-driven people like you who believe in being inclusive and creative and who value safety, innovation, integrity, and community service. We are a Fortune 200 company, 19,000 colleagues strong, serving more than 10 million customers at six energy companies: Atlantic City Electric (ACE), Baltimore Gas and Electric (BGE), Commonwealth Edison (ComEd), Delmarva Power & Light (DPL), PECO Energy Company (PECO), and Potomac Electric Power Company (Pepco). In our relentless pursuit of excellence, we elevate diverse voices, fresh perspectives, and bold thinking. And since we know transforming the future of energy is hard work, we provide competitive compensation, incentives, excellent benefits, and the opportunity to build a rewarding career. Are you in?

    Primary Purpose: Assists experienced engineers in developing studies, plans, criteria, specifications, calculations, evaluations, design documents, performance assessments, integrated systems analyses, cost estimates, and budgets associated with the planning, design, licensing, construction, operation, and maintenance of Exelon's electric generation, transmission, distribution, gas, and telecommunication facilities/systems. Provides analytical support for consultation and recommendations to the Company, other business units, and/or customers as a result of studying company- or customer-owned systems, processes, outages, equipment, vehicles, or facilities to advance business needs and efficiencies. Develops recommendations to improve planning, design, installation, and maintenance processes. Collects and compiles financial data for budgeted and actual costs of projects. The position may be required to work extended hours, including 24x7 coverage during storms or other energy delivery emergencies.

    Note: This is a hybrid position (in-office with remote flexibility). Employees are required to be in the office at least three days per week (Tuesday, Wednesday, and Thursday). This position must sit out of our Philadelphia, PA; Kennett Square, PA; Newark, DE; or Washington, DC office. This position is not eligible for relocation assistance.

    Primary Duties:
    • Assists experienced engineers in performing well-defined engineering assignments in specialized areas requiring engineering expertise, exercising independent discretion (e.g., collect data, perform complex analysis, interpret results, draw conclusions, and clearly present a recommendation to management).
    • Provides support to engineering and operating groups to analyze specific design, installation, and maintenance activities.
    • Assists in the performance of complex engineering assignments (e.g., analyze and interpret the results of complex power flows, perform complex electrical tests, and analyze non-specific and ambiguous results).
    • Assists with technical consultation, plan development, and recommendations for customer and company systems and processes (e.g., verify and validate studies, blueprints, or designs against accepted engineering principles and practices).
    • Evaluates the effectiveness of current technical systems and processes.
    • Participates on teams (e.g., design high-voltage transmission and distribution circuits, meeting all engineering standards and criteria).

    Job Scope: Modeling & Scenario Planning develops and maintains the system models required to run studies on the transmission system to ensure adherence to reliability criteria. This includes, but is not limited to, running thermal and voltage steady-state power flow studies and maintaining modeling data (connectivity, impedances, ratings, contingency information, etc.). As part of that task, the team analyzes risk on the transmission system both in the short term and in future years, which provides critical information to anticipate unidentified issues and prioritize existing risks. The successful candidate will perform transmission system modeling and power flow studies/analysis of the Exelon transmission system. These efforts will include (see the illustrative sketch after this listing):
    • Developing models and performing studies using available tools such as PSS/E and TARA
    • In coordination with PJM, ensuring that Exelon complies with applicable NERC Standards and PJM Operations/Planning Manuals
    • Leveraging experience performing power flow analysis and validation of study results
    • Demonstrating an understanding of OpCo Transmission System(s), including Substation and Protection Design and Technical Standards
    • Utilizing experience with transmission modeling characteristics such as ratings and impedances to ensure proper results
    • Providing innovative solutions for enhancing planning techniques and integrating new technologies

    Minimum Qualifications:
    • Bachelor of Science degree in Engineering
    • Ability to analyze and interpret complex electrical and mechanical systems
    • Knowledge of and ability to apply problem-solving approaches and engineering theory
    • Knowledge of engineering designs, principles, and practices
    • Zero to two years of professional engineering experience
    • Limited knowledge of and experience with the regulations, guides, standards, codes, methods, and practices necessary to perform routine assignments for a specific discipline, installation, or service

    Preferred Qualifications:
    • Written and oral communication/presentation skills, report generation, and technical writing skills
    • Interpersonal skills and the ability to collaborate with peers
    • Ability to confer with customers and identify customer needs
    • High-level understanding of power systems and/or transmission system power flow

    Benefits:
    • Annual salary will vary based on a candidate's skills, qualifications, experience, and other factors: $71,200.00/Yr. - $97,900.00/Yr.
    • Annual bonus for eligible positions: 7%
    • 401(k) match and annual company contribution
    • Medical, dental, and vision insurance
    • Life and disability insurance
    • Generous paid time off options, including vacation, sick time, floating and fixed holidays, maternity leave, and bonding/primary caregiver leave or parental leave
    • Employee Assistance Program and resources for mental and emotional support
    • Wellbeing programs such as tuition reimbursement, adoption and surrogacy assistance, and fitness reimbursement
    • Referral bonus program
    • And much more
    Note: Exelon-sponsored compensation and benefit programs may vary or not apply based on length of service, job grade, job classification, or represented status. Eligibility will be determined by the written plan or program documents.
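    The posting above centers on thermal and voltage steady-state power flow studies. As a hedged illustration of that concept only, here is a minimal screen using the open-source pandapower library as a stand-in for the commercial PSS/E and TARA tools the posting actually names; the 0.95-1.05 p.u. band and 90% loading threshold are assumed criteria, not Exelon's.

```python
import pandapower as pp
import pandapower.networks as pn

# Illustrative only: pandapower stands in for PSS/E or TARA here.
net = pn.case14()                      # IEEE 14-bus test system
pp.runpp(net)                          # Newton-Raphson AC power flow

# Voltage screen: flag buses outside an assumed 0.95-1.05 p.u. band.
v = net.res_bus.vm_pu
print("Voltage violations:", net.bus.name[(v < 0.95) | (v > 1.05)].tolist())

# Thermal screen: flag lines loaded above an assumed 90% of rating.
overloads = net.res_line[net.res_line.loading_percent > 90.0]
print("Thermally stressed lines:", overloads.index.tolist())
```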
    $71.2k-97.9k yearly 3d ago
  • Senior CNO Developer

ManTech (4.5 company rating)

    Data engineer job in Annapolis, MD

    ManTech seeks a motivated, career- and customer-oriented Senior CNO Developer to join our team in Annapolis Junction, Maryland. We're looking for a Senior Capability Developer to join our elite team. In this role, you'll apply your deep technical expertise to analyze, reverse-engineer, and develop mission-critical capabilities that directly support national security objectives. You will be a key player in a fast-paced environment, tackling unique challenges at the intersection of hardware, software, and embedded systems.

    Responsibilities include, but are not limited to:
    • Develop custom software tools and applications using Python, C, and Assembly, focusing on embedded and resource-constrained systems.
    • Conduct rigorous code reviews to ensure the quality, security, and performance of developed software.
    • Reverse engineer complex hardware and software systems to understand their inner workings and identify potential vulnerabilities (a small illustration follows this listing).
    • Perform in-depth vulnerability research to discover and analyze weaknesses in a variety of targets.
    • Collaborate with a team of skilled engineers to design and implement innovative solutions to challenging technical problems.

    Minimum Qualifications:
    • Bachelor's degree and 12 years of experience; or a high school diploma and 16 years of experience; or an Associate's degree and 14 years of experience. A Master's degree may substitute for 2 years of experience, and a PhD may substitute for 4 years of experience.
    • Must have 7 years of position-relevant work experience.
    • Proficiency in programming and application development.
    • Strong scripting skills, particularly in Python, C, and Assembly.
    • Deep expertise in managing, configuring, and troubleshooting Linux.
    • Experience in embedded systems.
    • Experience in reverse engineering and vulnerability research of hardware and software.
    • Experience in code review.

    Preferred Qualifications:
    • Experience in CNO (Computer Network Operations) development.
    • Experience in virtualization.
    • Knowledge of IoT (Internet of Things) devices.
    • Experience with Linux kernel development and sockets.
    • Knowledge of integrating security tools into the CI/CD (Continuous Integration/Continuous Delivery) pipeline.
    • Networking skills.

    Clearance Requirements: Must have a current/active Top Secret/SCI clearance.

    Physical Requirements: The person in this position must be able to remain in a stationary position 50% of the time, and occasionally move about inside the office to access file cabinets and office machinery, or to communicate with co-workers, management, and customers via email, phone, and/or virtual communication, which may involve delivering presentations.
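    Reverse-engineering work like that described above begins with understanding binary formats. As a small, hedged illustration (not ManTech tooling), Python's standard struct module can parse a 64-bit ELF executable header:

```python
import struct

# Minimal 64-bit ELF header reader; illustrative only.
ELF_MAGIC = b"\x7fELF"
MACHINES = {0x03: "x86", 0x28: "ARM", 0x3E: "x86-64", 0xB7: "AArch64"}

def read_elf_header(path):
    with open(path, "rb") as f:
        ident = f.read(16)                      # e_ident: magic, class, endianness, ...
        if ident[:4] != ELF_MAGIC:
            raise ValueError("not an ELF file")
        if ident[4] != 2:                       # EI_CLASS: 1 = 32-bit, 2 = 64-bit
            raise ValueError("this sketch only handles 64-bit ELF")
        # Remaining Elf64_Ehdr fields, little-endian: type, machine, version,
        # entry point, program/section header offsets, flags, and table sizes.
        fields = struct.unpack("<HHIQQQIHHHHHH", f.read(48))
        e_type, e_machine, _, e_entry = fields[:4]
        return {
            "machine": MACHINES.get(e_machine, hex(e_machine)),
            "type": {1: "REL", 2: "EXEC", 3: "DYN", 4: "CORE"}.get(e_type, e_type),
            "entry": hex(e_entry),
        }

if __name__ == "__main__":
    print(read_elf_header("/bin/ls"))           # e.g. {'machine': 'x86-64', ...}
```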
    $85k-109k yearly est. 2d ago
  • Data Engineer

Guidehouse (3.7 company rating)

    Data engineer job in Arlington, VA

    Job Family: Data Science Consulting. Travel Required: None. Clearance Required: Ability to Obtain Secret.

    What You Will Do: Guidehouse is seeking an experienced Data Engineer to join our Technology, AI and Data practice with a dedicated focus on military clients in the Defense & Security segment. This individual will have a strong data engineering background and be a hands-on technical contributor, responsible for designing, implementing, and maintaining scalable data pipelines and interactive dashboards that enable federal clients to achieve mission outcomes, operational efficiency, and digital transformation. This is an exciting opportunity for someone who thrives at the intersection of data engineering, cloud platforms, and public sector modernization. The Data Engineer will collaborate with cross-functional teams and client stakeholders to modernize legacy environments, implement scalable data pipelines, and support advanced analytics initiatives for our federal client.

    Client Leadership & Delivery:
    • Collaborate with military clients to understand data architecture, data pipeline, and reporting requirements.
    • Lead the development of ETL pipelines and dashboard integrations using Databricks and Microsoft Power Platform tools (e.g., Power BI).
    • Ensure delivery excellence and measurable outcomes across data migration and visualization efforts.
    • Ensure solutions support auditability, internal controls, and statutory compliance requirements, including strict financial data controls.

    Solution Development & Innovation:
    • Design and implement scalable ETL/ELT pipelines using Databricks, SQL, and Python (a minimal sketch follows this listing).
    • Develop and optimize dashboards aligned with federal reporting standards, as needed.
    • Apply AI/ML tools to automate metadata extraction, clustering, and dashboard scripting.
    • Optimize data pipelines for performance with large data sets for both analytics and transactional load use cases (e.g., OLAP, OLTP). Enable downstream BI via clean data.
    • Ensure compliance with federal data governance, security, and performance standards.
    • Design and document enterprise data models, metadata strategies, data lineage frameworks, and other relevant documentation, as needed.
    • Align data from multiple discrete data sets into a cohesive, interoperable architecture, identifying opportunities for linkages between datasets, normalization, field standardization, etc.
    • Assist with cleanup of existing data and models, including use of ETL.

    Practice & Team Leadership:
    • Work closely with data architects, analysts, and cloud engineers to deliver integrated solutions.
    • Support documentation, testing, and deployment of data products.
    • Mentor junior team members and contribute to reusable frameworks and accelerators.
    • Contribute to thought leadership and best practice development across the AI & Data team.

    What You Will Need:
    • Must be able to OBTAIN and MAINTAIN a Federal or DoD "SECRET" security clearance; candidates must obtain approved adjudication of clearance prior to onboarding with Guidehouse. Candidates with an ACTIVE "SECRET" or higher-level clearance are preferred.
    • Bachelor's degree is required.
    • Minimum TWO (2) years of experience in data engineering and dashboard development.
    • Proven experience with Databricks, business intelligence tools, and cloud platforms (AWS, Azure).
    • Strong proficiency in SQL, Python, and Spark.
    • Experience building ETL pipelines and integrating data sources into reporting platforms.
    • Familiarity with data governance, metadata, and compliance frameworks.
    • Excellent communication, facilitation, and stakeholder engagement skills.

    What Would Be Nice To Have:
    • Experience with ADVANA.
    • Databricks Data Engineer Associate or Professional certification.
    • Experience working with military clients.
    • Experience working with financial management data.
    • Familiarity with federal contracting and procurement processes.

    What We Offer: Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace. Benefits include: Medical, Rx, Dental & Vision Insurance; Personal and Family Sick Time & Company Paid Holidays; eligibility for a discretionary variable incentive bonus; Parental Leave and Adoption Assistance; 401(k) Retirement Plan; Basic Life & Supplemental Life; Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts; Short-Term & Long-Term Disability; Student Loan PayDown; Tuition Reimbursement, Personal Development & Learning Opportunities; Skills Development & Certifications; Employee Referral Program; Corporate Sponsored Events & Community Outreach; Emergency Back-Up Childcare Program; Mobility Stipend.

    About Guidehouse: Guidehouse is an Equal Opportunity Employer (Protected Veterans, Individuals with Disabilities, or any other basis protected by law, ordinance, or regulation). Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinance of Los Angeles and San Francisco. If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains, including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse, and Guidehouse will not be obligated to pay a placement fee.
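    As referenced in the responsibilities above, here is a minimal, hedged sketch of a Databricks-style ETL step in PySpark; the paths, table, and column names are hypothetical, not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical names throughout; illustrates the ETL/ELT pattern only.
spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw semi-structured data.
raw = spark.read.option("header", True).csv("/mnt/raw/transactions.csv")

# Transform: type the columns, drop bad rows, derive a reporting field.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_date", F.to_date("txn_ts"))
       .dropna(subset=["account_id", "amount"])
       .withColumn("fiscal_year", F.year("txn_date"))
)

# Load: write partitioned Parquet for downstream BI.
clean.write.mode("overwrite").partitionBy("fiscal_year").parquet("/mnt/curated/transactions")
```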
    $73k-97k yearly est. 1d ago
  • Engineer

Exelon (4.8 company rating)

    Data engineer job in Washington, DC

    Who We Are: We're powering a cleaner, brighter future. Exelon is leading the energy transformation, and we're calling all problem solvers, innovators, community builders, and change makers. Work with us to deliver solutions that make our diverse cities and communities stronger, healthier, and more resilient. We're powered by purpose-driven people like you who believe in being inclusive and creative and who value safety, innovation, integrity, and community service. We are a Fortune 200 company, 19,000 colleagues strong, serving more than 10 million customers at six energy companies: Atlantic City Electric (ACE), Baltimore Gas and Electric (BGE), Commonwealth Edison (ComEd), Delmarva Power & Light (DPL), PECO Energy Company (PECO), and Potomac Electric Power Company (Pepco). In our relentless pursuit of excellence, we elevate diverse voices, fresh perspectives, and bold thinking. And since we know transforming the future of energy is hard work, we provide competitive compensation, incentives, excellent benefits, and the opportunity to build a rewarding career. Are you in?

    Primary Purpose: Develops studies, plans, criteria, specifications, calculations, evaluations, design documents, performance assessments, integrated systems analyses, cost estimates, and budgets associated with the planning, design, licensing, construction, operation, and maintenance of Exelon's electric generation, transmission, distribution, gas, and telecommunication facilities/systems under the guidance of an experienced engineer. Provides consultation and recommendations to the Company, other business units, and/or customers as a result of studying company- or customer-owned systems, processes, equipment, vehicles, or facilities, under an experienced engineer. Reviews financial data from budgeted and actual costs of projects under the guidance of an experienced engineer. The position may be required to work extended hours for coverage during storms or other energy delivery emergencies.

    Note: This is a hybrid position (in-office with remote flexibility). Employees are required to be in the office at least three days per week (Tuesday, Wednesday, and Thursday). This position must sit out of our Philadelphia, PA; Kennett Square, PA; Washington, DC; or Newark, DE office. This position is not eligible for relocation assistance.

    Primary Duties:
    • Performs engineering assignments while exercising independent discretion under the guidance of an experienced engineer (e.g., collect data, perform complex analysis, interpret results, draw conclusions, and clearly present a recommendation to management).
    • Performs engineering tasks associated with large projects or a number of small projects (e.g., analyze and interpret the results of complex power flows, perform complex engineering tests, and analyze non-specific and ambiguous results).
    • May direct the engineering tasks associated with a large project or a number of small projects (e.g., verify and validate studies, blueprints, or designs against accepted engineering principles and practices; design high-voltage transmission and distribution circuits, meeting all engineering standards and criteria).
    • Participates on teams and may lead teams.

    Job Scope: Modeling & Scenario Planning develops and maintains the system models required to run studies on the transmission system to ensure adherence to reliability criteria. This includes, but is not limited to, running thermal and voltage steady-state power flow studies and maintaining modeling data (connectivity, impedances, ratings, contingency information, etc.). As part of that task, the team analyzes risk on the transmission system both in the short term and in future years, which provides critical information to anticipate unidentified issues and prioritize existing risks. The successful candidate will perform transmission system modeling and power flow studies/analysis of the Exelon transmission system. These efforts will include (see the contingency sketch after this listing):
    • Developing models and performing studies using available tools such as PSS/E and TARA
    • In coordination with PJM, ensuring that Exelon complies with applicable NERC Standards and PJM Operations/Planning Manuals
    • Leveraging experience performing power flow analysis and validation of study results
    • Demonstrating an understanding of OpCo Transmission System(s), including Substation and Protection Design and Technical Standards
    • Utilizing experience with transmission modeling characteristics such as ratings and impedances to ensure proper results
    • Providing innovative solutions for enhancing planning techniques and integrating new technologies

    Minimum Qualifications:
    • Bachelor of Science degree in Engineering
    • 2-4 years of professional engineering experience
    • Ability to analyze and interpret complex electrical and mechanical systems
    • Knowledge of and ability to apply problem-solving approaches and engineering theory
    • Knowledge of engineering designs, principles, and practices
    • General knowledge of and experience with the regulations, guides, standards, codes, methods, and practices necessary to perform assignments for a specific discipline, various installations, or services

    Preferred Qualifications:
    • Engineer in Training license
    • Strong written and oral communication/presentation skills, report generation, and technical writing skills
    • Interpersonal skills and the ability to collaborate with peers and managers
    • Time management, project management, and multitasking skills
    • Ability to analyze industry-wide trends and implement enhancements
    • A working knowledge of analysis software packages such as CYMDIST, PSS/E, TARA, Python, PSCAD, MATLAB, etc., to perform and analyze load flow modeling, contingency studies, and transfer analysis

    Benefits:
    • Annual salary will vary based on a candidate's skills, qualifications, experience, and other factors: $83,200.00/Yr. - $114,400.00/Yr.
    • Annual bonus for eligible positions: 10%
    • 401(k) match and annual company contribution
    • Medical, dental, and vision insurance
    • Life and disability insurance
    • Generous paid time off options, including vacation, sick time, floating and fixed holidays, maternity leave, and bonding/primary caregiver leave or parental leave
    • Employee Assistance Program and resources for mental and emotional support
    • Wellbeing programs such as tuition reimbursement, adoption and surrogacy assistance, and fitness reimbursement
    • Referral bonus program
    • And much more
    Note: Exelon-sponsored compensation and benefit programs may vary or not apply based on length of service, job grade, job classification, or represented status. Eligibility will be determined by the written plan or program documents.
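    The contingency studies mentioned above usually start with an N-1 screen: take each element out of service and re-run the power flow. A minimal, hedged sketch, again using the open-source pandapower library as a stand-in for the PSS/E and TARA tools the posting names; the 100% loading limit is an assumption.

```python
import pandapower as pp
import pandapower.networks as pn
from pandapower.powerflow import LoadflowNotConverged

# Hedged N-1 contingency sketch on a standard test case.
net = pn.case14()
for line_id in net.line.index:
    net.line.at[line_id, "in_service"] = False    # take one line out
    try:
        pp.runpp(net)
        worst = net.res_line.loading_percent.max()
        if worst > 100.0:
            print(f"Outage of line {line_id}: worst loading {worst:.1f}%")
    except LoadflowNotConverged:
        print(f"Outage of line {line_id}: power flow did not converge")
    net.line.at[line_id, "in_service"] = True     # restore the line
```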
    $83.2k-114.4k yearly 3d ago
  • Data Scientist

US Tech Solutions (4.4 company rating)

    Data engineer job in Washington, DC

    Duration: 6 months, with possible extension. Title: AI Engineer.

    1. Background and Context: The AI Engineer will play a pivotal role in designing, developing, and deploying artificial intelligence solutions that enhance operational efficiency, automate decision-making, and support strategic initiatives for the environmental and social specialists in the Bank. This role is central to the VPU's digital transformation efforts and will contribute to the development of scalable, ethical, and innovative AI systems.

    2. Qualifications and Experience:
    • Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
    • Experience: Minimum 3 years of experience in AI/ML model development and deployment. Experience with MLOps tools (e.g., MLflow), Docker, and cloud platforms (AWS, Azure, GCP). Proven track record in implementing LLMs, RAG, NLP model development, and GenAI solutions.
    • Technical Skills: Skilled in Azure AI/Google Vertex Search, vector databases, fine-tuning RAG solutions, NLP model development, and API management (facilitating access to different sources of data). Proficiency in Python, TensorFlow, PyTorch, and NLP frameworks. Expertise in deep learning, computer vision, and large language models. Familiarity with REST APIs, NoSQL, and RDBMS.
    • Soft Skills: Strong analytical and problem-solving abilities. Excellent communication and teamwork skills. Strategic thinking and an innovation mindset.

    3. Certifications (Preferred): Microsoft Certified: Azure AI Engineer Associate; Google Machine Learning Engineer; SAFe Agile Software Engineer (ASE); certification in AI Ethics.

    4. Objectives of the Assignment: Develop and implement AI models and algorithms tailored to business needs. Integrate AI solutions into existing systems and workflows. Ensure ethical compliance and data privacy in all AI initiatives. Support user adoption through training and documentation. Support existing AI solutions through refinement, troubleshooting, and reconfiguration.

    5. Scope of Work and Responsibilities:
    • AI Solution Development: Collaborate with cross-functional teams to identify AI opportunities. Train, validate, and optimize machine learning models. Translate business requirements into technical specifications.
    • AI Solution Implementation: Develop code, deploy AI models into production environments, and conduct ongoing model training. Monitor performance, troubleshoot issues, and fine-tune solutions to improve accuracy. Ensure compliance with ethical standards and data governance policies.
    • User Training and Adoption: Conduct training sessions for stakeholders on AI tools. Develop user guides and technical documentation.
    • Data Analysis and Research: Collect, preprocess, and engineer large datasets for machine learning and AI applications. Recommend and implement data cleaning and preparation. Analyze and use structured and unstructured data (including geospatial data) to extract features and actionable insights. Monitor data quality, detect bias, and manage model/data drift in production environments. Research emerging AI technologies and recommend improvements.
    • Governance, Strategy, Support, and Maintenance: Advise client staff on AI strategy and policy implications. Contribute to the team's AI roadmap and innovation agenda. Provide continuous support and contribute to maintenance and future enhancements.

    6. Deliverables: Proof-of-concept work to study the technical feasibility of AI use cases. Functional AI applications integrated into business systems. Documentation of model/application architecture, training data, and performance metrics. Training materials and user guides. Development, training, and deployment of AI models tailored to business needs. (A hedged retrieval sketch follows this listing.)

    About US Tech Solutions: US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************. US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

    Recruiter Details: Name: Pooja Rani. Email: ******************************. Internal Id: 25-53638.
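    The RAG and vector-database requirements above come down to embedding documents and retrieving the nearest ones to ground an LLM prompt. A minimal, hedged sketch of the retrieval step using sentence-transformers and FAISS (open-source stand-ins; the posting itself names Azure AI and Google Vertex Search, and the corpus below is invented):

```python
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

# Illustrative retrieval step of RAG; model choice and corpus are arbitrary.
model = SentenceTransformer("all-MiniLM-L6-v2")
docs = [
    "Environmental safeguard policies require a social impact assessment.",
    "Project budgets are reviewed quarterly by the finance team.",
    "Resettlement plans must be disclosed before project approval.",
]

emb = model.encode(docs, convert_to_numpy=True).astype("float32")
faiss.normalize_L2(emb)                      # cosine similarity via inner product
index = faiss.IndexFlatIP(emb.shape[1])
index.add(emb)

query = model.encode(["What must happen before a project is approved?"],
                     convert_to_numpy=True).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, k=2)       # top-2 passages to feed the LLM
print([docs[i] for i in ids[0]])
```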
    $75k-110k yearly est. 3d ago
  • Data Scientist

    Bestinfo Systems LLC

    Data engineer job in Columbia, MD

    Data Scientist - Transit Data Focus | Columbia, MD (on-site/hybrid) | Contract (6 months)

    Employment type: Contract. Duration: 6 months.
    Justification: To manage and analyze customer databases, AVA (automated voice announcement) data, and schedule data for predictive maintenance and service planning.
    Experience level: 3-5 years.

    Job Responsibilities:
    • Collect, process, and analyze transit-related datasets, including customer databases, AVA (automated voice announcement) logs, real-time vehicle data, and schedule data.
    • Develop predictive models and data-driven insights to support maintenance forecasting, service planning, and operational optimization.
    • Design and implement data pipelines to integrate, clean, and transform large, heterogeneous transit data sources.
    • Perform statistical analysis and machine learning to identify patterns, trends, and anomalies relevant to transit service performance and reliability (a small GTFS example follows this listing).
    • Collaborate with transit planners, maintenance teams, and IT staff to translate data insights into actionable business strategies.
    • Monitor data quality and integrity; implement data validation and cleansing processes.

    Technical Skills & Qualifications:
    • Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Transportation Engineering, or a related quantitative field.
    • 3-5 years of experience working as a data scientist or data analyst, preferably in a transit, transportation, or public sector environment.
    • Strong proficiency in Python or R for data analysis, statistical modeling, and machine learning.
    • Experience with SQL for database querying, manipulation, and data extraction.
    • Familiarity with transit data standards such as GTFS, AVL/CAD, APC (Automated Passenger Counters), and AVA systems.
    • Experience with data visualization tools such as Power BI or equivalent.
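    GTFS, named in the qualifications above, is a plain-CSV standard, so a common first analysis is computing scheduled headways per stop. A minimal, hedged pandas sketch (the stop_times.txt layout follows the GTFS spec; the feed path is hypothetical):

```python
import pandas as pd

def gtfs_time_to_seconds(t):
    # GTFS allows times past midnight, e.g. "25:10:00", so parse by hand.
    h, m, s = (int(x) for x in t.split(":"))
    return h * 3600 + m * 60 + s

# stop_times.txt is a standard GTFS file; the feed directory is hypothetical.
stop_times = pd.read_csv("gtfs_feed/stop_times.txt")
stop_times["dep_s"] = stop_times["departure_time"].map(gtfs_time_to_seconds)

# Scheduled headway = gap between consecutive departures at the same stop.
stop_times = stop_times.sort_values(["stop_id", "dep_s"])
stop_times["headway_min"] = stop_times.groupby("stop_id")["dep_s"].diff() / 60

print(stop_times.groupby("stop_id")["headway_min"].median().head())
```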
    $73k-103k yearly est. 1d ago
  • Data Engineer

Pyramid Consulting, Inc. (4.1 company rating)

    Data engineer job in McLean, VA

    Immediate need for a talented Data Engineer. This is a 12-month contract opportunity with long-term potential, located in McLean, VA (hybrid). Please review the job description below and contact me ASAP if you are interested. Job ID: 25-93504. Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), a 401(k) plan, and paid sick leave (depending on work location).

    Key Responsibilities:
    • Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services.
    • Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
    • Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning (see the sketch after this listing).
    • Develop backend and automation tools using Golang and/or Python as needed.
    • Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
    • Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
    • Perform root-cause analysis and implement automation to prevent recurring issues.
    • Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
    • Ensure compliance with enterprise governance, data quality, and cloud security standards.
    • Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.

    Key Requirements and Technology Experience:
    • Core skills: Python, Spark/PySpark, AWS (Glue, EC2, Lambda), Golang, and the ability to write and troubleshoot complex SQL queries against Snowflake tables.
    • Proficiency in Python with experience building scalable data pipelines or ETL processes.
    • Strong hands-on experience with Spark/PySpark for distributed data processing.
    • Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning.
    • Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
    • Experience with Golang for scripting, backend services, or performance-critical processes.
    • Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
    • Familiarity with CI/CD workflows, Git, and automated testing.

    Our client is a leader in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.

    Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc., its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
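    As referenced above, a minimal, hedged sketch of a Snowflake window-function query run through the snowflake-connector-python driver; the connection parameters and the transactions table are hypothetical placeholders, not the client's schema.

```python
import snowflake.connector

# All identifiers and credentials below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="FINANCE", schema="PUBLIC",
)

# Rolling 7-row sum per account; QUALIFY filters on a window function,
# a Snowflake-specific clause that avoids a wrapping subquery.
sql = """
    SELECT account_id, txn_ts, amount,
           SUM(amount) OVER (
               PARTITION BY account_id ORDER BY txn_ts
               ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
           ) AS rolling_sum
    FROM transactions
    QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY txn_ts DESC) <= 10
"""
for row in conn.cursor().execute(sql):
    print(row)
conn.close()
```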
    $70-75 hourly 3d ago
  • Data Scientist with ML

Kavaliro (4.2 company rating)

    Data engineer job in Reston, VA

    Kavaliro is seeking a Data Scientist to provide highly technical and in-depth data engineering support. Candidates MUST have experience with Python, PyTorch, and Flask (knowledge at minimum, with the ability to pick it up quickly), familiarity with REST APIs (at minimum), a statistics background, and a basic understanding of NLP. Desired skills include experience performing R&D with natural language processing, deploying CNNs, LLMs, or foundational models, deploying ML models on multimedia data, Linux system administration (or Bash), Android configuration, and embedded systems (Raspberry Pi).

    Required Skills and Demonstrated Experience:
    • Demonstrated experience in Python, JavaScript, and R.
    • Demonstrated experience employing machine learning and deep learning modules such as Pandas, scikit-learn, TensorFlow, and PyTorch.
    • Demonstrated experience with statistical inference, as well as building and understanding predictive models, using machine learning methods.
    • Demonstrated experience with large-scale text analytics.

    Desired Skills:
    • Demonstrated hands-on experience performing research or development with natural language processing and working with, deploying, and testing Convolutional Neural Networks (CNNs), large language models (LLMs), or foundational models.
    • Demonstrated experience developing and deploying testing and verification methodologies to evaluate algorithm performance and identify strategies for improvement or optimization.
    • Demonstrated experience deploying machine learning models on multimedia data, including joint text, audio, video, hardware, and peripherals.
    • Demonstrated experience with Linux system administration and associated scripting languages (Bash).
    • Demonstrated experience with Android configuration, software development, and interfacing.
    • Demonstrated experience in embedded systems (Raspberry Pi).

    Duties:
    • Develops and conducts independent testing and evaluation methods on research-grade algorithms in applicable fields.
    • Reports results and provides documentation and guidance on working with the research-grade algorithms.
    • Evaluates, integrates, and leverages internally hosted data science tools.
    • Customizes research-grade algorithms to be optimized for memory and computational efficiency through quantizing, trimming layers, or custom methods. (A minimal serving sketch follows this listing.)

    Location: Reston, Virginia. This position is onsite; there is no remote availability.
    Clearance: Active TS/SCI with Full Scope Polygraph. Applicants MUST hold permanent U.S. citizenship for this position in accordance with government contract requirements.

    Kavaliro provides Equal Employment Opportunities to all employees and applicants. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Kavaliro is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Kavaliro will take steps to assure that people with disabilities are provided reasonable accommodations. Accordingly, if reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please respond to this posting to connect with a company representative.
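    A minimal, hedged sketch of the Flask-plus-PyTorch pattern this posting asks for: serving a model (here an untrained dummy) behind a REST endpoint. Every name, route, and shape below is hypothetical.

```python
import torch
from flask import Flask, jsonify, request

app = Flask(__name__)

# Dummy stand-in for a real trained model; in practice you would load
# weights, e.g. model.load_state_dict(torch.load("model.pt")).
model = torch.nn.Sequential(torch.nn.Linear(4, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2))
model.eval()

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()                      # {"features": [[...], ...]}
    x = torch.tensor(payload["features"], dtype=torch.float32)
    with torch.no_grad():                             # inference only, no autograd
        scores = model(x)
    return jsonify(predictions=scores.argmax(dim=1).tolist())

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```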
    $74k-105k yearly est. 2d ago
  • Azure Data Modeler

    Dexian

    Data engineer job in Washington, DC

    Azure Data Modeler - Budget Transformation Project

    Our client is embarking on a major budget transformation initiative and is seeking an experienced Azure Data Modeler to support data architecture, modeling, and migration activities. This role will play a critical part in designing and optimizing data structures as the organization transitions to SAP. Experience with SAP is preferred, but strong ERP data experience on any platform is also valuable.

    Responsibilities:
    • Design, develop, and optimize data models within the Microsoft Azure environment.
    • Support data architecture needs across the budget transformation program.
    • Partner with cross-functional stakeholders to enable the transition to SAP (or other ERP systems).
    • Participate in data migration planning, execution, and validation efforts.
    • Work collaboratively within SAFe Agile teams and support sprint activities.
    • Provide off-hours support as needed for critical tasks and migration windows.
    • Engage onsite in Washington, DC up to three days per week.

    Required Qualifications:
    • Strong hands-on expertise in data architecture and data model design.
    • Proven experience working with Microsoft Azure (core requirement).
    • Ability to work flexibly, including occasional off-hours support.
    • Ability to be onsite in Washington, DC as needed (up to 3 days/week).

    Preferred Qualifications:
    • Experience with SAP ECC or exposure to SAP implementations.
    • Experience with other major ERP systems (Oracle, Workday, etc.).
    • SAFe Agile certification.

    Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support. Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************

    Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
    $81k-111k yearly est. 2d ago
  • Senior Data Engineer - Data Intelligence

    Altak Group Inc.

    Data engineer job in Baltimore, MD

    Hybrid role. Job Title: Data Intelligence - Engineer, Data, Sr. Project Type: Contract. Duration: 12 months.

    We are looking for candidates with 5+ years' experience in Ab Initio administration. (Internal note: please do not send developers.) Must have 2+ years' experience with AWS, working with Ab Initio in the AWS Cloud. Must have solid experience building, installing, and configuring Ab Initio. Must have AWS EKS containerization experience; the role will be involved in moving Linux instances to AWS EKS.

    Ab Initio Lead Infrastructure: This is an Ab Initio Administrator position, not a developer position. The Senior Ab Initio ETL Administrator is responsible for the tasks involved in administering the ETL tool (Ab Initio) as well as migrating Ab Initio infrastructure to the cloud. The candidate will support the implementation of a Data Integration/Data Warehouse for the Data products on-prem and in the AWS Cloud, including EKS containerization for Ab Initio.

    6-8 Years' Experience:
    • At least 6 years of experience with all the tasks involved in administration of the ETL tool (Ab Initio)
    • Experience managing migration or infrastructure-build projects without supervision
    • At least 6 years of experience with advanced knowledge of the Ab Initio Graphical Development Environment (GDE), Metadata Hub, and Operational Console
    • Experience with Ab Initio, AWS EKS, S3, DynamoDB, MongoDB, PostgreSQL, RDS, and DB2
    • Created Big Data (ETL) pipelines from on-premises to data factories, data lakes, and cloud storage such as EBS or S3
    • DevOps (CI/CD pipeline) experience, preferably with Jenkins
    • Advanced knowledge of UNIX and SQL
    • Experience managing Metadata Hub (MDH) and Operational Console and troubleshooting environmental issues that affect these components
    • Experience with scripting and automation, such as designing and developing automated ETL processes and architecture, and unit testing of ETL code
    • Experience working on break-fix and continuous development items, and review and inspection of production changes
    • Performs code review of ETL code developed by the development team and provides guidance to resolve issues
    • Service-Oriented Architecture (SOA) knowledge, and demonstrated knowledge of best practices for testing environments and processes
    • Demonstrated experience working in an Enterprise environment with cross-team interaction, collaboration, and policies
    • Strong testing skills; excellent problem-solving skills; strong analytical skills
    • Excellent verbal and written communication skills (critical)
    • Familiar with structured programming techniques
    • Must be able to perform assigned tasks with minimum supervision (critical)
    • Strong documentation skills
    • Experience working in an Agile environment is a plus

    Software:
    • Applies and implements best practices for data auditing, scalability, reliability, and application performance
    • AWS certification is a plus
    • Extensive UNIX AIX or Linux and scripting experience
    • Extensive SDLC experience with some development or systems programming experience
    • Ability to analyze and troubleshoot mid-tier/infrastructure issues
    • Very strong verbal and written communication skills (critical)
    • Ability to facilitate technical requirements gathering and design sessions
    • Collaborate on and interpret business and technical needs
    • Excellent attention to detail and quality work products (critical)
    • Strong customer service skills with internal and external customers (critical)
    • Strong analytical and documentation skills
    • Excellent time management ability (critical)

    Skills Preferred: Experience with DevOps or IaaS; AIX or Linux; LDAP; EIAM (Identity Access Management); Ab Initio Admin and Architect.
    $81k-111k yearly est. 4d ago
  • Data Architect

Seneca Resources (4.6 company rating)

    Data engineer job in Arlington, VA

    • Functions as the primary technical architect for data warehousing projects to solve business intelligence challenges
    • Possesses deep technical expertise in database design, ETL (OWB/ODI), reporting, and analytics
    • Previous consulting experience utilizing an agile delivery methodology

    Position Requirements:
    • Solutions architect must have expertise as both a solutions architect and an AI architect
    • 3+ years' experience with Azure ETL processing
    • 3+ years' experience utilizing data warehousing methodologies and processes
    • Strong conceptual, analytical, and decision-making skills
    • Knowledge and experience of dimensional modeling
    • Strong knowledge of Azure Databricks
    • Proficiency in creating PL/SQL packages
    • Full SDLC and data modeling experience
    • Ability to create both logical and physical data models
    • Ability to tune databases for maximum performance
    • Experience in data preparation: data profiling, data cleansing, and data auditing
    • Ability to work with Business Analysts to create functional specifications and data
    • Manages QA functions
    • Develops unit, system, and integration test plans and manages execution
    • Ability to write technical and end-user system documentation
    • Excellent written and oral communication skills
    • Experience transforming logical business requirements into appropriate schemas and models
    • Ability to analyze and evaluate moderate to highly complex information systems by interpreting such artifacts as Entity Relationship Diagrams, data dictionaries, record layouts, and logic flow diagrams
    $103k-140k yearly est. 4d ago
  • Lead Data Engineer

    Intellibus

    Data engineer job in Reston, VA

    Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus we will accomplish great things together! Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech firms. Recently, our team has spearheaded the conversion and go-live of apps that support the backbone of the financial trading industry. Are you a data enthusiast with a natural ability for analytics? We're looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
    What We Offer:
    • A dynamic environment where your skills will make a direct impact.
    • The opportunity to work with cutting-edge technologies and innovative projects.
    • A collaborative team that values your passion and focus.
    We are looking for Engineers who can:
    • Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
    • Implement ETL (Extract, Transform, Load) processes using Snowflake features such as Snowpipe, Streams, and Tasks (a connector-driven Streams/Tasks sketch follows this posting).
    • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
    • Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views.
    • Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
    • Implement data synchronization processes to ensure consistency and accuracy of data across different systems.
    • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
    • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
    • Work on Snowflake modeling - roles, databases, schemas, and ETL tools - with cloud-driven skills.
    • Work on SQL performance measurement, query tuning, and database tuning.
    • Handle SQL and cloud-based technologies.
    • Set up the RBAC model at the infrastructure and data level.
    • Work on data masking/encryption/tokenization and data wrangling/data pipeline orchestration (tasks).
    • Set up AWS S3/EC2, configure external stages, and set up SQS/SNS.
    • Perform data integration, e.g. MSK Kafka Connect and other partners like Delta Lake (Databricks).
    Key Skills & Qualifications:
    • ETL - Experience with ETL processes for data integration.
    • SQL - Strong SQL skills for querying and data manipulation.
    • Python - Strong command of Python, especially AWS Boto3, JSON handling, and dictionary operations.
    • Unix - Competent in Unix for file operations, searches, and regular expressions.
    • AWS - Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions.
    • Database Modeling - Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms.
    • Snowflake - Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures.
    • Airflow - Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines.
    • Bachelor's degree in Computer Science or a related field is preferred. Relevant work experience may be considered in lieu of a degree.
    • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
    • Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team.
    We work closely with: Data Wrangling, ETL, Talend, Jasper, Java, Python, Unix, AWS, Data Warehousing, Data Modeling, Database Migration, RBAC model, Data Migration.
    Our Process:
    • Schedule a 15 min video call with someone from our team
    • 4 proctored GQ tests (< 2 hours)
    • 30-45 min final video interview
    • Receive job offer
    If you are interested in reaching out to us, please apply and our team will contact you within the hour.
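    To illustrate the Streams-and-Tasks pattern this posting emphasizes, here is a minimal Python sketch that drives Snowflake's change-data-capture load through the snowflake-connector-python package. The connection settings, table names (raw_orders, orders_curated), and warehouse (ETL_WH) are hypothetical placeholders, not details from the posting.

    import snowflake.connector

    # Placeholder credentials -- in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # A stream records row-level changes (CDC) on the raw table.
    cur.execute("CREATE STREAM IF NOT EXISTS raw_orders_stream ON TABLE raw_orders")

    # A task periodically merges pending stream rows into the curated table.
    cur.execute("""
        CREATE TASK IF NOT EXISTS load_orders_task
          WAREHOUSE = ETL_WH
          SCHEDULE  = '5 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
        AS
          MERGE INTO orders_curated t
          USING raw_orders_stream s ON t.order_id = s.order_id
          WHEN MATCHED THEN UPDATE SET t.amount = s.amount
          WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
    """)

    # Tasks are created suspended and must be resumed explicitly.
    cur.execute("ALTER TASK load_orders_task RESUME")
    conn.close()

    Snowpipe plays the complementary ingestion role (continuously loading staged files into raw_orders); the stream-plus-task pair above then handles the downstream transform incrementally.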
    $79k-108k yearly est. 2d ago
  • Cloud Data Engineer - Databricks

    Infocepts 3.7company rating

    Data engineer job in McLean, VA

    Purpose: We are seeking a highly skilled Cloud Data Engineer with deep expertise in Databricks and modern cloud platforms such as AWS, Azure, or GCP. This role is ideal for professionals who are passionate about building next-generation data platforms, optimizing complex data workflows, and enabling advanced analytics and AI in cloud-native environments. You'll have the opportunity to work with Fortune 500 organizations in data and analytics, helping them unlock the full potential of their data through innovative, scalable solutions.
    Key Result Areas and Activities:
    • Design and implement robust, scalable data engineering solutions.
    • Build and optimize data pipelines using Databricks, including serverless capabilities, Unity Catalog, and Mosaic AI (a small PySpark sketch follows this posting).
    • Collaborate with analytics and AI teams to enable real-time and batch data workflows.
    • Support and improve cloud-native data platforms (AWS, Azure, GCP).
    • Ensure adherence to best practices in data modeling, warehousing, and governance.
    • Contribute to automation of data workflows using CI/CD, DevOps, or DataOps practices.
    • Implement and maintain workflow orchestration tools like Apache Airflow and dbt.
    Essential Skills:
    • 4+ years of experience in data engineering with a focus on scalable solutions.
    • Strong hands-on experience with Databricks in a cloud environment.
    • Proficiency in Spark and Python for data processing.
    • Solid understanding of data modeling, data warehousing, and architecture principles.
    • Experience working with at least one major cloud provider (AWS, Azure, or GCP).
    • Familiarity with CI/CD pipelines and data workflow automation.
    Desirable Skills:
    • Direct experience with Unity Catalog and Mosaic AI within Databricks.
    • Working knowledge of DevOps/DataOps principles in a data engineering context.
    • Exposure to Apache Airflow, dbt, and modern data orchestration frameworks.
    Qualifications:
    • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
    • Relevant certifications in cloud platforms (AWS/Azure/GCP) or Databricks are a plus.
    Qualities:
    • Able to consult, write, and present persuasively
    • Able to work in a self-organized and cross-functional team
    • Able to iterate based on new information, peer reviews, and feedback
    • Able to work seamlessly with clients across multiple geographies
    • Research-focused mindset
    • Excellent analytical, presentation, reporting, documentation, and interpersonal skills
    "Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
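    As a small illustration of the Spark-plus-Python work described above, here is a hedged PySpark sketch of the kind of batch aggregation a Databricks job might run. The paths, table layout, and column names are invented for the example, not taken from the posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events-daily-agg").getOrCreate()

    # Batch read of raw events; on Databricks this would more likely be a
    # Unity Catalog table or an Auto Loader stream than a flat path.
    events = spark.read.format("delta").load("/mnt/raw/events")  # hypothetical path

    # Aggregate raw events into a daily count per event type.
    daily = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))
    )

    # Write the aggregate back as a Delta table for BI consumption.
    daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_events")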
    $77k-105k yearly est. 5d ago
  • Junior Data Scientist (TS/SCI)

    Take2 Consulting, LLC 3.7company rating

    Data engineer job in Springfield, VA

    We are seeking a junior-level data science professional with a strong academic foundation and early hands-on experience to join our team as an Exploitation Specialist. The ideal candidate will hold a bachelor's degree in a data science-related field and bring internship or project experience that demonstrates curiosity, initiative, and a willingness to learn from senior team members. This role is a great opportunity for someone eager to grow their technical skill set while supporting a high-impact mission.
    Required Qualifications:
    • Active TS/SCI clearance with the willingness to obtain a CI polygraph
    • Ability to work onsite in Northern Virginia, 40 hours per week (telework options are extremely limited)
    • Proficiency with Python and SQL (a baseline Python-plus-SQL sketch follows this posting)
    Preferred Qualifications:
    • Familiarity with GEOINT collection and related NGA/NRO systems
    • Experience with additional programming languages such as R, JavaScript, HTML, and CSS
    • Understanding of object-oriented programming
    • Experience using visualization tools such as Grafana, Tableau, or Kibana
    • Ability to quickly learn new technologies, adapt to evolving mission requirements, and support the development/testing of new analytic methodologies
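    As a baseline illustration of the Python-and-SQL requirement, here is a small hedged sketch that aggregates rows from a relational store into pandas; the in-memory SQLite database, table, and columns are invented for the example.

    import sqlite3
    import pandas as pd

    # Stand-in database: an in-memory SQLite store with a toy table.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE collections (sensor TEXT, day TEXT, hits INTEGER);
        INSERT INTO collections VALUES
            ('alpha', '2024-01-01', 12),
            ('alpha', '2024-01-02', 7),
            ('bravo', '2024-01-01', 4);
        """
    )

    # SQL does the aggregation; pandas receives the tidy result for analysis.
    df = pd.read_sql_query(
        "SELECT sensor, SUM(hits) AS total_hits FROM collections GROUP BY sensor",
        conn,
    )
    print(df)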
    $64k-83k yearly est. 4d ago
  • Data Architect

    Mindlance 4.6company rating

    Data engineer job in Washington, DC

    Job Title: Developer Premium I
    Duration: 7 months with long-term extension
    Hybrid Onsite: 4 days per week from Day 1, with a full transition to 100% onsite anticipated soon
    Job Requirements:
    • Strong expertise in data architecture and data model design
    • MS Azure (core experience)
    • Experience with SAP ECC preferred
    • SAFe Agile certification is a plus
    • Ability to work flexibly, including off hours, to support critical IT tasks and migration activities
    Educational Qualifications and Experience:
    • Bachelor's degree in Computer Science, Information Systems, or a related area of expertise.
    • Required number of years of proven experience in the specific technology/toolset as per the Experience Matrix below for each level.
    Essential Job Functions:
    • Take functional specs and produce high-quality technical specs
    • Take technical specs and produce complete and well-tested programs which meet user satisfaction and acceptance, and precisely reflect the requirements - business logic, performance, and usability requirements
    • Conduct/attend requirements definition meetings with end users and document system/business requirements
    • Conduct peer review of code and test cases prepared by other team members to assess quality and compliance with coding standards
    • As required for the role, perform end-user demos of the proposed solution and finished product, provide end-user training, and provide support for user acceptance testing
    • As required for the role, troubleshoot production support issues and find appropriate solutions within the defined SLA to ensure minimal disruption to business operations
    • Ensure that Bank policies, procedures, and standards are factored into project design and development
    • As required for the role, install new releases and participate in upgrade activities
    • As required for the role, perform integration between systems that are on prem, on the cloud, and with third-party vendors
    • As required for the role, collaborate with different teams within the organization for infrastructure, integration, and database administration support
    • Adhere to project schedules and report progress regularly
    • Prepare weekly status reports, participate in status meetings, and highlight issues and constraints that would impact timely delivery of work program items
    • Find the appropriate tools to implement the project
    • Maintain knowledge of current industry standards and practices
    • As needed, interact and collaborate with Enterprise Architects (EA) and the Office of Information Security (OIS) to obtain approvals and accreditations
    "Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans."
    $93k-124k yearly est. 22h ago
  • Senior Data Engineer

    Zillion Technologies, Inc. 3.9company rating

    Data engineer job in McLean, VA

    The candidate must have 5+ years of hands-on experience working with PySpark/Python, microservices architecture, AWS EKS, SQL, Postgres, DB2, Snowflake, Behave or Cucumber frameworks, Pytest (unit testing), automation testing, and regression testing. Experience with tools such as Jenkins, SonarQube, and/or Fortify is preferred for this role. Experience with Angular and DevOps is a nice-to-have for this role.
    Must-Have Qualifications: PySpark/Python-based microservices, AWS EKS, Postgres SQL database, Behave/Cucumber for automation, Pytest, Snowflake, Jenkins, SonarQube, and Fortify.
    Responsibilities:
    • Development of microservices based on Python, PySpark, AWS EKS, and AWS Postgres for a data-oriented modernization project. New system: Python and PySpark, AWS Postgres DB, Behave/Cucumber for automation, and Pytest.
    • Perform system, functional, and data analysis on the current system and create technical/functional requirement documents. Current system: Informatica, SAS, AutoSys, DB2.
    • Write automated tests using Behave/Cucumber, based on the new microservices-based architecture (see the Pytest sketch after this posting).
    • Promote top code quality and solve issues related to performance tuning and scalability.
    • Apply strong skills in DevOps and Docker/container-based deployments to AWS EKS using Jenkins, plus experience with SonarQube and Fortify.
    • Communicate and engage with business teams, analyze current business requirements (BRS documents), and create the necessary data mappings.
    • Preferred: strong skills and experience in reporting applications development and data analysis.
    • Knowledge of Agile methodologies and technical documentation.
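    Given the posting's emphasis on Pytest alongside PySpark, here is a hedged sketch of a unit test for a small DataFrame transform; the function under test and its columns are invented for the example.

    import pytest
    from pyspark.sql import SparkSession, functions as F

    def add_total_column(df):
        # Toy transform under test: total = price * quantity.
        return df.withColumn("total", F.col("price") * F.col("quantity"))

    @pytest.fixture(scope="session")
    def spark():
        # A local single-threaded session keeps the test self-contained.
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def test_add_total_column(spark):
        df = spark.createDataFrame([(2.0, 3), (5.0, 1)], ["price", "quantity"])
        result = add_total_column(df).collect()
        assert [row["total"] for row in result] == [6.0, 5.0]

    The same transform could also be exercised end-to-end from a Behave feature file; the unit layer above is simply the fastest feedback loop in that stack.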
    $77k-109k yearly est. 4d ago
  • Lead Principal Data Solutions Architect

    Inadev

    Data engineer job in Reston, VA

    *****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZENS*****
    *****TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA*****
    Formed in 2011, Inadev is focused on its founding principle: to build innovative customer-centric solutions incredibly fast, secure, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations comprise codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation.
    POSITION DESCRIPTION: Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be on natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
    PROGRAM DESCRIPTION: This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards.
    RESPONSIBILITIES:
    • Lead system architecture decisions, ensure technical alignment across teams, and advocate for best practices in cloud and data engineering.
    • Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation.
    • Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards.
    • Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
    • Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
    • Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
    • Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
    • Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
    NON-TECHNICAL REQUIREMENTS:
    • Must be a U.S. Citizen.
    • Must be willing to work a HYBRID schedule (2-3 days) in Reston, VA and at client locations in the Northern Virginia/DC/MD area as required.
    • Ability to pass a 7-year background check and obtain/maintain a U.S. Government clearance.
    • Strong communication and presentation skills.
    • Must be able to prioritize and self-start.
    • Must be adaptable/flexible as priorities shift.
    • Must be enthusiastic and have a passion for learning and constant improvement.
    • Must be open to collaboration, feedback, and client asks.
    • Must enjoy working with a vibrant team of outgoing personalities.
    MANDATORY REQUIREMENTS/SKILLS:
    • Bachelor of Science degree in Computer Science, Engineering, or a related subject, and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture.
    • Experience within the Federal Government, specifically DHS, is preferred.
    • Must possess demonstrable experience with the Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
    • Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), the Unity Catalog governance framework, and enterprise-level integration patterns using Databricks Workflows and Auto Loader.
    • Knowledge of, and ability to organize, technical execution of Agile analytics using Databricks Repos, Jobs, and collaborative notebooks, proven by professional experience.
    • Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping).
    • Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
    • Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management (a minimal DLT sketch follows this posting).
    • Certification in one or more of: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications.
    DESIRED REQUIREMENTS/SKILLS:
    • Expertise in ETL tools.
    • Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus).
    • Proficiency in SQL, PL/SQL, and performance tuning for large datasets.
    • Understanding of security frameworks and compliance standards in federal environments.
    PHYSICAL DEMANDS: Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
    Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.
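    Since the mandatory skills above lean heavily on Delta Live Tables and the medallion pattern, here is a minimal, hedged DLT sketch of a Bronze-to-Silver step. The landing path, columns, and expectation rule are hypothetical; the dlt module and the global spark session are supplied by the Databricks pipeline runtime.

    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Bronze: raw order files ingested incrementally via Auto Loader")
    def bronze_orders():
        # spark is injected by the DLT runtime; cloudFiles is Auto Loader.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/orders")  # hypothetical landing zone
        )

    @dlt.table(comment="Silver: typed, de-duplicated orders")
    @dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative data-quality rule
    def silver_orders():
        return (
            dlt.read_stream("bronze_orders")
            .withColumn("amount", F.col("amount").cast("double"))
            .dropDuplicates(["order_id"])
        )

    A Gold layer would follow the same shape, aggregating silver_orders into business-level tables; the expectation decorator is what makes data quality declarative rather than hand-rolled.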
    $84k-115k yearly est. 2d ago
  • Google Cloud Data Engineer

    Guidehouse 3.7company rating

    Data engineer job in McLean, VA

    Job Family: Data Science Consulting
    Travel Required: Up to 10%
    Clearance Required: Ability to Obtain Public Trust
    What You Will Do: Guidehouse is seeking an experienced Data Engineer to join our Technology AI and Data practice within the Defense & Security segment. This individual will have a strong data engineering background and be a hands-on technical contributor, responsible for designing, implementing, and maintaining scalable, cloud-native data pipelines which power interactive dashboards that enable federal clients to achieve mission outcomes, operational efficiency, and digital transformation. This is an exciting opportunity for someone who thrives at the intersection of data engineering, Google Cloud technologies, and public sector modernization. The Data Engineer will collaborate with cross-functional teams and client stakeholders to modernize legacy environments, implement scalable BigQuery-centric data pipelines using Dataform and Python, and support advanced analytics initiatives for our federal client within the insurance space.
    Client Leadership & Delivery:
    • Collaborate with government clients to understand enterprise data architecture, ingestion, transformation, and reporting requirements within a Google Cloud Platform (GCP) environment.
    • Communicate technical designs, tradeoffs, and delivery timelines clearly to both technical and non-technical audiences.
    • Lead the development of extract-transform-load (ETL) and extract-load-transform (ELT) pipelines using Cloud Composer (GCP-hosted Airflow), Dataform, and BigQuery to support our analytical data warehouse powering downstream Looker dashboards (a minimal Composer DAG sketch follows this posting).
    • Adhere to high-quality delivery standards and promote measurable outcomes across data migration and visualization efforts.
    Solution Development & Innovation:
    • Design, develop, and maintain scalable ETL/ELT pipelines using SQL (BigQuery), Dataform (SQLX), Cloud Storage, and Python (Cloud Composer/Airflow, Cloud Functions).
    • Apply modern ELT/ETL and analytics engineering practices using BigQuery and Dataform to enable version-controlled, testable, and maintainable data transformations.
    • Leverage tools such as GitLab and GitHub to manage version control, merge requests, and promotion pipelines.
    • Optimize data pipelines and warehouse performance for large-scale analytical workloads, including partitioning, clustering, incremental processing, and cost optimization to enable downstream BI utilizing Looker.
    • Validate compliance with federal data governance, security, and performance standards.
    • Design and document enterprise data models, metadata strategies, data lineage frameworks, and other relevant documentation, as needed.
    • Align data from multiple discrete datasets into a cohesive, interoperable architecture, identifying opportunities for linkages between datasets, normalization, field standardization, etc.
    • Assist with cleanup of existing data and models, including use of ETL.
    Practice & Team Leadership:
    • Work closely with data architects, data scientists, data analysts, and cloud engineers to deliver integrated solutions.
    • Collaborate across Scaled Agile Framework (SAFe) teams and participate in Agile ceremonies including standups, retrospectives, and Program Increment (PI) planning.
    • Manage tasks and consistently document progress and outcomes using Confluence and Jira.
    • Support documentation, testing, and deployment of data products.
    • Mentor junior team members and contribute to reusable frameworks and accelerators.
    • Contribute to thought leadership, business development, and best practice development across the AI & Data team.
    What You Will Need:
    • US Citizenship and the ability to obtain and maintain a federal Public Trust clearance. Individuals with an active Public Trust clearance are preferred.
    • Bachelor's degree in computer science, engineering, mathematics, statistics, or a related technical field.
    • Minimum three (3) years of experience in data engineering within cloud environments.
    • Strong proficiency in SQL for data modeling and data quality tests, and Python for pipeline design.
    • Comfortable with the command line in Linux for git, deploying code to the cloud, and interacting with cloud files.
    • Experience with orchestration tools including but not limited to Cloud Composer (Airflow), Luigi, Prefect, Dagster, etc.
    • Experience with modern analytics engineering toolsets (dbt, Dataform, Databricks, etc.) and familiarity with the best practices and methodologies behind the tools (data lineage, dependency graphs, tags, data quality tests, etc.).
    • Proven experience with business intelligence tools and cloud platforms (AWS, Azure, GCP).
    • Familiarity with CI/CD practices, principles, and tools such as GitLab, separation of environments, and idempotency as it relates to data pipelines.
    • Experience building ETL/ELT pipelines and integrating data sources into reporting platforms.
    • Familiarity with data governance, metadata, and compliance frameworks.
    • Excellent communication, facilitation, and stakeholder engagement skills.
    What Would Be Nice To Have:
    • Master's degree in computer science, engineering, mathematics, statistics, or a related technical field.
    • Experience in data engineering within cloud environments.
    • Familiarity with machine learning (ML) and optimal data configurations to support model workloads.
    • Familiarity with Agile project management methodologies and Atlassian toolsets (Confluence, Jira).
    • Understanding of data quality and data pipeline test procedures and best practices.
    • Experience working within matrixed teams of data engineers, BI developers, and QA engineers.
    • Experience implementing cloud data governance and data management tools such as Dataplex.
    • Experience with serverless architectures, including cloud functions.
    • Experience with JavaScript to support development in Dataform.
    • Experience using LLM-based coding assistants such as Gemini Code Assist to automate and streamline software development tasks.
    • Relevant GCP certifications such as GCP Professional Data Engineer, GCP Professional Cloud Architect, and GCP Professional ML Engineer.
    • Experience working with public sector clients.
    • Familiarity with federal contracting and procurement processes.
    What We Offer: Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
    Benefits include:
    • Medical, Rx, Dental & Vision Insurance
    • Personal and Family Sick Time & Company Paid Holidays
    • Position may be eligible for a discretionary variable incentive bonus
    • Parental Leave and Adoption Assistance
    • 401(k) Retirement Plan
    • Basic Life & Supplemental Life
    • Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
    • Short-Term & Long-Term Disability
    • Student Loan PayDown
    • Tuition Reimbursement, Personal Development & Learning Opportunities
    • Skills Development & Certifications
    • Employee Referral Program
    • Corporate Sponsored Events & Community Outreach
    • Emergency Back-Up Childcare Program
    • Mobility Stipend
    About Guidehouse: Guidehouse is an Equal Opportunity Employer - Protected Veterans, Individuals with Disabilities, or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinance of Los Angeles and San Francisco. If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains, including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse, and Guidehouse will not be obligated to pay a placement fee.
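    The orchestration stack named above (Cloud Composer, BigQuery, Dataform) can be hard to picture in the abstract, so here is a hedged sketch of a minimal Composer/Airflow DAG that materializes a reporting table in BigQuery. The DAG id, datasets (raw.claims, analytics.daily_claims), and schedule are invented for the example, not taken from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_claims_elt",       # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",            # daily at 06:00
        catchup=False,
    ) as dag:
        # One task: run a BigQuery job that rebuilds the reporting table.
        build_reporting_table = BigQueryInsertJobOperator(
            task_id="build_reporting_table",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE analytics.daily_claims AS
                        SELECT claim_date, COUNT(*) AS claim_count
                        FROM raw.claims
                        GROUP BY claim_date
                    """,
                    "useLegacySql": False,
                }
            },
        )

    In a Dataform-centric setup like the one described, the SQL would typically live in version-controlled SQLX files, with Composer triggering Dataform compilation runs rather than embedding queries directly.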
    $73k-97k yearly est. 1d ago
  • Cloud Data Architect

    Infocepts 3.7company rating

    Data engineer job in McLean, VA

    Purpose: As a Cloud Data Architect, you'll be at the forefront of innovation - guiding clients and teams through the design and implementation of cutting-edge solutions using Databricks, modern data platforms, and cloud-native technologies. In this role, you won't just architect solutions - you'll help grow a thriving Analytics & Data Management practice, act as a trusted Databricks SME, and bring a business-first mindset to every challenge. You'll have the opportunity to lead delivery efforts, build transformative data solutions, and cultivate strategic relationships with Fortune 500 organizations.
    Key Result Areas and Activities:
    • Architect and deliver scalable, cloud-native data solutions across various industries.
    • Lead data strategy workshops and AI/ML readiness assessments.
    • Develop solution blueprints leveraging Databricks (Lakehouse, Delta Lake, MLflow, Unity Catalog); an MLflow logging sketch follows this posting.
    • Conduct architecture reviews and build proof-of-concept (PoC) prototypes on platforms like Databricks, AWS, Azure, and Snowflake.
    • Engage with stakeholders to define and align future-state data strategies with business outcomes.
    • Mentor and lead data engineering and architecture teams.
    • Drive innovation and thought leadership across client engagements and internal practice areas.
    • Promote FinOps practices, ensuring cost optimization within multi-cloud deployments.
    • Support client relationship management and engagement expansion through consulting excellence.
    Essential Skills:
    • 10+ years of experience designing and delivering scalable data architecture and solutions.
    • 5+ years in consulting, with demonstrated client-facing leadership.
    • Expertise in the Databricks ecosystem, including Delta Lake, Lakehouse, Unity Catalog, and MLflow.
    • Strong hands-on knowledge of cloud platforms (Azure, AWS, Databricks, and Snowflake).
    • Proficiency in Spark and Python for data engineering and processing tasks.
    • Solid grasp of enterprise data architecture frameworks such as TOGAF and DAMA.
    • Demonstrated ability to lead and mentor teams, manage multiple projects, and drive delivery excellence.
    • Excellent communication skills with proven ability to consult and influence executive stakeholders.
    Desirable Skills:
    • Recognized thought leadership in emerging data and AI technologies.
    • Experience with FinOps in multi-cloud environments, particularly with Databricks and AWS cost optimization.
    • Familiarity with data governance and data quality best practices at the enterprise level.
    • Knowledge of DevOps and MLOps pipelines in cloud environments.
    Qualifications:
    • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or related fields.
    • Professional certifications in Databricks, AWS, Azure, or Snowflake preferred.
    • TOGAF, DAMA, or other architecture framework certifications are a plus.
    Qualities:
    • Self-motivated and focused on delivering outcomes for a fast-growing team and firm
    • Able to communicate and consult persuasively through speaking, writing, and client presentations
    • Able to work in a self-organized and cross-functional team
    • Able to iterate based on new information, peer reviews, and feedback
    • Able to work with teams and clients in different time zones
    • Research-focused mindset
    "Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
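    Because the Databricks components named above include MLflow, here is a hedged sketch of the experiment-tracking workflow it supports. The toy dataset, model, and run name are invented for the example; on Databricks the tracking URI is preconfigured, while elsewhere it would need to be set explicitly.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Toy data standing in for a real feature table.
    X, y = make_classification(n_samples=200, random_state=0)

    with mlflow.start_run(run_name="poc-classifier"):
        model = LogisticRegression(max_iter=200).fit(X, y)
        mlflow.log_param("max_iter", 200)                       # record hyperparameters
        mlflow.log_metric("train_accuracy", model.score(X, y))  # record metrics
        mlflow.sklearn.log_model(model, "model")                # persist the model artifact

    Runs logged this way are comparable side by side in the MLflow UI, which is the piece that makes PoC-to-production model governance tractable at the architecture level.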
    $87k-121k yearly est. 22h ago

Learn more about data engineer jobs

How much does a data engineer earn in Fairland, MD?

The average data engineer in Fairland, MD earns between $70,000 and $127,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Fairland, MD

$95,000

What are the biggest employers of Data Engineers in Fairland, MD?
