Senior GEOINT Analyst
Data analyst job in Springfield, VA
MANTECH seeks a motivated, career- and customer-oriented Senior GEOINT Analyst to join our team in Springfield, VA.
Job duties include, but are not limited to:
Conduct GEOINT analysis on national security issues using imagery, geospatial data, and multi-INT sources to identify trends, events, and relationships.
Integrate and coordinate intelligence across agencies, mission partners, and regional/functional offices to support policy makers, the IC, DoD, NGA, ASG, and allied organizations.
Research and apply structured observation management (SOM) and activity-based intelligence (ABI) techniques to enhance mission-specific analysis.
Produce accurate, timely, and relevant GEOINT products, including reports, database remarks, baseline descriptions, graphics, maps, infographics, and briefings.
Extract, acquire, and manage geospatial information (e.g., shapefiles, geo-databases) for visualization, modeling, and intelligence analysis.
Communicate findings effectively in written, visual, and oral formats tailored to mission requirements.
Prioritize and manage multiple tasks in dynamic environments while ensuring high analytical accuracy and relevance.
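For illustration of the geospatial data handling described in these duties, here is a minimal Python sketch using GeoPandas; the file paths, attribute names, and bounding box are hypothetical and not tied to any actual mission system.

    # Illustrative only: paths, attributes, and coordinates are hypothetical.
    import geopandas as gpd

    # Load a shapefile of features into a GeoDataFrame.
    facilities = gpd.read_file("facilities.shp")

    # Reproject to WGS84 (EPSG:4326) so layers overlay consistently.
    facilities = facilities.to_crs(epsg=4326)

    # Clip to a notional area of responsibility with a bounding box
    # (longitude range first, then latitude range).
    aor = facilities.cx[44.0:48.0, 33.0:37.0]

    # Export for use in mapping, modeling, and briefing graphics.
    aor.to_file("aor_facilities.geojson", driver="GeoJSON")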
Minimum Requirements:
HS Diploma and 10+ years of GEOINT experience, OR Associate's degree and 8+ years of experience, OR Bachelor's degree and 6+ years of experience, OR Master's degree and 6 years of experience
Experience with commercial and NTM imagery sources, search missions with softcopy tools, IEC exploitation workstations, and military infrastructure/order of battle analysis.
Experience conducting historical imagery research.
Experience and proficiency with MS Word, PowerPoint, database entry, and graphic design principles.
Desired Qualifications
Regional expertise in the AOR; expertise in photogrammetry, remote sensing, or image processing.
Strong knowledge of the intelligence collection process to include NGA's relationships with other IC Agencies.
Proficiency with MS Word, PowerPoint, database entry, and graphic design principles.
Security Clearance Requirements:
Active TS/SCI with the ability to obtain and maintain a polygraph
Physical Requirements:
Must be able to remain in a stationary position 50% of the time.
Must be able to communicate, converse, and exchange information with peers and senior personnel.
The person in this position needs to occasionally move about inside the office to access file cabinets, office machinery, etc.
Data Analyst
Data analyst job in Bethesda, MD
Job Family:
Data Science Consulting
Travel Required:
None
Clearance Required:
Active Top Secret SCI with Polygraph
What You Will Do: Work with a senior leader to apply data analytics principles to transform raw data into actionable insights, incorporating emerging trends and available initiatives, to inform financial management and budgetary strategy for a Federal C-suite client. Deliver innovative processes to integrate disparate data utilizing tools such as Tableau, Microsoft BI, and/or Qlik.
What You Will Need:
An ACTIVE and MAINTAINED TS/SCI Federal or DoD Security Clearance with a COUNTERINTELLIGENCE (CI) polygraph
Bachelor's degree in Data Science, Computer Science, Management Information Systems, Systems Engineering, Information Technology, or relevant degree program
Minimum of FIVE (5) years of experience in information technology, systems, and/or data analytics in the Federal government
Experience in SQL and Python
What Would Be Nice To Have:
Prior experience with cloud-based applications and data sources
Experience with data visualization tools such as Tableau, Microsoft BI, and/or Qlik
The annual salary range for this position is $113,000.00-$188,000.00. Compensation decisions depend on a wide range of factors, including but not limited to skill sets, experience and training, security clearances, licensure and certifications, and other business and organizational needs.
What We Offer:
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
Benefits include:
Medical, Rx, Dental & Vision Insurance
Personal and Family Sick Time & Company Paid Holidays
Position may be eligible for a discretionary variable incentive bonus
Parental Leave and Adoption Assistance
401(k) Retirement Plan
Basic Life & Supplemental Life
Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
Short-Term & Long-Term Disability
Student Loan PayDown
Tuition Reimbursement, Personal Development & Learning Opportunities
Skills Development & Certifications
Employee Referral Program
Corporate Sponsored Events & Community Outreach
Emergency Back-Up Childcare Program
Mobility Stipend
About Guidehouse
Guidehouse is an Equal Opportunity Employer-Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation.
Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco.
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.
All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process.
If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties.
Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
Business Data Analyst (Mortgage)
Data analyst job in Reston, VA
The Business Analyst is responsible for leading the functional requirements gathering team. The candidate works directly with internal customers to understand the business environment and needs, identifies relevant design, process, and specification issues, and then mentors/assists lower-level Business Analysts in documenting and translating these business requirements. The candidate may be required to manage business and/or system issues during the project life cycle as well as post-implementation.
Skills:
1) Expertise with the Software Development Lifecycle (SDLC)
2) Strong oral and written communication skills
3) In-depth knowledge of client-server, object-oriented, and web-based systems, applications, environments, and relevant tools/technology
4) Prior management experience
5) Strong analytical skills; the ability to identify and evaluate several alternative solutions and help the team arrive at the best functional requirement set to meet the business need
6) Knowledge of requirements tools such as Rational RequisitePro desired
Education/Work Experience:
Bachelor's degree or equivalent; 10+ years of software development experience, including projects of similar scope and complexity.
Data Analyst with TS/SCI (required)
Data analyst job in Washington, DC
VETS, Inc., is looking to add an experienced Data Analyst to our growing team. This is a permanent, full time position with full benefits working onsite at the Pentagon. This position requires an active/current DoD TS-SCI clearance. The successful candidate will:
Generate and maintain information on intelligence workforce
Work with other data analysts to visualize data derived from DoD and IC databases, showcasing baseline and trend analysis as well as comparative analysis of data.
Conduct data analyses and predictive modeling of human capital program performance
Conduct annual analysis of performance management results scores to determine ratings distributions by component, occupation, work level, GG grade, and demographic group using EEOC reporting categories.
Conduct annual analysis of performance-based payouts of base pay and bonuses
REQUIRED SKILLS
Four-year undergraduate degree from a nationally accredited university
2+ years of experience in database and business intelligence reporting tools.
2+ years of subject matter expertise experience working with intelligence information systems or architecture supporting the combatant commands, services, or the Intelligence Community.
2+ years of experience data-mining in databases such as CMIS, DCPDS, ADVANA, and other similar tools, and visualizing analysis discovered through data mining
1+ years of experience visualizing data in databases
1+ years of experience with visualization tools such as Tableau and Qlik
1+ years of experience with coding, programming, and scripting languages such as SQL, Java, Python, and R
1+ years of experience with Microsoft Excel: pivot tables, graphs, charts, and formulas
1+ years of experience in basic statistics, with demonstrated experience in inferential statistics
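As a quick illustration of the ratings-distribution analysis described in this posting, here is a minimal pandas sketch; the input file and column names are hypothetical.

    # Illustrative only: the file and column names are hypothetical.
    import pandas as pd

    ratings = pd.read_csv("performance_ratings.csv")

    # Ratings distribution by component and work level, as row-wise shares.
    dist = pd.crosstab(
        index=[ratings["component"], ratings["work_level"]],
        columns=ratings["rating"],
        normalize="index",  # convert counts to proportions per group
    )
    print(dist)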
Business Analyst - WFM
Data analyst job in Washington, DC
WFM Scheduling Lead - Dayforce
Permanent
$90,000-$130,000 + package
Remote (US based)
A large, global organisation operating in a complex, 24/7, high-volume environment is embarking on a major workforce transformation programme. As part of this, we're recruiting a Senior WFM Scheduling Lead to take ownership of multi-site rostering and drive the development of a consistent, scalable scheduling model across multiple regions.
You will work closely with operational leaders, planners and HR teams to understand local rules, working patterns, contract structures and skills requirements, then translate these into structured, system-ready scheduling logic. This role suits someone who thrives in demanding environments, enjoys solving complex rostering challenges and is confident influencing stakeholders at all levels.
Key Responsibilities
Lead the scheduling and rostering workstream across multiple UK and US sites, ensuring local rules and operational practices are accurately reflected in the WFM system.
Analyse, map and standardise scheduling processes, working with planners and station managers to move towards a unified, scalable model.
Shape and validate Dayforce scheduling configuration, testing and optimisation, supporting future adoption of more advanced functionality.
Experience Required
Strong hands-on background in workforce management, rostering, or scheduling within large, multi-site, shift-based environments.
Dayforce WFM scheduling experience with the ability to translate complex operational rules into system logic.
Confident stakeholder management style; comfortable influencing operational leaders and navigating demanding, change-heavy environments.
Data Scientist
Data analyst job in Washington, DC
Duration: 6 months, with possible extension
AI Engineer
1. Background and Context
The AI Engineer will play a pivotal role in designing, developing, and deploying artificial intelligence solutions that enhance operational efficiency, automate decision-making, and support strategic initiatives for the environmental and social specialists in the Bank.
This role is central to the VPU's digital transformation efforts and will contribute to the development of scalable, ethical, and innovative AI systems.
2. Qualifications and Experience
Education
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or related field.
Experience
Minimum 3 years of experience in AI/ML model development and deployment.
Experience with MLOps tools (e.g., MLflow), Docker, and cloud platforms (AWS, Azure, GCP).
Proven track record in implementing LLMs, RAG, NLP model development and GenAI solutions.
Technical Skills
Skilled in Azure AI / Google Vertex Search, vector databases, RAG fine-tuning, NLP model development, and API management (facilitating access to different sources of data)
Proficiency in Python, TensorFlow, PyTorch, and NLP frameworks.
Expertise in deep learning, computer vision, and large language models.
Familiarity with REST APIs, NoSQL, and RDBMS.
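As a toy illustration of the RAG pattern named in these requirements, the sketch below uses TF-IDF retrieval in place of a vector database and leaves generation as a placeholder for an LLM call; the documents and query are invented.

    # Toy RAG sketch: TF-IDF stands in for a vector database, and the
    # generation step is a placeholder for a hosted LLM call.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    documents = [
        "Environmental assessment guidelines for infrastructure projects.",
        "Social safeguard policies for resettlement planning.",
        "Procedures for stakeholder consultation and disclosure.",
    ]

    vectorizer = TfidfVectorizer()
    doc_matrix = vectorizer.fit_transform(documents)

    def retrieve(query, k=2):
        """Return the k documents most similar to the query."""
        scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
        return [documents[i] for i in scores.argsort()[::-1][:k]]

    question = "How is resettlement handled?"
    context = "\n".join(retrieve(question))
    prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
    # prompt would then be sent to an LLM endpoint (e.g., Azure OpenAI).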
Soft Skills
Strong analytical and problem-solving abilities.
Excellent communication and teamwork skills.
Strategic thinking and innovation mindset.
3. Certifications (Preferred)
Microsoft Certified: Azure AI Engineer Associate
Google Machine Learning Engineer
SAFe Agile Software Engineer (ASE)
Certification in AI Ethics
4. Objectives of the Assignment
Develop and implement AI models and algorithms tailored to business needs.
Integrate AI solutions into existing systems and workflows.
Ensure ethical compliance and data privacy in all AI initiatives.
Support user adoption through training and documentation.
Support existing AI solutions through refinement, troubleshooting, and reconfiguration.
5. Scope of Work and Responsibilities
AI Solution Development
Collaborate with cross-functional teams to identify AI opportunities.
Train, validate, and optimize machine learning models.
Translate business requirements to technical specifications.
AI Solution Implementation
Develop code, deploy AI models into production environments, and conduct ongoing model training.
Monitor performance, troubleshoot issues, and fine-tune solutions to improve accuracy.
Ensure compliance with ethical standards and data governance policies.
User Training and Adoption
Conduct training sessions for stakeholders on AI tools.
Develop user guides and technical documentation.
Data Analysis and Research
Collect, preprocess, and engineer large datasets for machine learning and AI applications.
Recommend and implement data cleaning and preparation processes.
Analyze and use structured and unstructured data (including geospatial data) to extract features and actionable insights.
Monitor data quality, detect bias, and manage model/data drift in production environments.
Research emerging AI technologies and recommend improvements.
Governance, Strategy, Support, and Maintenance
Advise client staff on AI strategy and policy implications.
Contribute to the team's AI roadmap and innovation agenda.
Provide continuous support and contribute towards maintenance and future enhancements.
6. Deliverables
Work on proofs of concept to study the technical feasibility of AI use cases.
Functional AI applications integrated into business systems.
Documentation of model/application architecture, training data, and performance metrics.
Training materials and user guides.
Develop, train, and deploy AI models tailored to business needs.
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Name: Pooja Rani
Email: ******************************
Internal Id: 25-53638
Azure Data Modeler
Data analyst job in Washington, DC
Azure Data Modeler - Budget Transformation Project
Our client is embarking on a major budget transformation initiative and is seeking an experienced Azure Data Modeler to support data architecture, modeling, and migration activities. This role will play a critical part in designing and optimizing data structures as the organization transitions to SAP. Experience with SAP is preferred, but strong ERP data experience in any platform is also valuable.
Responsibilities
Design, develop, and optimize data models within the Microsoft Azure environment.
Support data architecture needs across the budget transformation program.
Partner with cross-functional stakeholders to enable the transition to SAP (or other ERP systems).
Participate in data migration planning, execution, and validation efforts.
Work collaboratively within SAFe Agile teams and support sprint activities.
Provide off-hours support as needed for critical tasks and migration windows.
Engage onsite in Washington, DC up to three days per week.
Required Qualifications
Strong hands-on expertise in data architecture and data model design.
Proven experience working with Microsoft Azure (core requirement).
Ability to work flexibly, including occasional off-hours support.
Ability to be onsite in Washington, DC as needed (up to 3 days/week).
Preferred Qualifications
Experience with SAP ECC or exposure to SAP implementations.
Experience with other major ERP systems (Oracle, Workday, etc.).
SAFe Agile certification.
Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support.
Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Business Analyst/Scrum Master (federal Government Project)
Data analyst job in Washington, DC
We are seeking an analytical and detail-oriented Business Analyst / Scrum Master to support a federal government client in advancing AI and automation initiatives. This position requires a strong ability to gather and translate requirements from diverse stakeholders, ranging from technical teams to government leadership, into actionable deliverables and technical solutions. The ideal candidate will have a strong background in business analysis, Agile project management, and technical solution quality and delivery, and will ensure alignment between business needs and system capabilities.
Key Responsibilities:
Lead requirement gathering sessions with stakeholders to capture operational needs, AI and automation opportunities, and workflow improvement areas.
Analyze current-state processes to identify inefficiencies, redundancies, or areas suitable for AI and automation.
Recommend AI and automation strategies that reduce manual effort and improve consistency across government systems.
Design and implement AI and automation workflows informed by operational requirements, business rules, and known system capabilities.
Collaborate with senior developers, project managers, testers and client stakeholders to ensure requirements are fully understood and implemented.
Assess post-implementation data and user feedback to recommend enhancements to business workflows.
Lead Agile team activities including backlog refinement, sprint planning, and user story refinement.
Maintain clear and organized documentation using tools like Confluence; manage task tracking through JIRA.
Maintain clear and concise documentation for automations, provide regular and transparent updates on task progress.
Required Qualifications:
Bachelor's degree.
1-3 years of experience as a Business Analyst, Project Manager, or Product Owner, ideally within federal government or consulting environments.
Exceptional attention to detail and ability to break down complex processes clearly and logically.
Strong interpersonal and communication skills, with the ability to tailor messaging for both technical and non-technical audiences.
Experience in a project-based technical delivery environment.
Nice to Have:
Experience with DHS or USCIS.
Experience supporting Agile teams using tools such as JIRA and Confluence.
Experience with UiPath, AWS, and generative AI solutions.
Data Scientist
Data analyst job in Columbia, MD
Data Scientist - Transit Data Focus | Columbia, MD (on-site/hybrid) | Contract (6 months)
Data Scientist - Transit Data Focus
Employment type: Contract
Duration: 6 Months
Justification: To manage and analyze customer databases, AVA (automated voice announcement), and schedule data for predictive maintenance and service planning.
Experience Level: 3-5 years
Job Responsibilities:
Collect, process, and analyze transit-related datasets including customer databases, AVA (automated voice announcement) logs, real-time vehicle data, and schedule data.
Develop predictive models and data-driven insights to support maintenance forecasting, service planning, and operational optimization.
Design and implement data pipelines to integrate, clean, and transform large, heterogeneous transit data sources.
Perform statistical analysis and machine learning to identify patterns, trends, and anomalies relevant to transit service performance and reliability.
Collaborate with transit planners, maintenance teams, and IT staff to translate data insights into actionable business strategies.
Monitor data quality and integrity; implement data validation and cleansing processes.
Technical Skills & Qualifications:
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Transportation Engineering, or a related quantitative field.
3-5 years of experience working as a data scientist or data analyst, preferably in a transit, transportation, or public sector environment.
Strong proficiency in Python or R for data analysis, statistical modeling, and machine learning.
Experience with SQL for database querying, manipulation, and data extraction.
Familiarity with transit data standards such as GTFS, AVL/CAD, APC (Automated Passenger Counters), and AVA systems.
Experience with data visualization tools such as Power BI, or equivalent.
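To make the GTFS familiarity above concrete, here is a small pandas sketch computing scheduled headways at a single stop; the feed path and stop ID are hypothetical.

    # Illustrative only: assumes a standard GTFS stop_times.txt; IDs are made up.
    import pandas as pd

    stop_times = pd.read_csv("gtfs/stop_times.txt")

    # Scheduled departures at one stop, in time order. GTFS allows times past
    # 24:00:00, which pandas handles as timedeltas.
    one_stop = stop_times[stop_times["stop_id"] == "STOP_1234"].copy()
    one_stop["departure"] = pd.to_timedelta(one_stop["departure_time"])
    one_stop = one_stop.sort_values("departure")

    # Headway = gap between consecutive scheduled departures, in minutes.
    one_stop["headway_min"] = one_stop["departure"].diff().dt.total_seconds() / 60
    print(one_stop["headway_min"].describe())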
Business Analyst
Data analyst job in Reston, VA
GeBBS Consulting is a healthcare technology professional services and consulting firm based in Towson, Maryland, serving clients throughout the US. We are celebrating our 28th year in business. We have two divisions: 1) Hospital/Provider Practice and 2) Managed Care/Payer Practice, where we work directly with health plans (very strong in the Blue Cross Blue Shield space). GeBBS Consulting has over two decades of experience supporting healthcare clients throughout the US.
We are currently hiring a Business Analyst.
Summary: One of our clients in the Blue Cross Blue Shield space is looking for a Business Analyst to support enhancement and maintenance of a membership portal. The hired employee will be responsible for working with Product Owners, collaborating with developers and testers, and writing user stories, all within an Agile environment.
Required Skills:
3-8+ years of experience as a Business Analyst in an enterprise-level environment (multiple teams within an Agile environment) working on a software upgrade/maintenance.
Experience writing user stories
Location: Hybrid, 1-2 times a week in Reston, VA
If you are interested in exploring this career opportunity with GeBBS Consulting, please reply with your current resume and the best time to contact you.
Recruiting Fraud
Over the past year, online recruitment scams have increased in frequency, impacting both applicants and employers. To help protect yourself against potential scammers, please note the following recruitment practices employed by GeBBS Consulting.
GeBBS Consulting uses a single domain name for all recruiting activities, and all authorized GeBBS Consulting recruiters use that domain for all email correspondence. The official GeBBS Consulting domain is gebbsconsulting.com. If you receive emails from, or are directed to, any domain other than gebbsconsulting.com, you are not corresponding with GeBBS Consulting.
GeBBS Consulting screens applicants through a combination of over-the-phone, video and in-person meetings.
GeBBS Consulting will never ask a candidate for payment of any kind as part of the hiring or onboarding process.
Data Architect
Data analyst job in Washington, DC
Job Title: Developer Premium I
Duration: 7 Months with long term extension
Hybrid Onsite: 4 days per week from Day 1, with a full transition to 100% onsite anticipated soon
Job Requirement:
Strong expertise in data architecture and data model design.
MS Azure (core requirement)
Experience with SAP ECC preferred
SAFe Agile certification is a plus
Ability to work flexibly, including off hours, to support critical IT tasks and migration activities.
Educational Qualifications and Experience:
Bachelor's degree in Computer Science, Information Systems or in a related area of expertise.
Required number of years of proven experience in the specific technology/toolset as per Experience Matrix below for each Level.
Essential Job Functions:
Take functional specs and produce high quality technical specs
Take technical specs and produce complete, well-tested programs that meet user satisfaction and acceptance and precisely reflect the requirements: business logic, performance, and usability
Conduct/attend requirements definition meetings with end-users and document system/business requirements
Conduct Peer Review on Code and Test Cases, prepared by other team members, to assess quality and compliance with coding standards
As required for the role, perform end-user demos of proposed solution and finished product, provide end user training and provide support for user acceptance testing
As required for the role, troubleshoot production support issues and find appropriate solutions within defined SLA to ensure minimal disruption to business operations
Ensure that Bank policies, procedures, and standards are factored into project design and development
As required for the role, install new release, and participate in upgrade activities
As required for the role, perform integration between systems that are on prem and also on the cloud and third-party vendors
As required for the role, collaborate with different teams within the organization for infrastructure, integration, database administration support
Adhere to project schedules and report progress regularly
Prepare weekly status reports and participate in status meetings and highlight issues and constraints that would impact timely delivery of work program items
Find the appropriate tools to implement the project
Maintain knowledge of current industry standards and practices
As needed, interact and collaborate with Enterprise Architects (EA), Office of Information Security (OIS) to obtain approvals and accreditations
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
UiPath Business Analyst
Data analyst job in McLean, VA
Only candidates local to McLean, VA are needed.
Skills:
UiPath
Document Understanding
We are seeking a detail-oriented UiPath Business Analyst to join our Automation team. The ideal candidate will be responsible for identifying business process automation opportunities, gathering requirements, and collaborating with stakeholders and RPA developers to design and implement solutions using UiPath. This role acts as a bridge between business units and the technical team, ensuring successful delivery of RPA initiatives that drive operational efficiency.
Key Responsibilities:
Analyze existing business processes to identify automation opportunities.
Work with stakeholders to gather, validate, and document business and functional requirements.
Perform process assessments using UiPath's tools and frameworks (e.g., Process Mining, Task Capture).
Translate business needs into clear and concise process definitions and solution designs.
Collaborate with RPA Developers to design and implement automation solutions using UiPath.
Conduct feasibility analysis and ROI estimations for automation candidates.
Participate in solution testing, UAT (User Acceptance Testing), and post-deployment reviews.
Support change management and user training related to RPA implementations.
Monitor RPA performance and provide suggestions for improvements.
Maintain detailed documentation of business processes, workflows, and RPA solutions.
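As a back-of-the-envelope illustration of the ROI estimation responsibility above, here is a short Python calculation; every figure is hypothetical.

    # Hypothetical first-year ROI for one automation candidate.
    minutes_per_case = 12            # manual handling time per case
    cases_per_year = 50_000
    hourly_cost = 40.0               # fully loaded labor rate, USD
    automation_rate = 0.80           # share of cases the bot can handle
    build_and_license_cost = 120_000.0

    hours_saved = minutes_per_case / 60 * cases_per_year * automation_rate
    annual_savings = hours_saved * hourly_cost
    roi = (annual_savings - build_and_license_cost) / build_and_license_cost

    print(f"Hours saved: {hours_saved:,.0f}")         # 8,000
    print(f"Annual savings: ${annual_savings:,.0f}")  # $320,000
    print(f"First-year ROI: {roi:.0%}")               # 167%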
Qualifications:
Bachelor's degree in Business, Information Technology, Computer Science, or a related field.
3+ years of experience as a Business Analyst, with at least 1-2 years in RPA or process automation.
Strong knowledge of UiPath tools and frameworks (e.g., Studio, Orchestrator, Insights).
Experience with process discovery and assessment methodologies.
UiPath RPA Business Analyst
Data analyst job in McLean, VA
Role: RPA Business Analyst (UiPath RPA)
Type: HYBRID- 2 days onsite
Willingness to relocate is OK.
UiPath Certification is a big plus.
We are seeking a highly experienced RPA Business Process Analyst with deep expertise in UiPath and Document Understanding. Your insights will help drive operational efficiency and scale automation adoption across the organization.
Key Responsibilities
Lead end-to-end process discovery, mapping, and analysis efforts using Document Understanding, Task Mining or equivalent tools to uncover automation opportunities.
Collaborate with business stakeholders (process owners, subject matter experts) to understand current-state workflows, pain points, KPIs, and desired outcomes.
Elicit, document, and validate functional and non-functional requirements in clear artifacts (e.g. process maps, user stories, use cases, business requirement documents, PDDs).
Perform complexity assessment, ROI analysis, and feasibility studies for automation candidates.
Support pilot implementations, user acceptance testing (UAT), go-live, and post-deployment stabilization.
Act as a liaison between business, IT, and RPA teams to manage change, risks, and stakeholder communication.
Required Qualifications & Skills
5+ years of experience in business process analysis, process improvement, or operations roles with exposure to automation programs.
At least 2-3 years of hands-on experience with Document Understanding and UiPath Task Mining / Task Capture, especially leveraging AI in process documentation
Proven track record of leading discovery workshops, mapping complex processes, and identifying scalable automation opportunities.
Solid understanding of business process modeling techniques.
Excellent communication and stakeholder management skills - able to bridge between technical teams and business users.
Junior Data Scientist (TS/SCI)
Data analyst job in Springfield, VA
We are seeking a junior-level Data Science professional with a strong academic foundation and early hands-on experience to join our team as an Exploitation Specialist. The ideal candidate will hold a bachelor's degree in a data science-related field and bring internship or project experience that demonstrates curiosity, initiative, and a willingness to learn from senior team members. This role is a great opportunity for someone eager to grow their technical skill set while supporting a high-impact mission.
Required Qualifications
Active TS/SCI clearance with the willingness to obtain a CI polygraph
Ability to work onsite in Northern Virginia, 40 hours per week (telework options are extremely limited)
Proficiency with Python and SQL
Preferred Qualifications
Familiarity with GEOINT collection and related NGA/NRO systems
Experience with additional programming languages such as R, JavaScript, HTML, and CSS
Understanding of object-oriented programming
Experience using visualization tools such as Grafana, Tableau, or Kibana
Ability to quickly learn new technologies, adapt to evolving mission requirements, and support the development/testing of new analytic methodologies
Cloud Data Architect
Data analyst job in McLean, VA
Purpose:
As a Cloud Data Architect, you'll be at the forefront of innovation, guiding clients and teams through the design and implementation of cutting-edge solutions using Databricks, modern data platforms, and cloud-native technologies. In this role, you won't just architect solutions; you'll help grow a thriving Analytics & Data Management practice, act as a trusted Databricks SME, and bring a business-first mindset to every challenge. You'll have the opportunity to lead delivery efforts, build transformative data solutions, and cultivate strategic relationships with Fortune 500 organizations.
Key Result Areas and Activities:
Architect and deliver scalable, cloud-native data solutions across various industries.
Lead data strategy workshops and AI/ML readiness assessments.
Develop solution blueprints leveraging Databricks (Lakehouse, Delta Lake, MLflow, Unity Catalog).
Conduct architecture reviews and build proof-of-concept (PoC) prototypes on platforms like Databricks, AWS, Azure, and Snowflake.
Engage with stakeholders to define and align future-state data strategies with business outcomes.
Mentor and lead data engineering and architecture teams.
Drive innovation and thought leadership across client engagements and internal practice areas.
Promote FinOps practices, ensuring cost optimization within multi-cloud deployments.
Support client relationship management and engagement expansion through consulting excellence.
Roles & Responsibilities
Essential Skills:
10+ years of experience designing and delivering scalable data architecture and solutions.
5+ years in consulting, with demonstrated client-facing leadership.
Expertise in Databricks ecosystem including Delta Lake, Lakehouse, Unity Catalog, and MLflow.
Strong hands-on knowledge of cloud platforms (Azure, AWS, Databricks, and Snowflake).
Proficiency in Spark and Python for data engineering and processing tasks.
Solid grasp of enterprise data architecture frameworks such as TOGAF and DAMA.
Demonstrated ability to lead and mentor teams, manage multiple projects, and drive delivery excellence.
Excellent communication skills with proven ability to consult and influence executive stakeholders.
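For a concrete glimpse of the MLflow piece of the Databricks ecosystem listed above, here is a minimal experiment-tracking sketch; the experiment name, parameters, and metric values are invented.

    # Minimal MLflow tracking sketch; all names and values are illustrative.
    import mlflow

    mlflow.set_experiment("churn-model-poc")

    with mlflow.start_run(run_name="baseline"):
        mlflow.log_param("model_type", "logistic_regression")
        mlflow.log_param("train_rows", 100_000)
        mlflow.log_metric("auc", 0.81)
        mlflow.log_metric("f1", 0.64)
        # A fitted model would be logged here, e.g. mlflow.sklearn.log_model(...).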
Desirable Skills
Recognized thought leadership in emerging data and AI technologies.
Experience with FinOps in multi-cloud environments, particularly with Databricks and AWS cost optimization.
Familiarity with data governance and data quality best practices at the enterprise level.
Knowledge of DevOps and MLOps pipelines in cloud environments.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or related fields.
Professional certifications in Databricks, AWS, Azure, or Snowflake preferred.
TOGAF, DAMA, or other architecture framework certifications are a plus.
Qualities:
Self-motivated and focused on delivering outcomes for a fast-growing team and firm
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work with teams and clients in different time zones
Research focused mindset
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
Lead Principal Data Solutions Architect
Data analyst job in Reston, VA
*****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZEN*****
***** TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA*****
Formed in 2011, Inadev is focused on its founding principle to build innovative customer-centric solutions incredibly fast, secure, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations are comprised of codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation.
POSITION DESCRIPTION:
Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be on natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
PROGRAM DESCRIPTION:
This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards.
RESPONSIBILITIES:
Leading system architecture decisions, ensuring technical alignment across teams, and advocating for best practices in cloud and data engineering.
Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation
Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards.
Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
NON-TECHNICAL REQUIREMENTS:
Must be a U.S. Citizen.
Must be willing to work a HYBRID schedule (2-3 days) in Reston, VA and at client locations in the Northern Virginia/DC/MD area as required.
Ability to pass a 7-year background check and obtain/maintain a U.S. Government Clearance
Strong communication and presentation skills.
Must be able to prioritize and self-start.
Must be adaptable/flexible as priorities shift.
Must be enthusiastic and have passion for learning and constant improvement.
Must be open to collaboration, feedback and client asks.
Must enjoy working with a vibrant team of outgoing personalities.
MANDATORY REQUIREMENTS/SKILLS:
Bachelor of Science degree in Computer Science, Engineering or related subject and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture.
Experience within the Federal Government, specifically DHS is preferred.
Must possess demonstrable experience with Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), Unity Catalog governance framework, and enterprise-level integration patterns using Databricks workflows and Auto Loader.
Knowledge of and ability to organize technical execution of Agile Analytics using Databricks Repos, Jobs, and collaborative notebooks, proven by professional experience.
Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping).
Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management.
Certification in one or more: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications.
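To ground the medallion (Bronze/Silver/Gold) terminology above, here is a hedged PySpark sketch of a single Bronze-to-Silver hop on Delta Lake; the table paths and columns are hypothetical, and a production Databricks pipeline would typically ingest with Auto Loader or Delta Live Tables as the requirements note.

    # Illustrative Bronze -> Silver step on Delta Lake; paths and columns
    # are hypothetical. Requires a Spark session with Delta Lake available.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

    # Bronze: raw ingested records, kept as received.
    bronze = spark.read.format("delta").load("/lake/bronze/cases")

    # Silver: deduplicated, cleaned, conformed records.
    silver = (
        bronze.dropDuplicates(["case_id"])
              .filter(F.col("received_date").isNotNull())
              .withColumn("received_date", F.to_date("received_date"))
    )

    silver.write.format("delta").mode("overwrite").save("/lake/silver/cases")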
DESIRED REQUIREMENTS/SKILLS:
Expertise in ETL tools.
Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus).
Proficiency in SQL, PL/SQL, and performance tuning for large datasets.
Understanding of security frameworks and compliance standards in federal environments.
PHYSICAL DEMANDS:
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions
Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.
Data Engineer / Big Data Engineer
Data analyst job in McLean, VA
Immediate need for a talented Data Engineer / Big Data Engineer. This is a 12-month contract opportunity with long-term potential and is located in McLean, VA (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-93504
Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services.
Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning.
Develop backend and automation tools using Golang and/or Python as needed.
Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
Perform root-cause analysis and implement automation to prevent recurring issues.
Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
Ensure compliance with enterprise governance, data quality, and cloud security standards.
Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.
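As one small example of the window-function work described above, here is a PySpark sketch that keeps the latest transaction per account; the data source and column names are hypothetical, and the same pattern maps to a Snowflake QUALIFY ROW_NUMBER() = 1 query.

    # Illustrative only: the path and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("window-demo").getOrCreate()

    txns = spark.read.parquet("s3://example-bucket/transactions/")

    # Rank rows within each account by timestamp, newest first.
    w = Window.partitionBy("account_id").orderBy(F.col("txn_ts").desc())

    latest = (
        txns.withColumn("rn", F.row_number().over(w))
            .filter("rn = 1")   # keep only the newest row per account
            .drop("rn")
    )
    latest.show()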
Key Requirements and Technology Experience:
Proficiency in Python with experience building scalable data pipelines or ETL processes.
Strong hands-on experience with Spark/PySpark for distributed data processing.
Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning.
Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
Experience with Golang for scripting, backend services, or performance-critical processes.
Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
Familiarity with CI/CD workflows, Git, and automated testing.
Our client is a leader in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Lead Data Engineer
Data analyst job in Reston, VA
Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus we will accomplish great things together!
Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.
Are you a data enthusiast with a natural ability for analytics? We're looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
What We Offer:
A dynamic environment where your skills will make a direct impact. The opportunity to work with cutting-edge technologies and innovative projects. A collaborative team that values your passion and focus.
We are looking for Engineers who can
Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks.
Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views.
Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
Implement data synchronization processes to ensure consistency and accuracy of data across different systems.
Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
Work on Snowflake modeling (roles, databases, schemas) and ETL tools with cloud-driven skills
Work on SQL performance measuring, query tuning, and database tuning
Handle SQL language and cloud-based technologies
Set up the RBAC model at the infra and data level.
Work on Data Masking / Encryption / Tokenization, Data Wrangling / Data Pipeline orchestration (tasks).
Set up AWS S3/EC2, configure external stages, and set up SQS/SNS
Perform data integration, e.g., MSK Kafka Connect and other partners such as Delta Lake (Databricks)
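As a hedged sketch of the Streams-and-Tasks pattern listed above, here it is via the Snowflake Python connector; the connection values and object names are placeholders.

    # Hypothetical Stream + Task setup via the Snowflake Python connector.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",  # placeholders
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()

    # Stream: captures new/changed rows on the raw table for incremental loads.
    cur.execute("CREATE STREAM IF NOT EXISTS raw_orders_stream ON TABLE raw_orders")

    # Task: periodically moves captured changes into the curated table.
    cur.execute("""
        CREATE TASK IF NOT EXISTS load_orders
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
        AS
          INSERT INTO curated.orders
          SELECT * FROM raw_orders_stream
    """)
    cur.execute("ALTER TASK load_orders RESUME")  # tasks are created suspended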
Key Skills & Qualifications:
ETL - Experience with ETL processes for data integration.
SQL - Strong SQL skills for querying and data manipulation
Python - Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations
Unix - Competent in Unix for file operations, searches, and regular expressions
AWS - Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
Database Modeling - Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms.
Snowflake - Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures.
Airflow - Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines
Bachelor's degree in Computer Science, or a related field is preferred. Relevant work experience may be considered in lieu of a degree.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team.
We work closely with
Data Wrangling
ETL
Talend
Jasper
Java
Python
Unix
AWS
Data Warehousing
Data Modeling
Database Migration
RBAC model
Data migration
Our Process
Schedule a 15 min Video Call with someone from our Team
4 Proctored GQ Tests (< 2 hours)
30-45 min Final Video Interview
Receive Job Offer
If you are interested in reaching out to us, please apply and our team will contact you within the hour.
Senior Data Engineer
Data analyst job in McLean, VA
The candidate must have 5+ years of hands-on experience working with PySpark/Python, microservices architecture, AWS EKS, SQL, Postgres, DB2, Snowflake, Behave or Cucumber frameworks, Pytest (unit testing), automation testing, and regression testing.
Experience with tools such as Jenkins, SonarQube, and/or Fortify is preferred for this role.
Experience in Angular and DevOps is a nice-to-have for this role.
Must Have Qualifications: PySpark/Python based microservices, AWS EKS, Postgres SQL Database, Behave/Cucumber for automation, Pytest, Snowflake, Jenkins, SonarQube and Fortify.
Responsibilities:
Development of microservices based on Python, PySpark, AWS EKS, AWS Postgres for a data-oriented modernization project.
New System: Python and PySpark, AWS Postgres DB, Behave/Cucumber for automation, and Pytest
Perform System, functional and data analysis on the current system and create technical/functional requirement documents.
Current System: Informatica, SAS, AutoSys, DB2
Write automated tests using Behave/Cucumber, based on the new microservices-based architecture
Promote top code quality and solve issues related to performance tuning and scalability.
Strong skills in DevOps, Docker/container-based deployments to AWS EKS using Jenkins and experience with SonarQube and Fortify.
Able to communicate and engage with business teams and analyze the current business requirements (BRS documents) and create necessary data mappings.
Preferred strong skills and experience in reporting applications development and data analysis
Knowledge in Agile methodologies and technical documentation.
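To illustrate the Pytest expectation above, here is a minimal sketch testing a simple PySpark transform; the transform itself is invented for the example.

    # Hypothetical transform plus a Pytest unit test for it.
    import pytest
    from pyspark.sql import SparkSession, functions as F

    def add_full_name(df):
        """Concatenate first and last name into a full_name column."""
        return df.withColumn("full_name", F.concat_ws(" ", "first", "last"))

    @pytest.fixture(scope="session")
    def spark():
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def test_add_full_name(spark):
        df = spark.createDataFrame([("Ada", "Lovelace")], ["first", "last"])
        row = add_full_name(df).collect()[0]
        assert row["full_name"] == "Ada Lovelace"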
Cloud Data Engineer- Databricks
Data analyst job in McLean, VA
Purpose:
We are seeking a highly skilled Cloud Data Engineer with deep expertise in Databricks and modern cloud platforms such as AWS, Azure, or GCP. This role is ideal for professionals who are passionate about building next-generation data platforms, optimizing complex data workflows, and enabling advanced analytics and AI in cloud-native environments. You'll have the opportunity to work with Fortune-500 organizations in data and analytics, helping them unlock the full potential of their data through innovative, scalable solutions.
Key Result Areas and Activities:
Design and implement robust, scalable data engineering solutions.
Build and optimize data pipelines using Databricks, including serverless capabilities, Unity Catalog, and Mosaic AI.
Collaborate with analytics and AI teams to enable real-time and batch data workflows.
Support and improve cloud-native data platforms (AWS, Azure, GCP).
Ensure adherence to best practices in data modeling, warehousing, and governance.
Contribute to automation of data workflows using CI/CD, DevOps, or DataOps practices.
Implement and maintain workflow orchestration tools like Apache Airflow and dbt.
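As a small sketch of the Airflow orchestration mentioned in the last item above, here is a hedged DAG definition; the DAG ID, task, and callable are illustrative.

    # Illustrative Airflow DAG; names and logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def refresh_silver_tables():
        # In practice this might trigger a Databricks job or dbt run.
        print("submit pipeline run")

    with DAG(
        dag_id="daily_lakehouse_refresh",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        refresh = PythonOperator(
            task_id="refresh_silver_tables",
            python_callable=refresh_silver_tables,
        )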
Roles & Responsibilities
Essential Skills
4+ years of experience in data engineering with a focus on scalable solutions.
Strong hands-on experience with Databricks in a cloud environment.
Proficiency in Spark and Python for data processing.
Solid understanding of data modeling, data warehousing, and architecture principles.
Experience working with at least one major cloud provider (AWS, Azure, or GCP).
Familiarity with CI/CD pipelines and data workflow automation.
Desirable Skills
Direct experience with Unity Catalog and Mosaic AI within Databricks.
Working knowledge of DevOps/DataOps principles in a data engineering context.
Exposure to Apache Airflow, dbt, and modern data orchestration frameworks.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Relevant certifications in cloud platforms (AWS/Azure/GCP) or Databricks are a plus.
Qualities:
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work seamlessly with clients across multiple geographies
Research focused mindset
Excellent analytical, presentation, reporting, documentation, and interaction skills
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."