Full Stack Engineer - Data Engineer (Hybrid position)
The Planet Group 4.1
Remote job
Senior Full Stack/Data Engineering (C#/.Net, Angular, Azure Databricks, AI/ML)
Pay rate: $50-55 per hour
Employment status: W2 (we can only consider candidates who do not require sponsorship and can work on a W2 basis with us; C2C/H1B candidates cannot be considered at this time)
Location: DFW area, must be onsite Tues, Wed, Thursday
Contract: 6+ months (chance to convert but not guaranteed)
Looking for a Senior Full Stack & Data Engineering Developer; 3-7 years of experience is ideal. This role spans scalable data pipeline development, ML workflow enablement, and full‑stack (.NET + Angular) engineering in Azure.
Core Responsibilities
Build and optimize data pipelines, ingestion, and transformation workflows (ADF, Databricks).
Enable ML workflows: data prep, feature engineering, deployment support.
Develop full‑stack applications (Angular front end + .NET APIs + SQL backend).
Collaborate with data scientists, architects, and product teams on end‑to‑end solutions.
Ensure data quality, performance, and security across platforms.
Troubleshoot production issues and support continuous improvement.
Contribute to architecture and technical decision‑making.
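As a rough illustration of the ingestion/transformation and feature-engineering work described above, a minimal step might look like the following in plain Python. Function and field names (`clean_record`, `derive_features`, `visited_at`) are invented for illustration; the posting's actual stack is ADF/Databricks, not standalone Python.

```python
# Sketch of a cleanse-then-derive step in a data pipeline.
from datetime import datetime

def clean_record(raw: dict) -> dict:
    """Normalize a raw ingestion record: cast IDs, trim strings, parse timestamps."""
    return {
        "id": int(raw["id"]),
        "name": raw["name"].strip().title(),
        "visited_at": datetime.fromisoformat(raw["visited_at"]),
    }

def derive_features(records: list) -> list:
    """Feature engineering: add a day-of-week feature for downstream ML."""
    for rec in records:
        rec["visit_weekday"] = rec["visited_at"].weekday()  # 0 = Monday
    return records

raw_rows = [{"id": "1", "name": "  acme clinic ", "visited_at": "2024-07-01T09:30:00"}]
features = derive_features([clean_record(r) for r in raw_rows])
print(features[0]["name"], features[0]["visit_weekday"])  # Acme Clinic 0
```

In Databricks the same shape would typically be expressed as DataFrame transformations rather than per-record Python functions.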
Skill Mix Breakdown
Angular / Front‑End: 25%
.NET / API / Backend: 25%
Azure + Data Engineering (ADF, Databricks, Azure SQL): 35%
ML / AI Exposure: 15%
Required Skills
3-5+ years with .NET, C#, APIs, and backend development.
Strong experience with Entity Framework, SQL/T‑SQL, and Databricks.
Proficient with Angular and Angular Material UI components.
Experience in Agile, CI, TDD environments.
Solid problem‑solving skills in distributed cloud systems.
Azure cloud experience (highly preferred).
Preferred Skills
Experience building or optimizing machine learning models or AI workflows.
Familiarity with NLP/statistical approaches for solutioning.
Exposure to tools such as Python, R, SAS, SQL, MATLAB, Java.
TypeScript experience.
Salesforce Commerce SR Developer/Tech Lead
Business Centric Technology
Remote job
Are you passionate about building powerful B2B or B2C commerce solutions on Salesforce? We're looking for a Senior Salesforce Commerce Developer & Technical Lead with deep expertise in Salesforce B2B or B2C Commerce (CloudCraze or Lightning) to drive end-to-end solution delivery and lead technical excellence across our projects. This is a remote OR hybrid, direct-hire position based out of Irving, TX.
COMP: Up to $165,000, depending on the candidate's experience.
WHAT'S IN IT FOR YOU:
Remote work schedule
Medical, dental, and vision with employer contributions
Healthcare FSA & telehealth options
Paid vacation, sick leave & holidays
Life & AD&D insurance, plus multiple 401(k) plans
Ongoing learning and development programs
Lead complex, high-visibility projects that drive real business results.
Influence technical direction and mentor future talent.
Work with cutting-edge Salesforce technologies in a collaborative, forward-thinking environment.
WHAT YOU'LL DO:
Architect, design, and lead the development of scalable Salesforce B2B Commerce solutions.
Act as the go-to technical expert for Salesforce Commerce projects, guiding both clients and internal teams.
Conduct code reviews and ensure development best practices across the board.
Customize and enhance Salesforce Commerce features using Apex, LWC, Visualforce, and other platform tools.
Design and implement integrations with ERPs, CRMs, and third-party apps via APIs.
Configure complex business elements like product catalogs, pricing engines, and checkout workflows.
Collaborate with business analysts, solution architects, and project managers to turn business needs into high-performing solutions.
Take ownership of the full SDLC, from requirements gathering through deployment and post-launch support.
Ensure deliverables meet rigorous standards for performance, security, and scalability.
Guide junior developers and foster a culture of knowledge sharing and continuous learning.
WHAT YOU'LL BRING:
Bachelor's Degree in Computer Science, Management Information Systems, or a related field, or equivalent work experience.
Must have a minimum of 5 years of Salesforce Commerce development experience (CloudCraze or Lightning, B2B or B2C), including at least two years with Salesforce Commerce Lightning.
Proven experience leading technical teams and driving solution design.
Deep knowledge of Apex, LWC, Visualforce, Salesforce APIs, SOQL/SOSL, and Salesforce customization.
Hands-on experience with Salesforce configuration (objects, flows, automations).
Strong grasp of B2B or B2C commerce fundamentals: product catalogs, carts, pricing, orders, and checkout.
Some headless full-stack development experience is appreciated (e.g., Angular, React, Node.js).
Integration experience with external systems (ERP, CRM, etc.).
BONUS POINTS FOR EXPERIENCE IN:
Salesforce Certifications: B2B Commerce Developer, Platform Developer II, Application Architect, or CTA track.
Familiarity with CI/CD tools like Git, Jenkins, or Copado.
Experience working in Agile/Scrum environments.
Front-end dev skills (JavaScript, HTML, CSS).
Apply Today! CP # 8495
Oracle APEX Developer
Crystal Management | CMIT
Remote job
CMiT is seeking an experienced Oracle Application Express (APEX) Developer to join our team in support of Task Order 2 (TO2) under the PPM program, aiding the transition of the PRISM system into the PROMIS environment. THIS IS A FULLY REMOTE OPPORTUNITY.
The APEX Developer is an information technology professional who analyzes, designs, codes, implements, tests, and supports custom-built Oracle Application Express application environments. The APEX Developer designs complementary solutions for transformed environments, implements those solutions, and troubleshoots any residual pre-implementation issues associated with APEX and the Flows workflow-engine extension integration.
The APEX Developer will concentrate on detailed APEX development duties such as application architecture, application design, and external dependencies. The APEX Developer performs application design reviews, managing toward best practices and standards, and may help develop detailed logical data models and application frameworks for internal and customer-facing APEX applications. The APEX Developer must be able to architect, design, install, and configure large, complex APEX applications with replication and high availability on APEX 22.x.
Support activities include analyzing, testing, and implementing application designs to support various business applications.
Responsibilities
Participate in all aspects of Oracle APEX development and support, including development in APEX Workflows. Oracle Spatial Data experience is highly preferred.
Develop applications in an Oracle APEX 22.x environment
Customize customer-facing APEX applications and modify as necessary
Participate in development of APEX Workflows
Complete comprehensive solution development.
Works with stakeholders on interpretation/translation of functional requirements into system requirements
Creates appropriate technical artifacts to support development and operations within SDLC guidelines, including application/architecture diagrams and logic flows
Writes quality code that meets standards and delivers the desired functionality using the technology selected for the project. Delivers easy-to-operate systems by performing unit, system, and automated testing, as well as post-deployment validation. Coordinates user acceptance testing
Adheres to and drives modern software engineering, by applying Agile and DevOps methodologies with an iterative development approach
Troubleshoots application issues by diagnosing and debugging issues within production systems by performing thorough root cause analysis
Maintains and improves technology proficiency with evolving technologies to achieve desired technical and business outcomes
Learns, evaluates, recommends, and adapts to new technologies and techniques
Requires availability for periodic on-call responsibilities
Qualifications
Required
Minimum of 10 years of software development lifecycle experience
Minimum of 3 years with Oracle APEX
Ability to work productively in a remote/telework environment
Education/Certification Required
Bachelor's degree in related field or discipline or equivalent experience
CompTIA Security+ CE certification, or a DoD 8570 IAT Level 2 compliant certification (e.g., CCNA Security, CySA+, GICSP, GSEC, Security+ CE, CND, SSCP) or higher, required within 6 months.
Clearance Required
Must be a U.S. Citizen. Must be able to pass a background investigation to obtain or maintain the required security clearance.
Physical Requirements
Office work, typically sedentary with some movement around the office
$77k-$105k yearly (estimated)
Remote SDR Growth Leader | Scale Global Sales Development
Influxdata 4.3
Remote job
A leading technology company is seeking an experienced SDR Leader to manage and grow their Sales Development team. This position involves developing strategies to meet sales goals, fostering a high-performance culture, and supporting SDRs in their professional growth. Candidates should have 3 to 6 years of experience in sales development, with at least 3 years in a leadership role. The company offers competitive benefits including medical insurance, flexible time off, and a supportive work environment.
$124k-$176k yearly (estimated)
Oncology Statistics Lead - Clinical Development (Hybrid)
Allergan 4.8
Remote job
A global biopharmaceutical company in San Francisco seeks an Associate Director, Statistics for Oncology. This role provides statistical leadership for clinical development and life-cycle management, requiring significant experience in statistics or biostatistics. The ideal candidate will have over 10 years in the field, strong leadership abilities, and excellent communication skills. The position offers a hybrid work schedule and a competitive benefits package.
$126k-$162k yearly (estimated)
Common Business Services (CBS) - Software Developer, SME (100% Remote -REF1896O)
Citizant 4.5
Remote job
Citizant is a leading provider of professional IT services to the U.S. government. We seek to address some of our country's most pressing challenges in the areas of Agile application development, Enterprise Data Management, Enterprise Architecture, and Program Management support services - focusing on the U.S. Departments of Homeland Security and Treasury. We strive to hire only ethical, talented, passionate, and committed "A Players" who already align with the company's core values: Drive, Excellence, Reputation, Responsibility, and a Better Future. No matter how large we grow, Citizant will retain its collaborative, supportive, small-company culture, where successful team effort to address external and internal customer challenges is valued above all individual contributions.
Job Description
10+ years of professional experience including 5-7 years in the design, development, testing, and maintenance of complex software applications.
Responsible for architecting and implementing high-quality, scalable, and secure software solutions that meet mission and business requirements.
Java, Java EE/SE, JAXB, JSP, JSF, React, Hibernate, Web Services, HTML
Spring, SpringBoot
RESTful web services
Linux/Sun/Solaris
Drools development: BRMS/Drools, Business Rules Engine (BRE), utilizing Red Hat Drools
Qualifications
Education:
Bachelor's degree in a relevant field of study (preferred)
Clearance:
U.S. citizenship is required
Public Trust clearance, or the ability to obtain one.
Salary Range:
The expected pay range for this position is up to $130,000 yearly.
The exact pay rate will vary based on skills, experience, and location.
Citizant offers a competitive benefits package, including:
Medical, dental, and vision insurance
401(k)
Generous PTO
Company-paid life and disability insurance
Flexible Spending Accounts (FSA)
Employee Assistance Program (EAP)
Tuition Assistance & Professional Development Program
Additional Information
Citizant strives to be an employer of choice in the Washington metropolitan area. Citizant associates accept challenging and rewarding work and in return receive excellent compensation and benefits, as well as the opportunity for personal and professional development.
Citizant is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
Modeling & Simulation Software Developer
LMI Consulting, LLC 3.9
Remote job
Job ID: 2026-13529 | Openings: 3 | Category: Data/Analytics | Benefit Type: Salaried High Fringe/Full-Time
LMI is seeking a skilled Forward Deployed Engineer in the Washington, DC region to join one of our Modeling & Simulation customer development teams. Successful Forward Deployed Engineers demonstrate competency in integrating advanced technologies into existing business processes, building new functionality, ensuring successful software development, analysis, or deployments, and working closely with customers. This role requires a combination of technical skills, strong communication, and the ability to quickly turn ideas into production-caliber code. This role will work as part of the Sales Engineering team and engage directly with customers on-site, in a hybrid format, or remotely, as required by each customer, including the potential for occasional short-term travel to customer locations across the United States.
This is a 100% remote role with quarterly travel for in person team planning and collaboration events. This position requires an active DoD Secret clearance. You must be a U.S. citizen.
LMI is a new breed of digital solutions provider dedicated to accelerating government impact with innovation and speed. Investing in technology and prototypes ahead of need, LMI brings commercial-grade platforms and mission-ready AI to federal agencies at commercial speed.
Leveraging our mission-ready technology and solutions, proven expertise in federal deployment, and strategic relationships, we enhance outcomes for the government, efficiently and effectively. With a focus on agility and collaboration, LMI serves the defense, space, healthcare, and energy sectors, helping agencies navigate complexity and outpace change. Headquartered in Tysons, Virginia, LMI is committed to delivering impactful results that strengthen missions and drive lasting value.
Why Join Us
At LMI, you will work on innovative solutions that directly impact national security, logistics, and decision-making for some of the government's most complex challenges. You will be part of a collaborative team building a secure, cloud-native platform that redefines how modeling and simulation informs operations.
Responsibilities
Design, implement, and maintain analytical models within our M&S capability called RAPTR.
Perform testing and validation of models and code to ensure accurate outputs.
Perform operations research analyses with RAPTR to answer client questions and recommend efficient courses of action
Participate in client meetings to understand requirements and to brief your technical findings and analyses.
Collaborate with other RAPTR software engineers, modelers, and analysts to optimize performance for large-scale simulations.
Implement monitoring, logging, and alerting to ensure system reliability and availability.
Troubleshoot issues across cloud infrastructure, networks, and simulation workloads.
Stay current with advances in cloud technologies and recommend improvements to enhance RAPTR capabilities.
Qualifications
Minimum Qualifications:
Active DoD Secret security clearance.
Bachelor's degree in Computer Science, ORSA, Data Science, or related field (or equivalent work experience).
5+ years performing complex data analytics, such as building AI/ML or physics-based models.
Previous experience with modeling and simulation (M&S) tools
Experience with briefing and engaging with DoD customers.
Strong problem-solving skills and ability to work in a collaborative, fast-paced environment.
Desired Qualifications:
Experience supporting modeling and simulation platforms or compute-intensive applications.
Experience with cloud infrastructure (AWS, Azure, or GCP).
Ability to obtain TS/SCI clearance
Knowledge of DoD cloud security requirements (IL4/IL5, FedRAMP, CMMC).
Experience working in Agile development teams.
LMI is an Equal Opportunity Employer. LMI is committed to the fair treatment of all and to our policy of providing applicants and employees with equal employment opportunities. LMI recruits, hires, trains, and promotes people without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, pregnancy, disability, age, protected veteran status, citizenship status, genetic information, or any other characteristic protected by applicable federal, state, or local law. If you are a person with a disability needing assistance with the application process, please contact
Colorado Residents: In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.
$78k-$102k yearly (estimated)
Database developer Remote
Lockheed Martin 4.8
Remote job
Database developer to support front-end systems (as needed by developers across the organization, in support of web services, third-party, or internal development needs), to the exclusion of reporting needs from other departments. Developed code includes, but is not limited to, PL/SQL in the form of triggers, procedures, functions, and materialized views. Generates custom-driven applications for intra-department use by business users in a rapid application development platform (primarily APEX). Responsible for functional testing and deployment of code through the development life cycle. Works with end users to obtain business requirements. Responsible for developing, testing, improving, and maintaining new and existing processes to help users retrieve data effectively. Collaborates with administrators and business users to provide technical support and identify new requirements.
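The trigger/procedure work described above can be sketched compactly. The example below uses Python's built-in sqlite3 in place of Oracle PL/SQL purely so it is self-contained; the table and column names are invented, and real work in this role would be PL/SQL triggers, procedures, and materialized views in an Oracle/APEX environment.

```python
# Audit-trigger pattern: log every status change to a separate table,
# analogous to a PL/SQL AFTER UPDATE trigger.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
CREATE TABLE order_audit (order_id INTEGER, old_status TEXT, new_status TEXT);

CREATE TRIGGER trg_order_status AFTER UPDATE OF status ON orders
BEGIN
    INSERT INTO order_audit VALUES (OLD.id, OLD.status, NEW.status);
END;
""")
conn.execute("INSERT INTO orders VALUES (1, 'NEW')")
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE id = 1")
print(conn.execute("SELECT * FROM order_audit").fetchall())  # [(1, 'NEW', 'SHIPPED')]
```

The same idea in Oracle would use `:OLD`/`:NEW` row references and could route the insert through a packaged procedure.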
Responsibilities:
Design stable, reliable and effective database processes.
Solve database usage issues and malfunctions.
Gather user requirements and identify new features.
Provide data management support to users.
Ensure all database programs meet company and performance requirements.
Research and suggest new database products, services, and protocols.
Requirements and skills
In-depth understanding of data management (e.g. permissions, security, and monitoring)
Excellent analytical and organization skills
An ability to understand front-end user requirements and a problem-solving attitude
Excellent verbal and written communication skills
Assumes responsibility for related duties as required or assigned.
Stays informed regarding current computer technologies and relational database management systems with related business trends and developments.
Consults with respective IT management in analyzing business functions and management needs and seeks new and more effective solutions. Seeks out new systems and software that reduces processing time and/or provides better information availability and decision-making capability.
Job Type: Full-time
Pay: $115,000-$128,000 yearly
Expected hours: 40 per week
Benefits:
Dental insurance
Various health insurance options & wellness plans
Vision insurance
Paid time off (PTO)
Required Knowledge
Considerable knowledge of the design of online computer applications.
Required Experience
One to three years of database development/administration experience.
Skills/Abilities
Strong creative and analytical thinking skills.
Well organized with strong project management skills.
Good interpersonal and supervisory abilities.
Ability to train and provide aid to others.
Senior Data Engineer
Roo 3.8
Remote job
What We Do
We're on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities. Roo empowers the largest network of over 20,000 veterinary professionals to help more than 9,000 animal hospitals provide quality care to more pets. Together, we've provided more than 3 million hours of healthcare, helping Veterinarians earn more than $200 million.
About the Role
Data is at the core of what we do at Roo. Our growing data team is tight-knit, essential, and recognized for the high-quality, innovative work we deliver. You will be right at the center of this work as we build and maintain the systems that power Roo's most impactful initiatives. This role will push you to use every part of your data and analytics engineering skill set to develop an extensible data ecosystem that serves humans, machine learning models, and internal AI agents alike.
Your Responsibilities
Data Pipelines & Integrations: Design, develop, and maintain reliable end-to-end data pipelines (both batch and streaming) that connect internal and external systems in ways that best support marketplace growth, customer experience, and operational efficiency.
Data Storage, Warehousing & Database Support: Contribute to the performance, scalability, and reliability of our entire data ecosystem. Cultivate our dbt/Snowflake environment, develop and maintain our data-centric AWS assets, and partner with product engineers to support the health and efficiency of our transactional databases.
Data Transformation & Analytics Support (dbt): Work with analysts and other data stakeholders to engineer data structures and orchestrate workflows that encode core business logic. Produce clean, well-structured datasets that underpin traditional reporting, analyst experimentation, and ML and agentic AI use cases.
Data Quality, Governance & Metric Trust: Implement observability, testing, monitoring, validation, and documentation to ensure accuracy, stability, and consistency throughout the data stack. Help shape shared definitions, metrics, and data semantics across the company.
Business Collaboration & Insight Enablement: Join cross-functional squads and tiger teams to rapidly translate evolving data needs into scalable and extensible data models, metrics, and analytical frameworks. You will favor iterative delivery over one-shot solutions to support fast-moving OKRs and drive meaningful incremental progress week to week.
Technical Expertise & Mentorship: Bring strong expertise in modern code quality, data modeling, and data stack patterns. Mentor data stakeholders throughout the organization, share best practices, and meaningfully contribute to architectural and tooling decisions as the data stack evolves.
Qualifications
Expert-level SQL and data modeling skills (5+ years of experience)
Intermediate proficiency with data-centric Python packages and Node.js data interaction frameworks like Kysely, Prisma, and Sequelize
Deep experience with Snowflake, dbt, MySQL, and AWS data services
About You
You care deeply about data quality, scalability, and clarity of purpose. You take pride in crafting systems that other engineers and analysts enjoy using and extending.
You collaborate naturally with product engineers, data analysts, and business stakeholders, and you are comfortable translating ambiguity into clear technical plans.
You are resilient and adaptable. You don't lose your footing when priorities shift, you work well with uncertainty and experimentation, and you make thoughtful decisions even when speed matters.
You thrive in fast-growing environments and value iterative development. You know how to deliver impact quickly while still building toward a healthy, extensible stack.
You bring experience across multiple business domains, such as product, marketing, sales, finance, and operations.
You enjoy mentoring teammates, raising the technical bar, and contributing thoughtful perspectives to architectural decisions.
You handle multiple simultaneous priorities well, communicate clearly, and maintain crisp expectation-setting with partners across the company.
Exact compensation may vary based on skills, experience, and location.
California pay range: $170,000-$220,000 USD
New York pay range: $170,000-$220,000 USD
Washington pay range: $180,000-$200,000 USD
Colorado pay range: $145,000-$190,000 USD
Texas pay range: $145,000-$190,000 USD
North Carolina pay range: $135,000-$175,000 USD
Core Values
Our Core Values are what shape us as an organization, and we're looking for people who exhibit the same values in their professional life: Bias to Urgency, Drive Measurable Impact, Seek Understanding, Solve Customer Problems, and Have Fun!
What to expect from working at Roo!
For permanent, full-time employees, we offer:
Accelerated growth & learning potential.
Stipends for home office setup, continuing education, and monthly wellness.
Comprehensive health benefits to fit your needs with base medical plan covered at 100% with optional premium buy up plans.
401K
Unlimited Paid Time Off.
Paid Maternity/Paternity and reproductive care leave.
Gifts on your birthday & anniversary.
Opportunity for domestic travel, including for regional team building events.
Overall, you would be part of a mission-driven company that will significantly empower the lives of all veterinary professionals and the health of the overall animal industry that seeks massive innovation. We have diverse, passionate & driven team members from a variety of backgrounds, and Roo is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status. We are committed to creating an inclusive environment for all employees and candidates. We understand that your individual experience may not check every box but we still encourage you to apply even if you are not confident in every expectation listed. Ready to join the Roo-volution?!
Backend Developer - Database - USA(Remote)
Photon Group 4.3
Remote job
Greetings Everyone
Who are we?
For the past 20 years, we have powered many digital experiences for the Fortune 500. Since 1999, we have grown from a few people to more than 4,000 team members across the globe, engaged in various digital modernization initiatives. For a brief one-minute video about us, you can check *****************************
What will you do? What are we looking for?
The requirement is for a DB/BE candidate with strong SQL and PL/SQL skills.
Position Summary
We are seeking a highly skilled backend-focused Staff Software Engineer to join our team. The ideal candidate will have extensive experience in backend development, system design, and a strong understanding of cloud-native software engineering principles.
Responsibilities
Develop backend services using Java and Spring Boot
Design and implement solutions deployed on Google Cloud Platform (GKE)
Work with distributed systems, including Google Cloud Spanner (Postgres dialect) and Confluent Kafka (or similar pub/sub tools)
Design, optimize, and troubleshoot complex SQL queries and stored procedures (e.g., PL/SQL) to support high-performance data operations and ensure data integrity across applications.
Collaborate with teams to implement CI/CD pipelines using GitHub Actions and Argo CD
Ensure high performance and reliability through sound software engineering practices
Mentor and provide technical leadership to the frontend engineering team
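The query-optimization responsibility above usually starts with reading the execution plan. The sketch below uses Python's built-in sqlite3 so it is self-contained (the posting's actual stack is Spanner's Postgres dialect, where you would use `EXPLAIN` instead); the `events` table and index name are invented:

```python
# Show how adding an index changes a query plan from a full scan to an
# index search, using SQLite's EXPLAIN QUERY PLAN.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
db.executemany("INSERT INTO events VALUES (?, ?, ?)",
               [(i, i % 100, "x") for i in range(1000)])

def plan(sql):
    """Return the top-level plan detail string for a query."""
    return db.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][3]

query = "SELECT * FROM events WHERE user_id = 42"
print(plan(query))  # full table scan (e.g. "SCAN events"; wording varies by version)
db.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(plan(query))  # now an index search via idx_events_user
```

The same discipline applies to stored procedures: inspect the plan, add or adjust indexes, and re-measure rather than guessing.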
Required Qualifications
7+ years' experience in software engineering from ideation to production deployment of IT solutions
5+ years' experience in full software development life cycle including ideation, coding, coding standards, testing, code reviews and production deployments
5+ years of experience with backend Java , Spring Boot and Microservices
3+ years of hands-on experience with a public cloud provider
3+ years working with pub/sub tools like Kafka or similar
3+ years of experience with database design/development (Postgres or similar)
2+ years of experience with CI/CD tools (GitHub Actions, Jenkins, Argo CD, or similar)
Preferred Qualifications
Demonstrated experience with development and deployment of Minimum Viable Products (MVPs)
Must demonstrate innovative mindset, divergent thinking, and convergent actions.
Familiarity with Kubernetes concepts; experience deploying services on GKE is a plus
Compensation, Benefits and Duration
Minimum Compensation: USD 44,000
Maximum Compensation: USD 154,000
Compensation is based on actual experience and qualifications of the candidate. The above is a reasonable and a good faith estimate for the role.
Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full time employees.
This position is available for independent contractors
No applications will be considered if received more than 120 days after the date of this post
Database Developer 1 (Remote)
Apidel Technologies 4.1
Remote job
Prepares, defines, structures, develops, implements, and maintains database objects. Analyze query performance, identify bottlenecks, and implement optimization techniques. Defines and implements interfaces to ensure that various applications and user-installed or vendor-developed systems interact with the required database systems.
Creates database structures, writes and tests SQL queries, and optimizes database performance.
Plans and develops test data to validate new or modified database applications.
Work with business analysts and other stakeholders to understand requirements and integrate database solutions.
Build and implement database systems that meet specific business requirements, ensuring data integrity and security, as well as troubleshooting and resolving database issues.
Design and implement ETL pipelines to integrate data from various sources using SSIS.
Responsible for various SQL jobs.
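The extract-transform-load responsibilities above can be sketched in a few lines. This toy version is plain Python with sqlite3 (the posting's actual ETL tooling is SSIS); the `payments` table and sample data are invented, and the validation step shows the "data integrity" concern from the bullets:

```python
# Toy ETL pass: extract CSV rows, validate/cast them, load into a database.
import csv
import io
import sqlite3

source = io.StringIO("id,amount\n1,10.5\n2,not_a_number\n3,7\n")

def transform(rows):
    """Cast fields to their target types; reject rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), float(row["amount"])))
        except ValueError:
            continue  # data-integrity step: skip the malformed row
    return clean

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
db.executemany("INSERT INTO payments VALUES (?, ?)",
               transform(list(csv.DictReader(source))))
print(db.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())  # (2, 17.5)
```

In SSIS the equivalent would be a Data Flow task with a flat-file source, a data-conversion/error-output split, and an OLE DB destination.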
Skills Required
Strong understanding of SQL and DBMS like MySQL, PostgreSQL, or Oracle.
Ability to design and model relational databases effectively.
Skills in writing and optimizing SQL queries for performance.
Ability to troubleshoot and resolve database-related issues.
Ability to communicate technical information clearly and concisely to both technical and non-technical audiences.
Ability to collaborate effectively with other developers and stakeholders.
Strong ETL experience specifically with SSIS.
Skills Preferred
Azure experience is a plus
.Net experience is a plus
GitHub experience is a plus
Experience Required
2 years of progressively responsible programming experience or an equivalent combination of training and experience.
Education Required
Bachelor's degree in Information Technology or Computer Science, or equivalent experience
$94k-$121k yearly (estimated)
IS Database Developer II
Careoregon 4.5
Remote job
---------------------------------------------------------------
The IS Database Developer II is responsible for developing and maintaining database and ETL processes, as well as recommending and partnering in the design and development of effective solutions in support of business strategies. This role is essential toward maturing CareOregon's database and ETL development model. This position spends substantial time evaluating, architecting, and implementing IS priorities (plan, design, install, and maintain).
Estimated Hiring Range:
$111,690.00 - $136,510.00
Bonus Target:
Bonus - SIP Target, 5% Annual
Current CareOregon Employees: Please use the internal Workday site to submit an application for this job.
---------------------------------------------------------------
Essential Responsibilities
Database Development
Actively participate in the design of custom databases and processes.
Provide advanced database design support to the organization; lead small projects with assistance from Supervisor or Lead and participate and consult on other projects.
Collaborate with other IS teams on best practices of database design and development.
ETL Development
Develop ETL processes for moderate to advanced activities.
Develop moderate to advanced databases to meet application and web needs.
Analyze business requirements; research and recommend solutions which include potential risks and mitigation.
Develop and maintain appropriate technology documentation, including current design and operation.
Standards and Policy Administration
Propose requirements, standards and best practices for database and ETL development.
Participate in the ongoing review of existing systems to ensure they are designed to comply with established standards and to empower business operations.
Vendor Coordination and Relations
Conduct product and vendor research, and present recommendations to more advanced database developers and/or management.
Establish and maintain effective working relationships with vendors and related equipment suppliers, including installation and repair of services.
Experience and/or Education
Required
Minimum 3 years of database and ETL development required. Experience should include some or all of the following:
Database development and maintenance
ETL development and maintenance
Systems analysis and design
Agile/Scrum methodology
Note: For data warehouse focused roles, minimum 3 years' experience developing ETL for loading a dimensional model using a combination of T-SQL and SSIS 2012, 2014, or 2016
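For the data-warehouse note above, the basic mechanics of loading a dimensional model are a keyed upsert into the dimension followed by fact rows that reference the surrogate key. A minimal sketch under assumptions: table and column names are hypothetical, and stdlib SQLite's upsert stands in for a T-SQL MERGE.

```python
import sqlite3

# Illustrative type-1 dimension load for a dimensional model.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id  TEXT UNIQUE,
    name         TEXT)""")
conn.execute("CREATE TABLE fact_sales (customer_key INTEGER, amount REAL)")

incoming = [("C001", "Acme Ltd"), ("C002", "Bolt Co")]
for cid, name in incoming:
    # Type-1 upsert: overwrite the attribute when the business key exists,
    # otherwise insert a new dimension row with a fresh surrogate key.
    conn.execute("""INSERT INTO dim_customer (customer_id, name)
                    VALUES (?, ?)
                    ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name""",
                 (cid, name))

# Fact rows carry the surrogate key looked up from the dimension
key = conn.execute(
    "SELECT customer_key FROM dim_customer WHERE customer_id = 'C001'").fetchone()[0]
conn.execute("INSERT INTO fact_sales VALUES (?, ?)", (key, 125.0))
conn.commit()
```

In T-SQL the same type-1 overwrite is typically written as a MERGE with WHEN MATCHED THEN UPDATE and WHEN NOT MATCHED THEN INSERT branches, often generated or orchestrated from SSIS.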
Preferred
Bachelor's degree in Computer Science, Information Systems, or a related field
Additional experience in related technology support and/or operational positions
QNXT experience
Knowledge, Skills and Abilities Required
Knowledge
Working knowledge/skills with the following:
Microsoft SQL Server
ETL tools, such as SSIS or Informatica
Visual Studio
Unit and integration testing
Note: For data warehouse focused roles, advanced knowledge/skills of the dimensional model required in lieu of knowledge/skill requirements above
General knowledge of BizTalk (preferred)
Skills and Abilities
Advanced abilities in troubleshooting system performance issues and root cause
Effective communication skills, including listening, verbal, written, and customer service
Ability to clearly articulate policies and instructions
Demonstrated progress in conveying appropriate level of detail effectively to all levels of the organization including non-technical staff
Ability to recommend policies, document risks, and propose solutions to information technology management and senior leadership
Possess a high degree of initiative and motivation
Ability to effectively collaborate with coworkers, staff, and leaders across all departments
Ability to work effectively with diverse individuals and groups
Ability to learn, focus, understand, and evaluate information and determine appropriate actions
Ability to accept direction and feedback, as well as tolerate and manage stress
Ability to see, read, and perform repetitive finger and wrist movement for at least 6 hours/day
Ability to hear and speak clearly for at least 3-6 hours/day
Working Conditions
Work Environment(s): ☒ Indoor/Office ☐ Community ☐ Facilities/Security ☐ Outdoor Exposure
Member/Patient Facing: ☒ No ☐ Telephonic ☐ In Person
Hazards: May include, but not limited to, physical and ergonomic hazards.
Equipment: General office equipment
Travel: May include occasional required or optional travel outside of the workplace; the employee's personal vehicle, local transit or other means of transportation may be used.
Work Location: Work from home
We offer a strong Total Rewards Program. This includes competitive pay, bonus opportunity, and a comprehensive benefits package. Eligibility for bonuses and benefits is dependent on factors such as the position type and the number of scheduled weekly hours. Benefits-eligible employees qualify for benefits beginning on the first of the month on or after their start date. CareOregon offers medical, dental, vision, life, AD&D, and disability insurance, as well as health savings account, flexible spending account(s), lifestyle spending account, employee assistance program, wellness program, discounts, and multiple supplemental benefits (e.g., voluntary life, critical illness, accident, hospital indemnity, identity theft protection, pre-tax parking, pet insurance, 529 College Savings, etc.). We also offer a strong retirement plan with employer contributions. Benefits-eligible employees accrue PTO and Paid State Sick Time based on hours worked/scheduled hours and the primary work state. Employees may also receive paid holidays, volunteer time, jury duty, bereavement leave, and more, depending on eligibility. Non-benefits eligible employees can enjoy 401(k) contributions, Paid State Sick Time, wellness and employee assistance program benefits, and other perks. Please contact your recruiter for more information.
We are an equal opportunity employer
CareOregon is an equal opportunity employer. The organization selects the best individual for the job based upon job related qualifications, regardless of race, color, religion, sexual orientation, national origin, gender, gender identity, gender expression, genetic information, age, veteran status, ancestry, marital status or disability. The organization will make a reasonable accommodation to known physical or mental limitations of a qualified applicant or employee with a disability unless the accommodation will impose an undue hardship on the operation of our organization.
$111.7k-136.5k yearly 29d ago
Database Developer
Oddball 3.9
Remote job
Oddball believes that the best products are built when companies understand and value the things they are working on. We value learning and growth and the ability to make a big impact at a small company. We believe that we can make big changes happen and improve the daily lives of millions of people by bringing quality software to the federal space.
We are seeking a Database Developer to design, build, and maintain secure, scalable data pipelines that enable effective use of enterprise data. In this role, you'll collaborate with engineers, analysts, and data stewards to deliver reliable datasets and models that support analytics, reporting, and decision-making.
What You'll Be Doing
You'll design, build, and maintain database and data integration solutions that support enterprise intelligence and data delivery efforts. This includes developing and optimizing database structures, implementing ETL pipelines, and supporting data ingestion and transformation across multiple systems. You'll help ensure data is delivered accurately, efficiently, and securely to downstream users and platforms, while supporting integration efforts across Military Health, readiness, and federal health data sources.
What you'll bring:
Experience developing databases and/or ETL pipelines in enterprise environments
Strong SQL skills and familiarity with data modeling concepts
Experience integrating data across disparate systems and formats
Understanding of data lifecycle management and performance optimization
Ability to collaborate with analysts, platform teams, and stakeholders
Comfort working in structured delivery environments with defined methodologies
Exposure to machine learning pipelines or advanced analytics integration.
Prior experience supporting DHA or other federal healthcare programs.
Performs other related duties as assigned.
Requirements:
Applicants must be authorized to work in the United States. In alignment with federal contract requirements, certain roles may also require U.S. citizenship and the ability to obtain and maintain a federal background investigation and/or a security clearance.
Education:
Bachelor's Degree
Benefits:
Fully remote
Tech & Education Stipend
Comprehensive Benefits Package
Company Match 401(k) plan
Flexible PTO, Paid Holidays
Oddball is an Equal Opportunity Employer and does not discriminate against applicants based on race, religion, color, disability, medical condition, legally protected genetic information, national origin, gender, sexual orientation, marital status, gender identity or expression, sex (including pregnancy, childbirth or related medical conditions), age, veteran status or other legally protected characteristics. Any applicant with a mental or physical disability who requires an accommodation during the application process should contact an Oddball HR representative to request such an accommodation by emailing *************
Compensation:
At Oddball, it's important each employee is compensated competitively and fairly, in alignment with state legal requirements. A range for this position is listed below. Be advised, actual offer details are determined by job category, job location, and candidate skill level.
United States Wage Range: $90,000 - $130,000
$90k-130k yearly 14d ago
PostgreSQL Database Developer
Contact Government Services, LLC
Remote job
PostgreSQL Database Developer
Employment Type: Full Time, Experienced level
Department: Information Technology
CGS is seeking a PostgreSQL Database Developer to join our team supporting a rapidly growing Data Analytics and Business Intelligence platform focused on providing data solutions that empower our federal customers. You will support a migration from the current Oracle database to a Postgres database and manage the database environments proactively. As we continue our growth, you will play a key role in ensuring scalability of our data systems.
CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities.
Skills and attributes for success:
Drive efforts to migrate from the current Oracle database to the new Microsoft Azure Postgres database
Create and maintain technical documentation using defined technical documentation templates, and gain an in-depth knowledge of the business data to propose and implement effective solutions
Collaborate with internal and external parties to transform high-level technical objectives into comprehensive technical requirements
Ensure the availability and performance of the databases that support our systems, ensuring they have sufficient resources allocated to support high resilience and speed
Perform performance tuning and assist developers with tuning their work
Proactively monitor the database systems to ensure secure services with minimum downtime, and improve maintenance of the databases, including rollouts, patching, and upgrades
Work within a structured and Agile development approach
Qualifications:
Bachelor's degree
Must be a US citizen
7 years of experience administering PostgreSQL databases in Linux environments
Experience with setting up, monitoring, and maintaining PostgreSQL instances
Experience implementing and maintaining PostgreSQL backup and disaster recovery processes
Experience migrating Oracle schemas, packages, views, and triggers to Postgres using the Ora2Pg tool
Ideally, you will also have:
Experience implementing and maintaining data warehouses
Experience with AWS RDS for PostgreSQL
Experience with Oracle databases
Experience leveraging the Ora2Pg tool
Experience working in cloud environments such as Azure and/or AWS
Prior federal consulting experience
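One routine validation step in a migration like the Oracle-to-Postgres move described above is reconciling per-table row counts between source and target. A hedged sketch: a real check would connect through the Oracle and Postgres drivers, while here both sides are simulated with stdlib SQLite, and the table names and counts are invented.

```python
import sqlite3

def table_counts(conn, tables):
    """Return {table_name: row_count} for the given connection."""
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

# Simulated source (Oracle stand-in) and target (Postgres stand-in)
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE invoices (id INTEGER)")
    db.execute("CREATE TABLE customers (id INTEGER)")
source.executemany("INSERT INTO invoices VALUES (?)", [(i,) for i in range(3)])
target.executemany("INSERT INTO invoices VALUES (?)", [(i,) for i in range(3)])
source.execute("INSERT INTO customers VALUES (1)")  # row missing on the target

tables = ["invoices", "customers"]
src_counts = table_counts(source, tables)
tgt_counts = table_counts(target, tables)
mismatches = {t for t in tables if src_counts[t] != tgt_counts[t]}
print(mismatches)  # {'customers'}
```

Row counts are only a first pass; checksums or sampled column comparisons would typically follow for tables that match on count.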
Our Commitment:
Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our clients' specific needs. We are committed to solving the most challenging and dynamic problems.
For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work.
Here at CGS, we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our clients, maintaining those relationships for years to come.
We care about our employees. Therefore, we offer a comprehensive benefits package:
Health, Dental, and Vision
Life Insurance
401k
Flexible Spending Account (Health, Dependent Care, and Commuter)
Paid Time Off and Observance of State/Federal Holidays
Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Join our team and become part of government innovation!
Explore additional job opportunities with CGS on our Job Board:*************************************
For more information about CGS please visit: ************************** or contact:Email: *******************
#CJ
$78k-104k yearly est. 60d+ ago
Lead Data Engineer & Modeler, AI - Hybrid
Bigcommerce 4.8
Remote job
Welcome to the Agentic Commerce Era
At Commerce, our mission is to empower businesses to innovate, grow, and thrive with our open, AI-driven commerce ecosystem. As the parent company of BigCommerce, Feedonomics, and Makeswift, we connect the tools and systems that power growth, enabling businesses to unlock the full potential of their data, deliver seamless and personalized experiences across every channel, and adapt swiftly to an ever-changing market. Simply said, we help businesses confidently solve complex commerce challenges so they can build smarter, adapt faster, and grow on their own terms. If you want to be part of a team of bold builders, sharp thinkers, and technical trailblazers, working together to shape the future of commerce, this is the place for you.
BigCommerce is building the foundation for the next generation of AI-driven commerce. As a Lead AI Engineer, Platform & Infrastructure, you'll define and scale the systems that make this transformation possible. This role sits at the intersection of data engineering, MLOps, and applied AI enablement, responsible for building the secure, scalable, and high-performance infrastructure that supports AI/ML use cases across the company.
You'll collaborate across product, engineering, and data teams to design the unified AI platform layer - powering internal intelligence, customer-facing AI features, and advanced analytics. From model lifecycle management to data pipelines and inference infrastructure, you'll drive the architecture and operational excellence that allows BigCommerce to experiment, deploy, and iterate AI at scale.
If you're passionate about enabling intelligence through infrastructure, designing modern ML ecosystems, and operationalizing AI across a fast-scaling SaaS platform, this role will put you at the center of BigCommerce's AI evolution.
What You'll Do
AI Platform Architecture
Partner with the Enterprise Architect and Principal Data Architect to design the company-wide AI/ML platform strategy across GCP and AWS.
Build scalable systems for model training, evaluation, deployment, and monitoring.
Define best practices for data ingestion, feature stores, vector databases, and model registries.
Integrate AI workflows into existing analytics and product pipelines.
Infrastructure & Reliability
Implement CI/CD for ML pipelines (MLOps) including model versioning, validation, and automated deployment.
Ensure platform reliability, observability, and performance at enterprise scale.
Manage GPU/TPU resources and optimize compute efficiency for training and inference workloads.
Contribute to cost-optimization and security best practices across the AI infrastructure.
Cross-Functional Collaboration
Partner with data scientists, applied ML engineers, and product teams to translate model requirements into scalable architecture.
Work closely with the data engineering team to ensure AI pipelines align with governance and data quality standards.
Collaborate with software engineers to integrate AI services and APIs into production systems.
Governance & Responsible AI
Champion data and model governance, including lineage, reproducibility, and compliance (GDPR, SOC, ISO).
Establish monitoring frameworks for model drift, bias detection, and ethical AI use.
Build secure and transparent systems that support trust in AI-driven decisions.
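As a concrete instance of the drift monitoring mentioned above, one widely used signal is the Population Stability Index (PSI) between a training-time feature distribution and the live one; a PSI above roughly 0.2 is commonly read as significant drift. The bin proportions below are invented illustration data, and a production monitor would compute them from real feature histograms.

```python
import math

def psi(expected, actual, eps=1e-6):
    """Population Stability Index over pre-binned proportions.

    eps guards against log(0) when a bin is empty in either distribution.
    """
    score = 0.0
    for e, a in zip(expected, actual):
        e, a = max(e, eps), max(a, eps)
        score += (a - e) * math.log(a / e)
    return score

baseline = [0.25, 0.25, 0.25, 0.25]   # training-time distribution
stable   = [0.24, 0.26, 0.25, 0.25]   # close to baseline
shifted  = [0.05, 0.15, 0.30, 0.50]   # heavily shifted

print(round(psi(baseline, stable), 4))   # small value, well under 0.1
print(round(psi(baseline, shifted), 4))  # large value, clear drift signal
```

In an enterprise setup the same scores would be computed on a schedule per feature and wired into alerting thresholds rather than printed.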
What You'll Bring
7+ years in data or ML engineering, with experience designing production-grade AI infrastructure.
Strong technical foundation in MLOps, data pipelines, and distributed systems.
Hands-on experience with:
Cloud AI platforms (Vertex AI, SageMaker, Bedrock, or equivalent)
Orchestration frameworks (Airflow, Kubeflow, MLflow, or Metaflow)
Cloud data stacks (BigQuery, Snowflake, GCS/S3, Terraform)
Model serving tools (FastAPI, BentoML, Ray Serve, or Triton Inference Server)
Proficient in: Python, SQL, and Git-based CI/CD.
Experience integrating LLMs and vector databases (e.g., Pinecone, FAISS, Weaviate, Vertex Matching Engine).
Familiarity with Kubernetes, Docker, and Terraform for scalable deployment.
Strong communication skills, able to partner across disciplines and simplify complex technical systems.
What You'll Impact
The AI foundation powering every intelligent capability within BigCommerce - from predictive analytics to generative assistants.
The tools and frameworks that enable product and engineering teams to build, test, and ship AI faster.
The reliability, governance, and scalability of BigCommerce's enterprise-wide AI ecosystem.
Why Join Us
You'll play a critical role in shaping how BigCommerce operationalizes AI - not just as a feature, but as a platform capability. You'll join a collaborative, ambitious, and fast-evolving data organization dedicated to creating systems that enable intelligence at scale.
#LI-GL1
#LI-HYBRID
(Pay Transparency Range: $116,000-$174,000)
The exact salary will be dependent on the successful candidate's location, relevant knowledge, skills, and qualifications.
Inclusion and Belonging
At Commerce, we believe that celebrating the unique histories, perspectives and abilities of every employee makes a difference for our company, our customers and our community. We are an equal opportunity employer and the inclusive atmosphere we build together will make room for every person to contribute, grow and thrive.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the interview process, to perform essential job functions and to receive other benefits and privileges of employment. If you need an accommodation in order to interview at Commerce, please let us know during any of your interactions with our recruiting team.
Learn more about the Commerce team, culture and benefits at *********************************
Protect Yourself Against Hiring Scams: Our Corporate Disclaimer
Commerce, along with many other employers, has become the subject of fraudulent job offers to hopeful prospective job seekers.
Be advised:
Commerce does not offer jobs to individuals who do not go through our formal hiring process.
Commerce will never:
require payment of recruitment fees from candidates;
request personally identifiable information through unsanctioned websites or applications;
attempt to solicit money from you as part of the hiring process or as part of an employment offer;
solicit money to complete visa requirements as part of a job offer.
If you receive unsolicited offers of employment from Commerce, we urge you to be extremely cautious and avoid engaging or responding.
$116k-174k yearly 60d+ ago
Data Engineer II
Capital Rx 4.1
Remote job
About Judi Health
Judi Health is an enterprise health technology company providing a comprehensive suite of solutions for employers and health plans, including:
Capital Rx, a public benefit corporation delivering full-service pharmacy benefit management (PBM) solutions to self-insured employers,
Judi Health™, which offers full-service health benefit management solutions to employers, TPAs, and health plans, and
Judi, the industry's leading proprietary Enterprise Health Platform (EHP), which consolidates all claim administration-related workflows in one scalable, secure platform.
Together with our clients, we're rebuilding trust in healthcare in the U.S. and deploying the infrastructure we need for the care we deserve. To learn more, visit ****************
Location: Remote (For Non-Local) or Hybrid (Local to NYC area or Denver, CO)
Position Summary:
We are seeking a highly motivated and talented Data Engineer to join our team and play a critical role in shaping the future of healthcare data management. This individual will be a key contributor in building robust, scalable, and accurate data systems that empower operational and analytics teams to make informed decisions and drive positive outcomes.
Position Responsibilities:
Lead relationship with operational and analytics teams to translate business needs into effective data solutions
Architect and implement ETL workflows leveraging CapitalRx platforms and technologies such as Python, dbt, SQLAlchemy, Terraform, Airflow, Snowflake, and Redshift
Conduct rigorous testing to ensure the flawless execution of data pipelines before production deployment
Identify, recommend, and implement process improvement initiatives.
Proactively identify and resolve data-related issues, ensuring system reliability and data integrity
Lead moderately complex projects.
Provide ongoing maintenance and support for critical data infrastructure, including 24x7 on-call availability
Responsible for adherence to the Capital Rx Code of Conduct including reporting of noncompliance.
Required Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field
2+ years of experience working with Airflow, dbt, and Snowflake
Expertise in data warehousing architecture techniques and familiarity with Kimball methodology
Minimum of 3 years' experience as a Data Engineer, with a proven track record of designing, implementing, and maintaining complex data pipelines
1+ year of experience with Python and SQL
Capacity to analyze the company's broader data landscape and architect scalable data solutions that support growth
Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders
A self-motivated and detail-oriented individual with the ability to tackle and solve intricate technical challenges
Preferred Qualifications:
1-3 years of experience as a Data Engineer, ideally in the healthcare or PBM sector
Advanced proficiency with Airflow, dbt, and Snowflake, coupled with 3+ years of SQL development and Python experience
This range represents the low and high end of the anticipated base salary range for the NY-based position. The actual base salary will depend on several factors, such as experience, knowledge, and skills, and may vary if the job location changes.
This position description is designed to be flexible, allowing management the opportunity to assign or reassign duties and responsibilities as needed to best meet organizational goals.
Salary Range: $120,000-$140,000 USD
Judi Health values a diverse workplace and celebrates the diversity that each employee brings to the table. We are proud to provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, medical condition, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By submitting an application, you agree to the retention of your personal data for consideration for a future position at Judi Health. More details about Judi Health's privacy practices can be found at *********************************************
$120k-140k yearly 6d ago
Data Engineer (Remote, Continental United States)
ICA.Ai 4.7
Remote job
About ICA, Inc.
International Consulting Associates, Inc. is a rapidly growing company, located in the D.C./Metro area. We were founded in 2009 to assist government clients with evaluating and achieving their objectives. We have become a trusted advisor helping our clients by offering cutting-edge innovation and solutions to complex projects. Our small company has grown significantly, and we're overjoyed at the opportunity to expand yet again!
We are results-focused and have a proven track record supporting federal agencies and large government services primes in three main areas: Research and Data Analysis, Advanced-Data Science, and Strategic Services. We currently support multiple analytics and research programs across HHS.
At ICA, we believe our success starts with our people. We foster a collaborative "one team" environment where work-life balance isn't just talked about - it's prioritized. We're building dynamic, highly skilled teams in a welcoming and supportive atmosphere. If you're passionate about using your technical expertise to make a difference, we want to talk to you.
We are looking for a Data Engineer to join our growing team!
ABOUT THE ROLE:
We are seeking an experienced Data Engineer to build and maintain data infrastructure supporting ICA's federal agency clients, including the FDA. You'll develop pipelines and platforms that transform raw data into actionable insights, working with analysts, data scientists, and developers in an agile environment.
ABOUT YOU:
As a data engineer with software development expertise you bring a blend of analytical rigor and coding craftsmanship to every project. You excel at designing scalable data pipelines, optimizing performance, and ensuring data integrity across complex systems. Your strong programming skills allow you to build robust tools and services that empower data-driven decision-making. You collaborate seamlessly with cross-functional teams, translating business needs into technical solutions with clarity and precision. Above all, you are passionate about continuous learning and innovation, always seeking ways to improve systems and deliver value.
RESPONSIBILITIES:
Design and maintain scalable data pipelines and ETL/ELT processes
Build document processing pipelines for text and image extraction
Architect AWS-based data solutions using S3, Glue, Redshift, RDS, ECS, etc.
Optimize SQL queries and develop Python-based data processing workflows
Troubleshoot data pipeline issues and implement solutions
Ensure data pipeline performance, scalability, and security
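One habit behind the "optimize SQL queries" responsibility is checking the engine's query plan before and after adding an index. The idea transfers to EXPLAIN on Redshift or RDS; the sketch below uses stdlib SQLite's EXPLAIN QUERY PLAN so it runs anywhere, with an invented events table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, source TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i, f"src{i % 5}") for i in range(100)])

def plan(sql):
    """Return the engine's plan for a statement as one string."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM events WHERE source = 'src1'")
conn.execute("CREATE INDEX idx_events_source ON events(source)")
after = plan("SELECT * FROM events WHERE source = 'src1'")

# Before indexing the plan reports a full table scan; afterwards it
# reports a search using idx_events_source.
print(before)
print(after)
```

On a hundred rows the difference is invisible; on a large warehouse table, confirming the plan actually changed is the point of the check.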
REQUIRED QUALIFICATIONS:
4+ years of experience working with ETL, Data Modeling, and Data Architecture
Expertise in writing and optimizing SQL
Experience with Big Data technologies such as Spark
Intermediate Linux skills
Experience in managing large data warehouses or data lakes
Minimum of 1 year of programming experience in Python
Experience with data and cloud engineering
Knowledge of cloud computing services
Bachelor's degree, or higher
Ability to obtain a Public Trust Clearance
PREFERRED QUALIFICATIONS:
Databricks Lakehouse platform experience
ML pipeline or graph algorithm implementation
Unstructured data processing expertise
BENEFITS:
We invest in our team members so you can live your best life professionally and personally, offering a competitive salary and benefits.
Health Insurance - 100% employer-paid premiums - ICA covers the full cost of one of three offered medical plans
Dental Insurance
Vision insurance
Health Spending Account
Flexible Spending Account
Life and Disability insurance
401(k) plan with company match
Paid Time Off (Vacation, Sick Leave and Holidays)
Education and Professional Development Assistance
Remote work from anywhere within the continental United States
LOCATION & TELEWORK
This is a remote position. Candidates residing in the DMV area are preferred.
ADDITIONAL INFORMATION:
ICA is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, national origin, genetics, disability status, protected veteran status, age, or any other characteristic protected by state, federal or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
$87k-117k yearly est. 60d+ ago
Principal Data Engineer - ML Platforms
Altarum 4.5
Remote job
Altarum | Data & AI Center of Excellence (CoE)
Altarum is building the future of data and AI infrastructure for public health - and we're looking for a Principal Data Engineer - ML Platforms to help lead the way. In this cornerstone role, you will design, build, and operationalize the modern data and ML platform capabilities that power analytics, evaluation, AI modeling, and interoperability across all Altarum divisions.
If you want to architect impactful systems, enable data science at scale, and help ensure public health and Medicaid programs operate with secure, explainable, and trustworthy AI - this role is for you.
What You'll Work On
This role blends deep engineering with applied ML enablement:
ML Platform Engineering: modern lakehouse architecture, pipelines, MLOps lifecycle
Applied ML enablement: risk scoring, forecasting, Medicaid analytics
NLP/Generative AI support: RAG, vectorization, health communications
Causal ML operationalization: evaluation modeling workflows
Responsible/Trusted AI engineering: model cards, fairness, compliance
Your work ensures that Altarum's public health and Medicaid programs run on secure, scalable, reusable, and explainable data and AI infrastructure.
What You'll Do
Platform Architecture & Delivery
Design and operate modern, cloud-agnostic lakehouse architecture using object storage, SQL/ELT engines, and dbt.
Build CI/CD pipelines for data, dbt, and model delivery (GitHub Actions, GitLab, Azure DevOps).
Implement MLOps systems: MLflow (or equivalent), feature stores, model registry, drift detection, automated testing.
Engineer solutions in AWS and AWS GovCloud today, with portability to Azure Gov or GCP.
Use Infrastructure-as-Code (Terraform, CloudFormation, Bicep) to automate secure deployments.
Pipelines & Interoperability
Build scalable ingestion and normalization pipelines for healthcare and public health datasets, including:
FHIR R4 / US Core (strongly preferred)
HL7 v2 (strongly preferred)
Medicaid/Medicare claims & encounters (strongly preferred)
SDOH & geospatial data (preferred)
Survey, mixed-methods, and qualitative data
Create reusable connectors, dbt packages, and data contracts for cross-division use.
Publish clean, conformed, metrics-ready tables for Analytics Engineering and BI teams.
Support Population Health in turning evaluation and statistical models into pipelines.
Data Quality, Reliability & Cost Management
Define SLOs and alerting; instrument lineage & metadata; ensure ≥95% of data tests pass.
Perform performance and cost tuning (partitioning, storage tiers, autoscaling) with guardrails and dashboards.
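The "≥95% of data tests pass" target above can be sketched as a simple gate in the spirit of dbt's generic tests (not_null, unique, accepted_values). Table, column names, and the threshold wiring here are illustrative assumptions.

```python
# Hypothetical data-test gate over a small claims-like table. Each test
# mirrors a dbt generic test; names and values are made up for illustration.
rows = [
    {"member_id": 1, "state": "MI", "paid_amount": 120.0},
    {"member_id": 2, "state": "VA", "paid_amount": 88.5},
    {"member_id": 3, "state": "MD", "paid_amount": None},  # fails not_null
]

tests = {
    "member_id_not_null": all(r["member_id"] is not None for r in rows),
    "member_id_unique": len({r["member_id"] for r in rows}) == len(rows),
    "state_accepted_values": all(r["state"] in {"MI", "VA", "MD"} for r in rows),
    "paid_amount_not_null": all(r["paid_amount"] is not None for r in rows),
}

pass_rate = sum(tests.values()) / len(tests)
print(f"{pass_rate:.0%} of data tests passed")  # 75% here: below the SLO
meets_slo = pass_rate >= 0.95
```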
Applied ML Enablement
Build production-grade pipelines for risk prediction, forecasting, cost/utilization models, and burden estimation.
Develop ML-ready feature engineering workflows and support time-series/outbreak detection models.
Integrate ML assets into standardized deployment workflows.
Generative AI Enablement
Build ingestion and vectorization pipelines for surveys, interviews, and unstructured text.
Support RAG systems for synthesis, evaluation, and public health guidance.
Enable Palladian Partners with secure, controlled-generation environments.
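The retrieval step in a RAG pipeline like the one described above can be sketched end to end: vectorize short passages, then fetch the closest match for a query. Real systems would use embedding models and a vector store; the hashed bag-of-words vectors and sample corpus here are stand-in assumptions so the example stays self-contained.

```python
# Toy retrieval for a RAG pipeline: deterministic hashed bag-of-words
# "embeddings" plus cosine similarity. Corpus contents are invented.
import math
import zlib

DIM = 64

def embed(text):
    """Hash each token into a fixed-size count vector (toy embedding)."""
    vec = [0.0] * DIM
    for token in text.lower().split():
        vec[zlib.crc32(token.encode()) % DIM] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

corpus = [
    "survey responses on vaccine hesitancy in rural counties",
    "medicaid claims normalization pipeline runbook",
    "interview transcripts about care coordination barriers",
]
index = [(doc, embed(doc)) for doc in corpus]

def retrieve(query):
    qv = embed(query)
    return max(index, key=lambda pair: cosine(qv, pair[1]))[0]

print(retrieve("vaccine hesitancy survey"))
```

A production version would swap `embed` for a real embedding model and pass the retrieved passages to a generation model under the controlled environments mentioned above.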
Causal ML & Evaluation Engineering
Translate R/Stata/SAS evaluation code into reusable pipelines.
Build templates for causal inference workflows (DID, AIPW, CEM, synthetic controls).
Support operationalization of ARA's applied research methods at scale.
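The simplest of the causal templates listed above, a 2x2 difference-in-differences, reduces to four group means. The outcome values below are made up; production workflows would wrap R/Stata/SAS model code or a Python causal library, not raw means.

```python
# Illustrative 2x2 difference-in-differences estimate from group means.
# All numbers are invented (e.g., ED visits per 1,000 members).
def mean(xs):
    return sum(xs) / len(xs)

treated_pre, treated_post = [5.0, 6.0, 5.5], [4.0, 4.5, 4.1]
control_pre, control_post = [5.2, 5.8, 5.6], [5.1, 5.7, 5.5]

# DID = (change in treated group) - (change in control group)
did = (mean(treated_post) - mean(treated_pre)) - (mean(control_post) - mean(control_pre))
print(round(did, 2))  # negative: treatment associated with fewer visits
```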
Responsible AI, Security & Compliance
Implement Model Card Protocol (MCP) and fairness/explainability tooling (SHAP, LIME).
Ensure compliance with HIPAA, 42 CFR Part 2, IRB/DUA constraints, and NIST AI RMF standards.
Enforce privacy-by-design: tokenization, encryption, least-privilege IAM, and VPC isolation.
Reuse, Shared-Services, and Enablement
Develop runbooks, architecture diagrams, repo templates, and accelerator code.
Pair with data scientists, analysts, and SMEs to build organizational capability.
Provide technical guidance for proposals and client engagements.
Your First 90 Days
You will make a meaningful impact fast. Expected outcomes include:
Platform skeleton operational: repo templates, CI/CD, dbt project, MLflow registry, tests.
Two pipelines in production (e.g., FHIR → analytics and claims normalization).
One end-to-end CoE lighthouse MVP delivered (ingestion → model → metrics → BI).
Completed playbooks for GovCloud deployment, identity/secrets, rollback, and cost control.
Success Metrics (KPIs)
Pipeline reliability meeting SLA/SLO targets.
≥95% data tests passing across pipelines.
MVP dataset onboarding ≤ 4 weeks.
Reuse of platform assets across ≥3 divisions.
Cost optimization and budget adherence.
What You'll Bring
7-10+ years in data engineering, ML platform engineering, or cloud data architecture.
Expert in Python, SQL, dbt, and orchestration tools (Airflow, Glue, Step Functions).
Deep experience with AWS + AWS GovCloud.
CI/CD and IaC experience (Terraform, CloudFormation).
Familiarity with MLOps tools (MLflow, Sagemaker, Azure ML, Vertex AI).
Ability to operate in regulated environments (HIPAA, 42 CFR Part 2, IRB).
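The orchestration tools named above (Airflow, Glue, Step Functions) all schedule work as a DAG of dependent tasks. As a hedged sketch of that core idea, with task names invented for illustration:

```python
# Minimal DAG runner via Kahn's topological sort; a toy version of what
# Airflow-style orchestrators do when resolving task dependencies.
from collections import deque

def run_dag(tasks, deps):
    """Run callables in dependency order; deps maps task -> set of upstreams."""
    indegree = {t: 0 for t in tasks}
    for downstream, upstreams in deps.items():
        indegree[downstream] += len(upstreams)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        tasks[t]()  # execute the task
        for downstream, upstreams in deps.items():
            if t in upstreams:
                indegree[downstream] -= 1
                if indegree[downstream] == 0:
                    ready.append(downstream)
    if len(order) < len(tasks):
        raise ValueError("cycle detected in DAG")
    return order

log = []
tasks = {
    "extract": lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load": lambda: log.append("load"),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```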
Preferred:
Experience with FHIR, HL7, Medicaid/Medicare claims, and/or SDOH datasets.
Databricks, Snowflake, Redshift, Synapse.
Event streaming (Kafka, Kinesis, Event Hubs).
Feature store experience.
Observability tooling (Grafana, Prometheus, OpenTelemetry).
Experience optimizing BI datasets for Power BI.
Logistical Requirements
At this time, we will only accept candidates who are presently eligible to work in the United States and will not require sponsorship.
Our organization requires that all work, for the duration of your employment, must be completed in the continental U.S. unless required by contract.
If you're near one of our offices (Arlington, VA; Silver Spring, MD; or Novi, MI), you'll join us in person one day every other month (6 times per year) for a fun, purpose-driven Collaboration Day. These days are filled with creative energy, meaningful connection, and team brainstorming!
Must be able to work during Eastern Time unless approved by your manager.
Employees working remotely must have a dedicated, ergonomically appropriate workspace free from distractions with a mobile device that allows for productive and efficient conduct of business.
Altarum is a nonprofit organization focused on improving the health of individuals with fewer financial resources and populations disenfranchised by the health care system. We work primarily on behalf of federal and state governments to design and implement solutions that achieve measurable results. We combine our expertise in public health and health care delivery with technology development and implementation, practice transformation, training and technical assistance, quality improvement, data analytics, and applied research and evaluation. Our innovative solutions and proven processes lead to better value and health for all.
Altarum is an equal opportunity employer that provides employment and opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, or any other characteristic protected by applicable law.
$72k-98k yearly est. 51d ago
Data Engineer- AWS/Snowflake
Egen 4.2
Remote job
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. If this describes you, we want you on our team.
NOTE: This is a 6-month contract.
About the job:
Migrate data and analytics workloads from BigQuery to Snowflake
Support GCP to AWS data platform migration
Develop and optimize ETL/ELT pipelines using Python and SQL
Build analytics-ready datasets for reporting and dashboards
Support BI tools such as Looker, Amazon QuickSight, or Tableau
Ensure data quality, performance, and reliability
Collaborate with data architects, analytics, and DevOps teams
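A recurring task in a migration like the BigQuery-to-Snowflake work above is verifying that a migrated table matches its source. Here is a hedged sketch of that parity check; both warehouses are stood in for by in-memory SQLite so the example runs anywhere, and the table and column names are illustrative assumptions.

```python
# Toy source-vs-target parity check for a table migration: compare row count
# plus an order-independent aggregate. SQLite stands in for both warehouses.
import sqlite3

src = sqlite3.connect(":memory:")  # stand-in for the source (e.g., BigQuery)
dst = sqlite3.connect(":memory:")  # stand-in for the target (e.g., Snowflake)
for db in (src, dst):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
rows = [(1, 19.99), (2, 5.00), (3, 42.50)]
src.executemany("INSERT INTO orders VALUES (?, ?)", rows)
dst.executemany("INSERT INTO orders VALUES (?, ?)", rows)

def table_fingerprint(db, table):
    """Row count plus a rounded sum: a cheap, order-independent parity check."""
    return db.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {table}"
    ).fetchone()

match = table_fingerprint(src, "orders") == table_fingerprint(dst, "orders")
print("parity check passed" if match else "MISMATCH: investigate before cutover")
```

Real migrations typically add per-column checksums and sampled row comparisons on top of counts and sums.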
About you:
3-5 years of experience as a Data Engineer
Strong SQL skills (complex queries, optimization)
Strong Python experience for data processing
Experience with Snowflake
Experience with BigQuery
Cloud experience on GCP and/or AWS
Experience supporting BI tools (Looker, QuickSight, Tableau)
Nice to have:
Experience with data migration projects
Knowledge of dbt, Airflow, or similar orchestration tools
Experience in multi-cloud environments
Familiarity with data modeling and analytics use cases
EEO and Accommodations:
Egen is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Egen will also consider qualified applications with criminal histories, consistent with legal requirements. Egen welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
$83k-113k yearly est. 22d ago
Big Data Engineer
Hexaware Technologies, Inc. 4.2
Remote job
JD:
Experience with big data processing and distributed computing systems like Spark.
Implement ETL pipelines and data transformation processes.
Ensure data quality and integrity in all data processing workflows.
Troubleshoot and resolve issues related to PySpark applications and workflows.
Understand source, dependencies, and data flow from converted PySpark code.
Strong programming skills in Python and SQL.
Experience with big data technologies like Hadoop, Hive, and Kafka.
Understanding of data warehousing concepts, relational databases, and SQL.
Demonstrate and document code lineage.
Integrate PySpark code with frameworks such as Ingestion Framework, DataLens, etc.
Ensure compliance with data security, privacy regulations, and organizational standards.
Knowledge of CI/CD pipelines and DevOps practices.
Strong problem-solving and analytical skills.
Excellent communication and leadership abilities.