Hadoop Developer remote jobs - 1,530 jobs

  • Full Stack Engineer - Data Engineer (Hybrid position)

    The Planet Group (4.1 company rating)

    Remote job

    Senior Full Stack/Data Engineering (C#/.NET, Angular, Azure Databricks, AI/ML)
    Pay rate: $50-55 per hour
    Employment status: W2 (we can only consider candidates who do not require sponsorship; C2C and H1B candidates cannot be considered right now)
    Location: DFW area; must be onsite Tuesday, Wednesday, and Thursday
    Contract: 6+ months (chance to convert, but not guaranteed)
    Looking for a Senior Full Stack & Data Engineering Developer; 3-7 years of experience is ideal. This role spans scalable data pipeline development, ML workflow enablement, and full-stack (.NET + Angular) engineering in Azure.
    Core Responsibilities:
    - Build and optimize data pipelines, ingestion, and transformation workflows (ADF, Databricks).
    - Enable ML workflows: data prep, feature engineering, deployment support.
    - Develop full-stack applications (Angular front end + .NET APIs + SQL backend).
    - Collaborate with data scientists, architects, and product teams on end-to-end solutions.
    - Ensure data quality, performance, and security across platforms.
    - Troubleshoot production issues and support continuous improvement.
    - Contribute to architecture and technical decision-making.
    Skill Mix Breakdown: Angular/front-end 25%; .NET/API/backend 25%; Azure + data engineering (ADF, Databricks, Azure SQL) 35%; ML/AI exposure 15%.
    Required Skills:
    - 3-5+ years with .NET, C#, APIs, and backend development.
    - Strong experience with Entity Framework, SQL/T-SQL, and Databricks.
    - Proficiency with Angular and Angular Material UI components.
    - Experience in Agile, CI, and TDD environments.
    - Solid problem-solving skills in distributed cloud systems.
    - Azure cloud experience (highly preferred).
    Preferred Skills:
    - Experience building or optimizing machine learning models or AI workflows.
    - Familiarity with NLP/statistical approaches to solutioning.
    - Exposure to tools such as Python, R, SAS, SQL, MATLAB, Java.
    - TypeScript experience.
    $50-55 hourly 15h ago
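
The Databricks pipeline work this listing describes can be pictured with a short PySpark sketch. This is a minimal illustration, not the employer's actual code: the Delta paths, table, and column names are hypothetical assumptions.

```python
# Minimal PySpark sketch of a Databricks-style ingest/cleanse step.
# Paths, table, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Read raw data (e.g., landed by Azure Data Factory) from a Delta location.
raw = spark.read.format("delta").load("/mnt/landing/orders")

# Basic cleansing: dedupe, type the date column, drop invalid amounts.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Publish a curated table for downstream reporting and ML feature engineering.
cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```
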
  • Job icon imageJob icon image 2

    Looking for a job?

    Let Zippia find it for you.

  • Salesforce Commerce SR Developer/Tech Lead

    Business Centric Technology

    Remote job

    Are you passionate about building powerful B2B or B2C commerce solutions on Salesforce? We're looking for a Senior Salesforce Commerce Developer & Technical Lead with deep expertise in Salesforce B2B or B2C Commerce (CloudCraze or Lightning) to drive end-to-end solution delivery and lead technical excellence across our projects. This is a remote or hybrid, direct-hire position based out of Irving, TX.
    COMP: Up to $165,000, depending on the candidate's experience.
    WHAT'S IN IT FOR YOU:
    - Remote work schedule
    - Medical, dental, and vision with employer contributions
    - Healthcare FSA & telehealth options
    - Paid vacation, sick leave & holidays
    - Life & AD&D insurance, plus multiple 401(k) plans
    - Ongoing learning and development programs
    - Lead complex, high-visibility projects that drive real business results
    - Influence technical direction and mentor future talent
    - Work with cutting-edge Salesforce technologies in a collaborative, forward-thinking environment
    WHAT YOU'LL DO:
    - Architect, design, and lead the development of scalable Salesforce B2B Commerce solutions.
    - Act as the go-to technical expert for Salesforce Commerce projects, guiding both clients and internal teams.
    - Conduct code reviews and ensure development best practices across the board.
    - Customize and enhance Salesforce Commerce features using Apex, LWC, Visualforce, and other platform tools.
    - Design and implement integrations with ERPs, CRMs, and third-party apps via APIs.
    - Configure complex business elements like product catalogs, pricing engines, and checkout workflows.
    - Collaborate with business analysts, solution architects, and project managers to turn business needs into high-performing solutions.
    - Take ownership of the full SDLC, from requirements gathering through deployment and post-launch support.
    - Ensure deliverables meet rigorous standards for performance, security, and scalability.
    - Guide junior developers and foster a culture of knowledge sharing and continuous learning.
    WHAT YOU'LL BRING:
    - Bachelor's degree in Computer Science, Management Information Systems, or a related field, or equivalent work experience.
    - A minimum of 5 years of Salesforce Commerce development experience (CloudCraze or Lightning, B2B or B2C), including at least two years with Salesforce Commerce Lightning.
    - Proven experience leading technical teams and driving solution design.
    - Deep knowledge of Apex, LWC, Visualforce, Salesforce APIs, SOQL/SOSL, and Salesforce customization.
    - Hands-on experience with Salesforce configuration (objects, flows, automations).
    - Strong grasp of B2B/B2C commerce fundamentals: product catalogs, carts, pricing, orders, and checkout.
    - Some headless full-stack development experience appreciated (e.g., Angular, React, Node.js).
    - Integration experience with external systems (ERP, CRM, etc.).
    BONUS POINTS FOR EXPERIENCE IN:
    - Salesforce Certifications: B2B Commerce Developer, Platform Developer II, Application Architect, or the CTA track.
    - Familiarity with CI/CD tools like Git, Jenkins, or Copado.
    - Experience working in Agile/Scrum environments.
    - Front-end dev skills (JavaScript, HTML, CSS).
    Apply Today! CP # 8495
    $165k yearly 3d ago
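
The ERP/CRM integration work this listing mentions often starts with Salesforce's REST API. Below is a hedged Python sketch, not the employer's actual integration: the instance URL and token are placeholders, while Product2 and StockKeepingUnit are standard Salesforce object/field names used purely for illustration.

```python
# Hedged sketch of a server-side Salesforce REST integration.
# INSTANCE and TOKEN are placeholders; obtain a real token via OAuth.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"            # placeholder

def create_product(name: str, sku: str) -> str:
    """Create a Product2 record and return its Salesforce ID."""
    resp = requests.post(
        f"{INSTANCE}/services/data/v59.0/sobjects/Product2",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"Name": name, "StockKeepingUnit": sku},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]  # Salesforce returns {"id": ..., "success": true}

print(create_product("Widget Pro", "WID-001"))
```
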
  • Remote SDR Growth Leader | Scale Global Sales Development

    Influxdata (4.3 company rating)

    Remote job

    A leading technology company is seeking an experienced SDR Leader to manage and grow their Sales Development team. This position involves developing strategies to meet sales goals, fostering a high-performance culture, and supporting SDRs in their professional growth. Candidates should have 3 to 6 years of experience in sales development, with at least 3 years in a leadership role. The company offers competitive benefits including medical insurance, flexible time off, and a supportive work environment.
    $124k-176k yearly est. 5d ago
  • Oncology Statistics Lead - Clinical Development (Hybrid)

    Allergan (4.8 company rating)

    Remote job

    A global biopharmaceutical company in San Francisco seeks an Associate Director, Statistics for Oncology. This role provides statistical leadership for clinical development and life-cycle management, requiring significant experience in statistics or biostatistics. The ideal candidate will have over 10 years in the field, strong leadership abilities, and excellent communication skills. The position offers a hybrid work schedule and a competitive benefits package.
    $126k-162k yearly est. 1d ago
  • Remote Database Developer

    Lockheed Martin (4.8 company rating)

    Remote job

    Database developer to support front-end systems (as needed by developers across the organization, in support of web services, third-party, or internal development needs), to the exclusion of reporting needs from other departments. Developed code includes, but is not limited to, PL/SQL in the form of triggers, procedures, functions, and materialized views. Generates custom-driven applications for intra-department use by business users in a rapid application development platform (primarily APEX). Responsible for functional testing and deployment of code through the development life cycle. Works with end users to obtain business requirements. Responsible for developing, testing, improving, and maintaining new and existing processes to help users retrieve data effectively. Collaborates with administrators and business users to provide technical support and identify new requirements.
    Responsibilities:
    - Design stable, reliable, and effective database processes.
    - Solve database usage issues and malfunctions.
    - Gather user requirements and identify new features.
    - Provide data management support to users.
    - Ensure all database programs meet company and performance requirements.
    - Research and suggest new database products, services, and protocols.
    Requirements and Skills:
    - In-depth understanding of data management (e.g., permissions, security, and monitoring)
    - Excellent analytical and organization skills
    - An ability to understand front-end user requirements and a problem-solving attitude
    - Excellent verbal and written communication skills
    - Assumes responsibility for related duties as required or assigned
    - Stays informed regarding current computer technologies and relational database management systems, along with related business trends and developments
    - Consults with respective IT management in analyzing business functions and management needs, and seeks new and more effective solutions
    - Seeks out new systems and software that reduce processing time and/or provide better information availability and decision-making capability
    Job Type: Full-time
    Pay: $115,000-$128,000 yearly
    Expected hours: 40 per week
    Benefits: dental insurance, various health insurance options & wellness plans, vision insurance, paid time off (PTO)
    Required Knowledge: Considerable knowledge of online systems and the design of computer applications.
    Required Experience: One to three years of database development/administration experience.
    Skills/Abilities:
    - Strong creative and analytical thinking skills
    - Well organized, with strong project management skills
    - Good interpersonal and supervisory abilities
    - Ability to train and aid others
    $115k-128k yearly 60d+ ago
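
The PL/SQL-centric work described above can be sketched from Python with the python-oracledb driver. This is a minimal illustration under assumptions: the DSN, procedure, view, and bind values are hypothetical, and the pattern (call a stored procedure, then query with bind variables) is the point.

```python
# Hedged sketch: invoke a PL/SQL procedure and query a view via python-oracledb.
# Credentials, DSN, and object names are placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="***", dsn="dbhost/orclpdb")
with conn.cursor() as cur:
    # Call a hypothetical procedure that refreshes a materialized view.
    cur.callproc("refresh_sales_mv")

    # Parameterized query against the refreshed view using a named bind.
    for row in cur.execute(
        "SELECT region, total FROM sales_mv WHERE total > :min_total",
        min_total=10000,
    ):
        print(row)
conn.close()
```
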
  • Senior Data Engineer

    Roo (3.8 company rating)

    Remote job

    What We Do
    We're on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities. Roo empowers the largest network of over 20,000 veterinary professionals to help more than 9,000 animal hospitals provide quality care to more pets. Together, we've provided more than 3 million hours of healthcare, helping Veterinarians earn more than $200 million.
    About the Role
    Data is at the core of what we do at Roo. Our growing data team is tight-knit, essential, and recognized for the high-quality, innovative work we deliver. You will be right at the center of this work as we build and maintain the systems that power Roo's most impactful initiatives. This role will push you to use every part of your data and analytics engineering skill set to develop an extensible data ecosystem that serves humans, machine learning models, and internal AI agents alike.
    Your Responsibilities
    - Data Pipelines & Integrations: Design, develop, and maintain reliable end-to-end data pipelines (both batch and streaming) that connect internal and external systems in ways that best support marketplace growth, customer experience, and operational efficiency.
    - Data Storage, Warehousing & Database Support: Contribute to the performance, scalability, and reliability of our entire data ecosystem. Cultivate our dbt/Snowflake environment, develop and maintain our data-centric AWS assets, and partner with product engineers to support the health and efficiency of our transactional databases.
    - Data Transformation & Analytics Support (dbt): Work with analysts and other data stakeholders to engineer data structures and orchestrate workflows that encode core business logic. Produce clean, well-structured datasets that underpin traditional reporting, analyst experimentation, and ML and agentic AI use cases.
    - Data Quality, Governance & Metric Trust: Implement observability, testing, monitoring, validation, and documentation to ensure accuracy, stability, and consistency throughout the data stack. Help shape shared definitions, metrics, and data semantics across the company.
    - Business Collaboration & Insight Enablement: Join cross-functional squads and tiger teams to rapidly translate evolving data needs into scalable and extensible data models, metrics, and analytical frameworks. You will favor iterative delivery over one-shot solutions to support fast-moving OKRs and drive meaningful incremental progress week to week.
    - Technical Expertise & Mentorship: Bring strong expertise in modern code quality, data modeling, and data stack patterns. Mentor data stakeholders throughout the organization, share best practices, and meaningfully contribute to architectural and tooling decisions as the data stack evolves.
    Qualifications
    - Expert-level SQL and data modeling skills (5+ years of experience)
    - Intermediate proficiency with data-centric Python packages and Node.js data interaction frameworks like Kysely, Prisma, and Sequelize
    - Deep experience with Snowflake, dbt, MySQL, and AWS data services
    About You
    You care deeply about data quality, scalability, and clarity of purpose. You take pride in crafting systems that other engineers and analysts enjoy using and extending.
    You collaborate naturally with product engineers, data analysts, and business stakeholders, and you are comfortable translating ambiguity into clear technical plans. You are resilient and adaptable: you don't lose your footing when priorities shift, you work well with uncertainty and experimentation, and you make thoughtful decisions even when speed matters. You thrive in fast-growing environments and value iterative development. You know how to deliver impact quickly while still building toward a healthy, extensible stack. You bring experience across multiple business domains, such as product, marketing, sales, finance, and operations. You enjoy mentoring teammates, raising the technical bar, and contributing thoughtful perspectives to architectural decisions. You handle multiple simultaneous priorities well, communicate clearly, and maintain crisp expectation-setting with partners across the company.
    Exact compensation may vary based on skills, experience, and location.
    - California pay range: $170,000-$220,000 USD
    - New York pay range: $170,000-$220,000 USD
    - Washington pay range: $180,000-$200,000 USD
    - Colorado pay range: $145,000-$190,000 USD
    - Texas pay range: $145,000-$190,000 USD
    - North Carolina pay range: $135,000-$175,000 USD
    Core Values
    Our Core Values are what shape us as an organization, and we're looking for people who exhibit the same values in their professional life: Bias to Urgency, Drive Measurable Impact, Seek Understanding, Solve Customer Problems, and Have Fun!
    What to expect from working at Roo!
    For permanent, full-time employees, we offer:
    - Accelerated growth & learning potential.
    - Stipends for home office setup, continuing education, and monthly wellness.
    - Comprehensive health benefits to fit your needs, with the base medical plan covered at 100% and optional premium buy-up plans.
    - 401K.
    - Unlimited Paid Time Off.
    - Paid maternity/paternity and reproductive care leave.
    - Gifts on your birthday & anniversary.
    - Opportunity for domestic travel, including for regional team-building events.
    Overall, you would be part of a mission-driven company that will significantly empower the lives of all veterinary professionals and the health of the overall animal industry that seeks massive innovation. We have diverse, passionate & driven team members from a variety of backgrounds, and Roo is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status. We are committed to creating an inclusive environment for all employees and candidates. We understand that your individual experience may not check every box, but we still encourage you to apply even if you are not confident in every expectation listed. Ready to join the Roo-volution?!
    $180k-200k yearly 6d ago
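
The dbt/Snowflake and data-quality responsibilities above can be illustrated with a small freshness check run through the official Snowflake connector. Connection parameters and the fct_shifts table are assumptions for illustration, not Roo's actual schema.

```python
# Hedged sketch of a data freshness check against Snowflake.
# Account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # placeholder
    user="DATA_ENG",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

with conn.cursor() as cur:
    # Fail loudly if the (hypothetical) shifts fact table hasn't loaded today.
    cur.execute(
        "SELECT COUNT(*) FROM fct_shifts WHERE loaded_at >= CURRENT_DATE()"
    )
    (rows_today,) = cur.fetchone()
    if rows_today == 0:
        raise RuntimeError("fct_shifts is stale: no rows loaded today")
```
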
  • Principal Data Platform Engineer

    Motive (4.3 company rating)

    Remote job

    Who we are: Motive empowers the people who run physical operations with tools to make their work safer, more productive, and more profitable. For the first time ever, safety, operations, and finance teams can manage their drivers, vehicles, equipment, and fleet-related spend in a single system. Combined with industry-leading AI, the Motive platform gives you complete visibility and control, and significantly reduces manual workloads by automating and simplifying tasks. Motive serves nearly 100,000 customers, from Fortune 500 enterprises to small businesses, across a wide range of industries, including transportation and logistics, construction, energy, field service, manufacturing, agriculture, food and beverage, retail, and the public sector. Visit gomotive.com to learn more.
    About the Job: As a Principal Data Platform Engineer, you will take full ownership of key data platform initiatives and the life cycle of data management, including data ingestion, data processing, data storage, querying systems, and cost-reduction efforts, toward delivering product features to internal and external Motive customers. We are looking for a technical leader in the Data Platform area who has built full data ingestion, transformation, and analytics systems over AWS and Kubernetes at multiple companies. This individual will have faced and solved challenges affecting many feature areas along the way using best practices, and will be responsible for contributing significantly to Motive's Data Platform vision. The Data Platform team's work includes:
    1. Build scalable systems and services for data ingestion, access, processing, and query to enable data-driven product features.
    2. Collaborate and work closely with various stakeholders and our backend product teams to improve and add features to the platform.
    *This role is open to candidates in Central & East Coast time zones.
    What You'll Do:
    - Work with other leaders in the Platform area to define and plan the long-term strategy for the Data Platform.
    - Design and develop scalable distributed systems and frameworks for data management.
    - Focus on addressing fault-tolerance and high-availability issues, and work on scaling ingestion pipelines and improving and adding features to the ETL framework while maintaining SLAs on performance, reliability, and system availability.
    - Collaborate with engineers across teams to identify and deliver cross-functional features.
    - Participate in all aspects of the software development life cycle, from design to implementation and delivery.
    What We're Looking For:
    - 8+ years of hands-on software engineering experience
    - Backend programming skills, including multi-threading and concurrency, and proficiency in one or more languages, including Python
    - Strong CS fundamentals, including data structures, algorithms, and distributed systems
    - Experience designing, implementing, and operating highly scalable software systems and services
    - Experience building systems using technologies like Apache Kafka, Apache Spark, Airflow, and Kubernetes
    - Excellent troubleshooting skills and a track record of implementing creative solutions
    - Hands-on experience with containerized platforms like Docker and Kubernetes
    - BS in Computer Science or a related field; Masters preferred
    - Excellent verbal and written skills; you collaborate effectively with other teams and communicate clearly about your work.
    Pay Transparency
    Your compensation may be based on several factors, including education, work experience, and certifications. For certain roles, total compensation may include restricted stock units. Motive offers benefits including health, pharmacy, optical and dental care benefits, paid time off, sick time off, short-term and long-term disability coverage, and life insurance, as well as 401k contributions (all benefits are subject to eligibility requirements). Learn more about our benefits by visiting Motive Perks & Benefits.
    The compensation range for this position will depend on where you reside. Motive uses three geographic zones to determine pay range. For this role, the compensation ranges are:
    - San Francisco, California: $189,000-$236,000 USD
    - U.S. metropolitan areas (Los Angeles, San Diego, New York City Area, Seattle, Washington D.C.): $181,000-$226,000 USD
    - Other locations in the United States: $164,000-$205,000 USD
    Creating a diverse and inclusive workplace is one of Motive's core values. We are an equal opportunity employer and welcome people of different backgrounds, experiences, abilities, and perspectives. Please review our Candidate Privacy Notice here. UK Candidate Privacy Notice here.
    The applicant must be authorized to receive and access those commodities and technologies controlled under U.S. Export Administration Regulations. It is Motive's policy to require that employees be authorized to receive access to Motive products and technology.
    $189k-236k yearly 1d ago
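
One way to picture the ingestion-pipeline work this role describes is a minimal at-least-once Kafka consumer loop using the confluent-kafka client. This is a sketch under assumptions: broker address, consumer group, and topic names are placeholders, and the storage write is left as a stub.

```python
# Hedged sketch of an at-least-once Kafka ingestion loop (confluent-kafka).
# Broker, group, and topic names are placeholders.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka:9092",  # placeholder
    "group.id": "telemetry-ingest",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,        # commit only after a successful write
})
consumer.subscribe(["vehicle-telemetry"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")  # route to alerting in practice
            continue
        event = json.loads(msg.value())
        # ... write `event` to the storage/query layer here ...
        consumer.commit(message=msg)    # commit offset: at-least-once delivery
finally:
    consumer.close()
```
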
  • Backend Developer - Database - USA (Remote)

    Photon Group (4.3 company rating)

    Remote job

    Greetings, everyone!
    Who are we? For the past 20 years, we have powered many Digital Experiences for the Fortune 500. Since 1999, we have grown from a few people to more than 4,000 team members across the globe engaged in various Digital Modernization efforts. For a brief one-minute video about us, you can check *****************************
    What are we looking for? The requirement is for a DB/BE candidate with strong SQL and PL/SQL skills.
    Position Summary: We are seeking a highly skilled, backend-focused Staff Software Engineer to join our team. The ideal candidate will have extensive experience in backend development and system design, and a strong understanding of cloud-native software engineering principles.
    Responsibilities:
    - Develop backend services using Java and Spring Boot
    - Design and implement solutions deployed on Google Cloud Platform (GKE)
    - Work with distributed systems, including Google Cloud Spanner (Postgres dialect) and Confluent Kafka (or similar pub/sub tools)
    - Design, optimize, and troubleshoot complex SQL queries and stored procedures (e.g., PL/SQL) to support high-performance data operations and ensure data integrity across applications
    - Collaborate with teams to implement CI/CD pipelines using GitHub Actions and Argo CD
    - Ensure high performance and reliability through sound software engineering practices
    - Mentor and provide technical leadership to the frontend engineering team
    Required Qualifications:
    - 7+ years' experience in software engineering, from ideation to production deployment of IT solutions
    - 5+ years' experience in the full software development life cycle, including ideation, coding, coding standards, testing, code reviews, and production deployments
    - 5+ years of experience with backend Java, Spring Boot, and microservices
    - 3+ years of hands-on experience with a public cloud provider
    - 3+ years working with pub/sub tools like Kafka or similar
    - 3+ years of experience with database design/development (Postgres or similar)
    - 2+ years of experience with CI/CD tools (GitHub Actions, Jenkins, Argo CD, or similar)
    Preferred Qualifications:
    - Demonstrated experience with development and deployment of Minimum Viable Products (MVPs)
    - An innovative mindset, divergent thinking, and convergent actions
    - Familiarity with Kubernetes concepts; experience deploying services on GKE is a plus
    Compensation, Benefits and Duration
    Minimum Compensation: USD 44,000
    Maximum Compensation: USD 154,000
    Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable and good-faith estimate for the role. Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees. This position is available for independent contractors.
    No applications will be considered if received more than 120 days after the date of this post.
    $80k-109k yearly est. 43d ago
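
The query-optimization responsibility above usually starts with reading execution plans. Here is a small, hedged sketch using psycopg2 against a PostgreSQL-dialect database; the DSN, schema, and query are hypothetical, and Spanner's Postgres dialect may differ in plan output.

```python
# Hedged sketch: inspect a query plan with EXPLAIN ANALYZE via psycopg2.
# DSN and table/column names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=orders host=localhost user=app")  # placeholder
with conn, conn.cursor() as cur:
    cur.execute("""
        EXPLAIN ANALYZE
        SELECT c.customer_id, SUM(o.total)
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
        WHERE o.created_at >= now() - interval '7 days'
        GROUP BY c.customer_id
    """)
    for (plan_line,) in cur.fetchall():
        # Look for sequential scans, misestimated row counts, and slow nodes.
        print(plan_line)
```
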
  • Database Developer 1 (Remote)

    Apidel Technologies (4.1 company rating)

    Remote job

    Prepares, defines, structures, develops, implements, and maintains database objects. Analyzes query performance, identifies bottlenecks, and implements optimization techniques. Defines and implements interfaces to ensure that various applications and user-installed or vendor-developed systems interact with the required database systems. Creates database structures, writes and tests SQL queries, and optimizes database performance. Plans and develops test data to validate new or modified database applications. Works with business analysts and other stakeholders to understand requirements and integrate database solutions. Builds and implements database systems that meet specific business requirements, ensuring data integrity and security, as well as troubleshooting and resolving database issues. Designs and implements ETL pipelines to integrate data from various sources using SSIS. Responsible for various SQL jobs.
    Skills Required:
    - Strong understanding of SQL and DBMSs like MySQL, PostgreSQL, or Oracle
    - Ability to design and model relational databases effectively
    - Skills in writing and optimizing SQL queries for performance
    - Ability to troubleshoot and resolve database-related issues
    - Ability to communicate technical information clearly and concisely to both technical and non-technical audiences
    - Ability to collaborate effectively with other developers and stakeholders
    - Strong ETL experience, specifically with SSIS
    Skills Preferred:
    - Azure experience is a plus
    - .NET experience is a plus
    - GitHub experience is a plus
    Experience Required: 2 years of progressively responsible programming experience, or an equivalent combination of training and experience.
    Education Required: Bachelor's degree in Information Technology or Computer Science, or equivalent experience.
    $94k-121k yearly est. 5d ago
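
To make the SSIS/SQL Server ETL work concrete, here is a hedged sketch of one staged-load step expressed in Python with pyodbc rather than as an SSIS data flow; the server, schema, and table names are placeholders.

```python
# Hedged sketch of a staging-to-target load on SQL Server via pyodbc.
# Connection string and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=etl;Trusted_Connection=yes"
)
cur = conn.cursor()

# Move validated rows from staging into the target table, then clear staging.
cur.execute("""
    INSERT INTO dbo.customers (customer_id, name, email)
    SELECT customer_id, name, email
    FROM stg.customers
    WHERE email LIKE '%@%'
""")
cur.execute("TRUNCATE TABLE stg.customers")
conn.commit()
conn.close()
```
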
  • IS Database Developer II

    Careoregon (4.5 company rating)

    Remote job

    The IS Database Developer II is responsible for developing and maintaining database and ETL processes, as well as recommending and partnering in the design and development of effective solutions in support of business strategies. This role is essential to maturing CareOregon's database and ETL development model. This position spends substantial time evaluating, architecting, and implementing IS priorities (plan, design, install, and maintain).
    Estimated Hiring Range: $111,690.00 - $136,510.00
    Bonus Target: SIP Target, 5% annual
    Current CareOregon Employees: Please use the internal Workday site to submit an application for this job.
    Essential Responsibilities
    Database Development:
    - Actively participate in the design of custom databases and processes.
    - Provide advanced database design support to the organization; lead small projects with assistance from a Supervisor or Lead, and participate and consult on other projects.
    - Collaborate with other IS teams on best practices for database design and development.
    ETL Development:
    - Develop ETL processes for moderate to advanced activities.
    - Develop moderate to advanced databases to meet application and web needs.
    - Analyze business requirements; research and recommend solutions, including potential risks and mitigation.
    - Develop and maintain appropriate technology documentation, including current design and operation.
    Standards and Policy Administration:
    - Propose requirements, standards, and best practices for database and ETL development.
    - Participate in the ongoing review of existing systems to ensure they are designed to comply with established standards and to empower business operations.
    Vendor Coordination and Relations:
    - Conduct product and vendor research, and present recommendations to more advanced database developers and/or management.
    - Establish and maintain effective working relationships with vendors and related equipment suppliers, including installation and repair of services.
    Experience and/or Education
    Required: Minimum 3 years of database and ETL development.
    Experience should include some or all of the following:
    - Database development and maintenance
    - ETL development and maintenance
    - Systems analysis and design
    - Agile/Scrum methodology
    Note: For data warehouse focused roles, minimum 3 years' experience developing ETL for loading a dimensional model using a combination of T-SQL and SSIS 2012, 2014, or 2016.
    Preferred:
    - Bachelor's degree in Computer Science, Information Systems, or a related field
    - Additional experience in related technology support and/or operational positions
    - QNXT experience
    Knowledge, Skills and Abilities Required
    Knowledge: Working knowledge/skills with the following:
    - Microsoft SQL Server
    - ETL tools, such as SSIS or Informatica
    - Visual Studio
    - Unit and integration testing
    Note: For data warehouse focused roles, advanced knowledge/skills of the dimensional model are required in lieu of the knowledge/skill requirements above. General knowledge of BizTalk is preferred.
    Skills and Abilities:
    - Advanced abilities in troubleshooting system performance issues and root causes
    - Effective communication skills, including listening, verbal, written, and customer service
    - Ability to clearly articulate policies and instructions
    - Demonstrated progress in conveying the appropriate level of detail effectively to all levels of the organization, including non-technical staff
    - Ability to recommend policies, document risks, and propose solutions to information technology management and senior leadership
    - A high degree of initiative and motivation
    - Ability to effectively collaborate with coworkers, staff, and leaders across all departments
    - Ability to work effectively with diverse individuals and groups
    - Ability to learn, focus, understand, and evaluate information and determine appropriate actions
    - Ability to accept direction and feedback, as well as tolerate and manage stress
    - Ability to see, read, and perform repetitive finger and wrist movement for at least 6 hours/day
    - Ability to hear and speak clearly for at least 3-6 hours/day
    Working Conditions
    Work Environment(s): Indoor/Office
    Member/Patient Facing: No
    Hazards: May include, but are not limited to, physical and ergonomic hazards.
    Equipment: General office equipment
    Travel: May include occasional required or optional travel outside of the workplace; the employee's personal vehicle, local transit, or other means of transportation may be used.
    Work Location: Work from home
    We offer a strong Total Rewards Program. This includes competitive pay, bonus opportunity, and a comprehensive benefits package. Eligibility for bonuses and benefits is dependent on factors such as the position type and the number of scheduled weekly hours. Benefits-eligible employees qualify for benefits beginning on the first of the month on or after their start date. CareOregon offers medical, dental, vision, life, AD&D, and disability insurance, as well as a health savings account, flexible spending account(s), lifestyle spending account, employee assistance program, wellness program, discounts, and multiple supplemental benefits (e.g., voluntary life, critical illness, accident, hospital indemnity, identity theft protection, pre-tax parking, pet insurance, 529 College Savings, etc.). We also offer a strong retirement plan with employer contributions. Benefits-eligible employees accrue PTO and Paid State Sick Time based on hours worked/scheduled hours and the primary work state.
    Employees may also receive paid holidays, volunteer time, jury duty, bereavement leave, and more, depending on eligibility. Non-benefits-eligible employees can enjoy 401(k) contributions, Paid State Sick Time, wellness and employee assistance program benefits, and other perks. Please contact your recruiter for more information.
    We are an equal opportunity employer
    CareOregon is an equal opportunity employer. The organization selects the best individual for the job based upon job-related qualifications, regardless of race, color, religion, sexual orientation, national origin, gender, gender identity, gender expression, genetic information, age, veteran status, ancestry, marital status, or disability. The organization will make a reasonable accommodation to known physical or mental limitations of a qualified applicant or employee with a disability, unless the accommodation would impose an undue hardship on the operation of our organization.
    $111.7k-136.5k yearly 29d ago
  • Database Developer

    Oddball (3.9 company rating)

    Remote job

    Oddball believes that the best products are built when companies understand and value the things they are working on. We value learning, growth, and the ability to make a big impact at a small company. We believe that we can make big changes happen and improve the daily lives of millions of people by bringing quality software to the federal space.
    We are seeking a Database Developer to design, build, and maintain secure, scalable data pipelines that enable effective use of enterprise data. In this role, you'll collaborate with engineers, analysts, and data stewards to deliver reliable datasets and models that support analytics, reporting, and decision-making.
    What You'll Be Doing:
    You'll design, build, and maintain database and data integration solutions that support enterprise intelligence and data delivery efforts. This includes developing and optimizing database structures, implementing ETL pipelines, and supporting data ingestion and transformation across multiple systems. You'll help ensure data is delivered accurately, efficiently, and securely to downstream users and platforms, while supporting integration efforts across Military Health, readiness, and federal health data sources.
    What you'll bring:
    - Experience developing databases and/or ETL pipelines in enterprise environments
    - Strong SQL skills and familiarity with data modeling concepts
    - Experience integrating data across disparate systems and formats
    - Understanding of data lifecycle management and performance optimization
    - Ability to collaborate with analysts, platform teams, and stakeholders
    - Comfort working in structured delivery environments with defined methodologies
    - Exposure to machine learning pipelines or advanced analytics integration
    - Prior experience supporting DHA or other federal healthcare programs
    - Performs other related duties as assigned
    Requirements: Applicants must be authorized to work in the United States. In alignment with federal contract requirements, certain roles may also require U.S. citizenship and the ability to obtain and maintain a federal background investigation and/or a security clearance.
    Education: Bachelor's Degree
    Benefits:
    - Fully remote
    - Tech & education stipend
    - Comprehensive benefits package
    - Company-match 401(k) plan
    - Flexible PTO, paid holidays
    Oddball is an Equal Opportunity Employer and does not discriminate against applicants based on race, religion, color, disability, medical condition, legally protected genetic information, national origin, gender, sexual orientation, marital status, gender identity or expression, sex (including pregnancy, childbirth, or related medical conditions), age, veteran status, or other legally protected characteristics. Any applicant with a mental or physical disability who requires an accommodation during the application process should contact an Oddball HR representative to request such an accommodation by emailing *************
    Compensation: At Oddball, it's important that each employee is compensated competitively and fairly, in alignment with state legal requirements. A range for the position is listed below. Be advised: actual offer details are determined by job category, job location, and candidate skill level.
    United States Wage Range: $90,000 - $130,000
    $90k-130k yearly 13d ago
  • PostgreSQL Database Developer

    Contact Government Services, LLC

    Remote job

    PostgreSQL Database Developer
    Employment Type: Full Time, Experienced level
    Department: Information Technology
    CGS is seeking a PostgreSQL Database Developer to join our team supporting a rapidly growing Data Analytics and Business Intelligence platform focused on providing data solutions that empower our federal customers. You will support a migration from the current Oracle database to a Postgres database and manage the database environments proactively. As we continue our growth, you will play a key role in ensuring the scalability of our data systems.
    CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities.
    Skills and attributes for success:
    - Drive efforts to migrate from the current Oracle database to the new Microsoft Azure Postgres database
    - Create and maintain technical documentation using defined technical documentation templates, and gain an in-depth knowledge of the business data to propose and implement effective solutions
    - Collaborate with internal and external parties to transform high-level technical objectives into comprehensive technical requirements
    - Ensure the availability and performance of the databases that support our systems, ensuring that they have sufficient resources allocated to support high resilience and speed
    - Perform, and assist developers in, performance tuning
    - Proactively monitor the database systems to ensure secure services with minimum downtime, and improve maintenance of the databases, including rollouts, patching, and upgrades
    - Work within a structured and Agile development approach
    Qualifications:
    - Bachelor's degree
    - Must be a US citizen
    - 7 years of experience administrating PostgreSQL databases in Linux environments
    - Experience setting up, monitoring, and maintaining PostgreSQL instances
    - Experience implementing and maintaining PostgreSQL backup and disaster recovery processes
    - Experience migrating Oracle schemas, packages, views, and triggers to Postgres using the Ora2Pg tool
    Ideally, you will also have:
    - Experience implementing and maintaining data warehouses
    - Experience with AWS RDS for PostgreSQL
    - Experience with Oracle databases
    - Experience leveraging the Ora2Pg tool
    - Experience working in cloud environments such as Azure and/or AWS
    - Prior federal consulting experience
    Our Commitment: Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our clients' specific needs. We are committed to solving the most challenging and dynamic problems. For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work.
    Here at CGS, we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our consumers, maintaining those relationships for years to come. We care about our employees. Therefore, we offer a comprehensive benefits package:
    - Health, Dental, and Vision
    - Life Insurance
    - 401k
    - Flexible Spending Account (Health, Dependent Care, and Commuter)
    - Paid Time Off and Observance of State/Federal Holidays
    Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
    Join our team and become part of government innovation! Explore additional job opportunities with CGS on our Job Board: *************************************
    For more information about CGS, please visit ************************** or contact: Email: *******************
    $78k-104k yearly est. 60d+ ago
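
The proactive-monitoring responsibility above can be approximated with a small psycopg2 script that flags long-running queries via PostgreSQL's pg_stat_activity view. Connection details and the five-minute threshold are assumptions for illustration.

```python
# Hedged sketch: flag long-running PostgreSQL queries via pg_stat_activity.
# DSN and the runtime threshold are placeholders.
import psycopg2

conn = psycopg2.connect("host=pghost dbname=appdb user=monitor")  # placeholder
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, now() - query_start AS runtime, state, left(query, 80)
        FROM pg_stat_activity
        WHERE state <> 'idle'
          AND now() - query_start > interval '5 minutes'
        ORDER BY runtime DESC
    """)
    for pid, runtime, state, query in cur.fetchall():
        # In practice, route these to alerting rather than stdout.
        print(f"pid={pid} runtime={runtime} state={state} query={query}")
conn.close()
```
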
  • Lead Data Engineer & Modeler, AI - Hybrid

    Bigcommerce (4.8 company rating)

    Remote job

    Welcome to the Agentic Commerce Era
    At Commerce, our mission is to empower businesses to innovate, grow, and thrive with our open, AI-driven commerce ecosystem. As the parent company of BigCommerce, Feedonomics, and Makeswift, we connect the tools and systems that power growth, enabling businesses to unlock the full potential of their data, deliver seamless and personalized experiences across every channel, and adapt swiftly to an ever-changing market. Simply said, we help businesses confidently solve complex commerce challenges so they can build smarter, adapt faster, and grow on their own terms. If you want to be part of a team of bold builders, sharp thinkers, and technical trailblazers working together to shape the future of commerce, this is the place for you.
    BigCommerce is building the foundation for the next generation of AI-driven commerce. As a Lead AI Engineer, Platform & Infrastructure, you'll define and scale the systems that make this transformation possible. This role sits at the intersection of data engineering, MLOps, and applied AI enablement, and is responsible for building the secure, scalable, and high-performance infrastructure that supports AI/ML use cases across the company. You'll collaborate across product, engineering, and data teams to design the unified AI platform layer, powering internal intelligence, customer-facing AI features, and advanced analytics. From model lifecycle management to data pipelines and inference infrastructure, you'll drive the architecture and operational excellence that allow BigCommerce to experiment, deploy, and iterate on AI at scale. If you're passionate about enabling intelligence through infrastructure, designing modern ML ecosystems, and operationalizing AI across a fast-scaling SaaS platform, this role will put you at the center of BigCommerce's AI evolution.
    What You'll Do
    AI Platform Architecture:
    - Partner with the Enterprise Architect and Principal Data Architect to design the company-wide AI/ML platform strategy across GCP and AWS.
    - Build scalable systems for model training, evaluation, deployment, and monitoring.
    - Define best practices for data ingestion, feature stores, vector databases, and model registries.
    - Integrate AI workflows into existing analytics and product pipelines.
    Infrastructure & Reliability:
    - Implement CI/CD for ML pipelines (MLOps), including model versioning, validation, and automated deployment.
    - Ensure platform reliability, observability, and performance at enterprise scale.
    - Manage GPU/TPU resources and optimize compute efficiency for training and inference workloads.
    - Contribute to cost-optimization and security best practices across the AI infrastructure.
    Cross-Functional Collaboration:
    - Partner with data scientists, applied ML engineers, and product teams to translate model requirements into scalable architecture.
    - Work closely with the data engineering team to ensure AI pipelines align with governance and data quality standards.
    - Collaborate with software engineers to integrate AI services and APIs into production systems.
    Governance & Responsible AI:
    - Champion data and model governance, including lineage, reproducibility, and compliance (GDPR, SOC, ISO).
    - Establish monitoring frameworks for model drift, bias detection, and ethical AI use.
    - Build secure and transparent systems that support trust in AI-driven decisions.
    What You'll Bring
    - 7+ years in data or ML engineering, with experience designing production-grade AI infrastructure.
    - A strong technical foundation in MLOps, data pipelines, and distributed systems.
    - Hands-on experience with:
      - Cloud AI platforms (Vertex AI, SageMaker, Bedrock, or equivalent)
      - Orchestration frameworks (Airflow, Kubeflow, MLflow, or Metaflow)
      - Cloud data stacks (BigQuery, Snowflake, GCS/S3, Terraform)
      - Model serving tools (FastAPI, BentoML, Ray Serve, or Triton Inference Server)
    - Proficiency in Python, SQL, and Git-based CI/CD.
    - Experience integrating LLMs and vector databases (e.g., Pinecone, FAISS, Weaviate, Vertex Matching Engine).
    - Familiarity with Kubernetes, Docker, and Terraform for scalable deployment.
    - Strong communication skills, with the ability to partner across disciplines and simplify complex technical systems.
    What You'll Impact
    - The AI foundation powering every intelligent capability within BigCommerce, from predictive analytics to generative assistants.
    - The tools and frameworks that enable product and engineering teams to build, test, and ship AI faster.
    - The reliability, governance, and scalability of BigCommerce's enterprise-wide AI ecosystem.
    Why Join Us
    You'll play a critical role in shaping how BigCommerce operationalizes AI, not just as a feature but as a platform capability. You'll join a collaborative, ambitious, and fast-evolving data organization dedicated to creating systems that enable intelligence at scale.
    (Pay Transparency Range: $116,000-$174,000) The exact salary will be dependent on the successful candidate's location, relevant knowledge, skills, and qualifications.
    Inclusion and Belonging
    At Commerce, we believe that celebrating the unique histories, perspectives, and abilities of every employee makes a difference for our company, our customers, and our community. We are an equal opportunity employer, and the inclusive atmosphere we build together will make room for every person to contribute, grow, and thrive. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the interview process, to perform essential job functions, and to receive other benefits and privileges of employment. If you need an accommodation in order to interview at Commerce, please let us know during any of your interactions with our recruiting team. Learn more about the Commerce team, culture, and benefits at *********************************
    Protect Yourself Against Hiring Scams: Our Corporate Disclaimer
    Commerce, along with many other employers, has become the subject of fraudulent job offers to hopeful prospective job seekers. Be advised: Commerce does not offer jobs to individuals who do not go through our formal hiring process. Commerce will never: require payment of recruitment fees from candidates; request personally identifiable information through unsanctioned websites or applications; attempt to solicit money from you as part of the hiring process or as part of an employment offer; or solicit money to complete visa requirements as part of a job offer. If you receive unsolicited offers of employment from Commerce, we urge you to be extremely cautious and avoid engaging or responding.
    $116k-174k yearly 60d+ ago
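
The vector-database requirement above can be illustrated with a minimal FAISS example: index synthetic embeddings and query nearest neighbors. The dimension and data here are made up; a production system would use real embeddings and, at scale, an approximate index or managed store.

```python
# Minimal FAISS sketch: exact nearest-neighbor search over synthetic vectors.
# Dimension and data are illustrative assumptions.
import numpy as np
import faiss

dim = 384                                   # e.g., a sentence-embedding size
embeddings = np.random.rand(10_000, dim).astype("float32")

index = faiss.IndexFlatL2(dim)              # exact L2 search; swap for IVF/HNSW at scale
index.add(embeddings)

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 5)     # top-5 nearest neighbors
print(ids[0], distances[0])
```
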
  • Senior Data Engineer

    Rain (3.7 company rating)

    Remote job

    About the Company
    At Rain, we're rebuilding the global financial pipes money flows through. Our infrastructure makes stablecoins usable in the real world by powering credit card transactions, cross-border payments, B2B purchases, remittances, and more. We partner with fintechs, neobanks, and institutions to help them launch solutions that are global, inclusive, and efficient. If you're curious, bold, and excited to help shape a borderless financial system, we'd love to talk.
    Our Ethos
    Operating at the epicenter of stablecoin innovation means moving fast and thinking globally. Our team reflects the diverse, international audiences we serve. We hire people who stay agile as the tide ebbs and flows, fix what's broken without waiting, chase trends before they peak, and remember to have fun through it all.
    About the Role
    We're looking for Rain's first dedicated Data Engineer: a hands-on builder who will architect the ingestion, pipelines, and infrastructure that power our data ecosystem. As Rain scales to millions of end users across payments, card programs, and blockchain rails, your systems will ensure every team has access to timely, accurate, and trustworthy data. Reporting to the CTO and partnering closely with Analytics, Operations, Product, Compliance, and BD, you will own Rain's data pipelines end-to-end: from ingesting raw transaction and blockchain data, to orchestrating transformations, to ensuring quality, observability, and reliability. This role is foundational: you'll be shaping the data backbone that underpins analytics, customer reporting, operational tooling, and on-chain integrations. If you love building in fast-moving environments, care deeply about data quality, and want to own core infrastructure at the heart of a modern fintech, this role is for you.
    What you'll do
    - Design, build, and maintain Rain's core data pipelines, including ingestion from payments processors, card issuers, blockchain nodes, internal services, and third-party APIs
    - Own orchestration and workflow management, implementing Airflow, Dagster, or similar tools to ensure reliable, observable, and scalable data processing
    - Architect and manage Rain's data warehouse (Snowflake, BigQuery, or Redshift), driving performance, cost optimization, partitioning, and access patterns
    - Develop high-quality ELT/ETL transformations to structure raw logs, transactions, ledgers, and on-chain events into clean, production-grade datasets
    - Implement data quality frameworks and observability (tests, data contracts, freshness checks, lineage) to ensure every dataset is trustworthy
    - Partner closely with backend engineers to instrument new events, define data contracts, and improve telemetry across Rain's infrastructure
    - Support Analytics and cross-functional teams by delivering well-modeled, well-documented tables that power dashboards, ROI analyses, customer reporting, and key business metrics
    - Own data reliability at scale, leading root-cause investigations, reducing pipeline failures, and building monitoring and alerting systems
    - Evaluate and integrate new tools across ingestion, enrichment, observability, and developer experience, raising the bar on performance and maintainability
    - Help set the long-term technical direction for Rain's data platform as we scale across new products, regions, and chains
    What we're looking for
    - Data infrastructure builder: you thrive in early-stage environments, owning pipelines and platforms end-to-end and choosing simplicity without sacrificing reliability
    - Expert data engineer: strong Python and SQL fundamentals, with real experience building production-grade ETL/ELT
    - Workflow & orchestration fluent: hands-on experience with Airflow, Dagster, Prefect, or similar systems
    - Warehouse & modeling savvy: comfortable designing schemas, optimizing performance, and operating modern cloud warehouses (Snowflake, BigQuery, Redshift)
    - Quality-obsessed: you care deeply about data integrity, testing, lineage, and observability
    - Systems thinker: you see data as a platform; you design for reliability, scale, and future users
    - Collaborator: you work well with backend engineers, analytics engineers, and cross-functional stakeholders to define requirements and deliver outcomes
    - Experienced: 5-7+ years in data engineering roles, ideally within fintech, payments, B2B SaaS, or infrastructure-heavy startups
    Nice to have, but not mandatory
    - Experience ingesting and processing payment data, transaction logs, or ledger systems
    - Exposure to smart contracts, blockchain data structures, or on-chain event ingestion
    - Experience building data tooling for compliance, risk, or regulated environments
    - Familiarity with dbt and/or semantic modeling to support analytics layers
    - Prior experience standing up data platforms from 0→1 at early-stage companies
    Things that enable a fulfilling, healthy, and happy experience at Rain:
    - Unlimited time off 🌴 Unlimited vacation can be daunting, so we require Rainmakers to take at least 10 days off.
    - Flexible working ☕ We support a flexible workplace. If you feel comfortable at home, please work from home. If you'd like to work with others in an office, feel free to come in. We want everyone to be able to work in the environment in which they are their most confident and productive selves. New Rainmakers will receive a stipend to create a comfortable home environment.
    - Easy-to-access benefits 🧠 For US Rainmakers, we offer comprehensive health, dental, and vision plans for you and your dependents, as well as a 100% company-subsidized life insurance plan.
    - Retirement goals 💡 Plan for the future with confidence. We offer a 401(k) with a 4% company match.
    - Equity plan 📦 We offer every Rainmaker an equity option plan so we can all benefit from our success.
    - Rain Cards 🌧️ We want Rainmakers to be knowledgeable about our core products and services. To support this mission, we issue a card for our team to use for testing.
    - Health and Wellness 📚 High performance begins from within. Rainmakers are welcome to use their card for eligible health and wellness spending like gym memberships/fitness classes, massages, acupuncture - whatever recharges you!
    - Team summits ✨ Summits play an important role at Rain! Time spent together helps us get to know each other, strengthen our relationships, and build a common destiny. Expect team and company off-sites, both domestically and internationally.
    $102k-148k yearly est. 5d ago
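
The orchestration work this role describes might look like the compact Airflow sketch below (Airflow 2.4+ syntax). The DAG id, schedule, and task names are assumptions, and the task bodies are left as stubs.

```python
# Hedged Airflow sketch: hourly ingestion followed by a freshness check.
# DAG id, schedule, and task logic are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_onchain_events(**_):
    ...  # pull raw events from a blockchain node or processor API

def check_freshness(**_):
    ...  # raise if the newest event is older than the allowed lag

with DAG(
    dag_id="onchain_events",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_onchain_events)
    freshness = PythonOperator(task_id="freshness_check", python_callable=check_freshness)
    ingest >> freshness  # quality gate runs only after ingestion succeeds
```
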
  • Data Engineer II

    Capital Rx (4.1 company rating)

    Remote job

    About Judi Health
    Judi Health is an enterprise health technology company providing a comprehensive suite of solutions for employers and health plans, including: Capital Rx, a public benefit corporation delivering full-service pharmacy benefit management (PBM) solutions to self-insured employers; Judi Health™, which offers full-service health benefit management solutions to employers, TPAs, and health plans; and Judi, the industry's leading proprietary Enterprise Health Platform (EHP), which consolidates all claim administration-related workflows in one scalable, secure platform. Together with our clients, we're rebuilding trust in healthcare in the U.S. and deploying the infrastructure we need for the care we deserve. To learn more, visit ****************
    Location: Remote (for non-local) or Hybrid (local to the NYC area or Denver, CO)
    Position Summary: We are seeking a highly motivated and talented Data Engineer to join our team and play a critical role in shaping the future of healthcare data management. This individual will be a key contributor in building robust, scalable, and accurate data systems that empower operational and analytics teams to make informed decisions and drive positive outcomes.
    Position Responsibilities:
    - Lead the relationship with operational and analytics teams to translate business needs into effective data solutions
    - Architect and implement ETL workflows leveraging Capital Rx platforms and technologies such as Python, dbt, SQLAlchemy, Terraform, Airflow, Snowflake, and Redshift
    - Conduct rigorous testing to ensure the flawless execution of data pipelines before production deployment
    - Identify, recommend, and implement process improvement initiatives
    - Proactively identify and resolve data-related issues, ensuring system reliability and data integrity
    - Lead moderately complex projects
    - Provide ongoing maintenance and support for critical data infrastructure, including 24x7 on-call availability
    - Adhere to the Capital Rx Code of Conduct, including reporting of noncompliance
    Required Qualifications:
    - Bachelor's degree in Computer Science, Information Technology, or a related field
    - 2+ years' experience working with Airflow, dbt, and Snowflake
    - Expertise in data warehousing architecture techniques and familiarity with the Kimball methodology
    - 3+ years' experience with a proven track record as a Data Engineer, displaying the ability to design, implement, and maintain complex data pipelines
    - 1+ years' experience with Python and SQL
    - Capacity to analyze the company's broader data landscape and architect scalable data solutions that support growth
    - Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders
    - A self-motivated and detail-oriented individual with the ability to tackle and solve intricate technical challenges
    Preferred Qualifications:
    - 1-3 years of experience as a Data Engineer, ideally in the healthcare or PBM sector
    - Advanced proficiency with Airflow, dbt, and Snowflake, coupled with 3+ years of SQL development and Python experience
    This range represents the low and high end of the anticipated base salary range for the NY-based position. The actual base salary will depend on several factors, such as experience, knowledge, and skills, and may change if the location of the job changes.
Salary Range: $120,000-$140,000 USD

All employees are responsible for adherence to the Capital Rx Code of Conduct, including the reporting of non-compliance. Judi Health values a diverse workplace and celebrates the diversity that each employee brings to the table. We are proud to provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, medical condition, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. By submitting an application, you agree to the retention of your personal data for consideration for a future position at Judi Health. More details about Judi Health's privacy practices can be found at *********************************************
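For context on the stack this posting names (Airflow, Snowflake, Redshift), here is a minimal sketch of an extract-transform-load DAG using Airflow's TaskFlow API. The DAG name, sample rows, and transformation logic are hypothetical illustrations, not Capital Rx's actual pipeline.

    # Minimal Airflow TaskFlow sketch of an extract -> transform -> load pipeline.
    # All names (claims_ingest, the sample fields) are hypothetical placeholders.
    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def claims_ingest():

        @task
        def extract() -> list[dict]:
            # A real task would pull from an upstream source (e.g., S3 or an API).
            return [{"claim_id": 1, "amount": "125.50"}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Normalize types before loading.
            return [{**r, "amount": float(r["amount"])} for r in rows]

        @task
        def load(rows: list[dict]) -> None:
            # A real task would write to Snowflake/Redshift via a provider hook.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))

    claims_ingest()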
    $120k-140k yearly Auto-Apply 5d ago
  • Data Engineer (Remote, Continental United States)

    ICA.Ai 4.7company rating

    Remote job

About ICA, Inc.
International Consulting Associates, Inc. is a rapidly growing company located in the D.C./Metro area. We were founded in 2009 to assist government clients with evaluating and achieving their objectives. We have become a trusted advisor, helping our clients by offering cutting-edge innovation and solutions to complex projects. Our small company has grown significantly, and we're overjoyed at the opportunity to expand yet again! We are results-focused and have a proven track record supporting federal agencies and large government services primes in three main areas: Research and Data Analysis, Advanced Data Science, and Strategic Services. We currently support multiple analytics and research programs across HHS. At ICA, we believe our success starts with our people. We foster a collaborative "one team" environment where work-life balance isn't just talked about - it's prioritized. We're building dynamic, highly skilled teams in a welcoming and supportive atmosphere. If you're passionate about using your technical expertise to make a difference, we want to talk to you. We are looking for a Data Engineer to join our growing team!

ABOUT THE ROLE: We are seeking an experienced Data Engineer to build and maintain data infrastructure supporting ICA's federal agency clients, including the FDA. You'll develop pipelines and platforms that transform raw data into actionable insights, working with analysts, data scientists, and developers in an agile environment.

ABOUT YOU: As a data engineer with software development expertise, you bring a blend of analytical rigor and coding craftsmanship to every project. You excel at designing scalable data pipelines, optimizing performance, and ensuring data integrity across complex systems. Your strong programming skills allow you to build robust tools and services that empower data-driven decision-making. You collaborate seamlessly with cross-functional teams, translating business needs into technical solutions with clarity and precision. Above all, you are passionate about continuous learning and innovation, always seeking ways to improve systems and deliver value.

RESPONSIBILITIES:
* Design and maintain scalable data pipelines and ETL/ELT processes
* Build document processing pipelines for text and image extraction
* Architect AWS-based data solutions using S3, Glue, Redshift, RDS, ECS, etc.
* Optimize SQL queries and develop Python-based data processing workflows
* Troubleshoot data pipeline issues and implement solutions
* Ensure data pipeline performance, scalability, and security

REQUIRED QUALIFICATIONS:
* 4+ years of experience working with ETL, Data Modeling, and Data Architecture
* Expertise in writing and optimizing SQL
* Experience with Big Data technologies such as Spark
* Intermediate Linux skills
* Experience managing large data warehouses or data lakes
* Minimum of 1 year of programming experience in Python
* Experience with data and cloud engineering
* Knowledge of cloud computing services
* Bachelor's degree or higher
* Ability to obtain a Public Trust Clearance

PREFERRED QUALIFICATIONS:
* Databricks Lakehouse platform experience
* ML pipeline or graph algorithm implementation
* Unstructured data processing expertise

BENEFITS: We invest in our team members so you can live your best life professionally and personally, offering a competitive salary and benefits.
* Health Insurance - 100% employer-paid premiums - ICA covers the full cost of one of three offered medical plans
* Dental Insurance
* Vision Insurance
* Health Spending Account
* Flexible Spending Account
* Life and Disability Insurance
* 401(k) plan with company match
* Paid Time Off (Vacation, Sick Leave, and Holidays)
* Education and Professional Development Assistance
* Remote work from anywhere within the continental United States

LOCATION & TELEWORK: This is a remote position. Candidates residing in the DMV area are preferred.

ADDITIONAL INFORMATION: ICA is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, national origin, genetics, disability status, protected veteran status, age, or any other characteristic protected by state, federal or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
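As an illustration of the Spark-on-S3 skills this posting asks for, below is a minimal PySpark sketch of a batch cleanup job. The bucket paths and column names are invented placeholders, not an actual ICA pipeline.

    # Minimal PySpark sketch: read raw records from S3, deduplicate, stamp, write back.
    # Bucket paths and the doc_id/extracted_text columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("document_etl").getOrCreate()

    raw = spark.read.parquet("s3://example-bucket/raw/documents/")
    curated = (
        raw.dropDuplicates(["doc_id"])                   # drop exact re-ingests
           .filter(F.col("extracted_text").isNotNull())  # keep rows with usable text
           .withColumn("processed_at", F.current_timestamp())
    )
    curated.write.mode("overwrite").parquet("s3://example-bucket/curated/documents/")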
    $87k-117k yearly est. Auto-Apply 60d+ ago
  • Principal Data Engineer - ML Platforms

    Altarum 4.5company rating

    Remote job

Altarum | Data & AI Center of Excellence (CoE)

Altarum is building the future of data and AI infrastructure for public health - and we're looking for a Principal Data Engineer - ML Platforms to help lead the way. In this cornerstone role, you will design, build, and operationalize the modern data and ML platform capabilities that power analytics, evaluation, AI modeling, and interoperability across all Altarum divisions. If you want to architect impactful systems, enable data science at scale, and help ensure public health and Medicaid programs operate with secure, explainable, and trustworthy AI - this role is for you.

What You'll Work On
This role blends deep engineering with applied ML enablement:
* ML Platform Engineering: modern lakehouse architecture, pipelines, MLOps lifecycle
* Applied ML enablement: risk scoring, forecasting, Medicaid analytics
* NLP/Generative AI support: RAG, vectorization, health communications
* Causal ML operationalization: evaluation modeling workflows
* Responsible/Trusted AI engineering: model cards, fairness, compliance
Your work ensures that Altarum's public health and Medicaid programs run on secure, scalable, reusable, and explainable data and AI infrastructure.

What You'll Do

Platform Architecture & Delivery
* Design and operate modern, cloud-agnostic lakehouse architecture using object storage, SQL/ELT engines, and dbt.
* Build CI/CD pipelines for data, dbt, and model delivery (GitHub Actions, GitLab, Azure DevOps).
* Implement MLOps systems: MLflow (or equivalent), feature stores, model registry, drift detection, automated testing.
* Engineer solutions in AWS and AWS GovCloud today, with portability to Azure Gov or GCP.
* Use Infrastructure-as-Code (Terraform, CloudFormation, Bicep) to automate secure deployments.

Pipelines & Interoperability
* Build scalable ingestion and normalization pipelines for healthcare and public health datasets, including FHIR R4 / US Core (strongly preferred), HL7 v2 (strongly preferred), Medicaid/Medicare claims & encounters (strongly preferred), SDOH & geospatial data (preferred), and survey, mixed-methods, and qualitative data.
* Create reusable connectors, dbt packages, and data contracts for cross-division use.
* Publish clean, conformed, metrics-ready tables for Analytics Engineering and BI teams.
* Support Population Health in turning evaluation and statistical models into pipelines.

Data Quality, Reliability & Cost Management
* Define SLOs and alerting; instrument lineage & metadata; ensure ≥95% of data tests pass.
* Perform performance and cost tuning (partitioning, storage tiers, autoscaling) with guardrails and dashboards.

Applied ML Enablement
* Build production-grade pipelines for risk prediction, forecasting, cost/utilization models, and burden estimation.
* Develop ML-ready feature engineering workflows and support time-series/outbreak detection models.
* Integrate ML assets into standardized deployment workflows.

Generative AI Enablement
* Build ingestion and vectorization pipelines for surveys, interviews, and unstructured text.
* Support RAG systems for synthesis, evaluation, and public health guidance.
* Enable Palladian Partners with secure, controlled-generation environments.

Causal ML & Evaluation Engineering
* Translate R/Stata/SAS evaluation code into reusable pipelines.
* Build templates for causal inference workflows (DID, AIPW, CEM, synthetic controls).
* Support operationalization of ARA's applied research methods at scale.

Responsible AI, Security & Compliance
* Implement Model Card Protocol (MCP) and fairness/explainability tooling (SHAP, LIME).
* Ensure compliance with HIPAA, 42 CFR Part 2, IRB/DUA constraints, and NIST AI RMF standards.
* Enforce privacy-by-design: tokenization, encryption, least-privilege IAM, and VPC isolation.

Reuse, Shared Services, and Enablement
* Develop runbooks, architecture diagrams, repo templates, and accelerator code.
* Pair with data scientists, analysts, and SMEs to build organizational capability.
* Provide technical guidance for proposals and client engagements.

Your First 90 Days
You will make a meaningful impact fast. Expected outcomes include:
* Platform skeleton operational: repo templates, CI/CD, dbt project, MLflow registry, tests.
* Two pipelines in production (e.g., FHIR → analytics and claims normalization).
* One end-to-end CoE lighthouse MVP delivered (ingestion → model → metrics → BI).
* Completed playbooks for GovCloud deployment, identity/secrets, rollback, and cost control.

Success Metrics (KPIs)
* Pipeline reliability meeting SLA/SLO targets.
* ≥95% data tests passing across pipelines.
* MVP dataset onboarding ≤ 4 weeks.
* Reuse of platform assets across ≥3 divisions.
* Cost optimization and budget adherence.

What You'll Bring
* 7-10+ years in data engineering, ML platform engineering, or cloud data architecture.
* Expert in Python, SQL, dbt, and orchestration tools (Airflow, Glue, Step Functions).
* Deep experience with AWS + AWS GovCloud.
* CI/CD and IaC experience (Terraform, CloudFormation).
* Familiarity with MLOps tools (MLflow, SageMaker, Azure ML, Vertex AI).
* Ability to operate in regulated environments (HIPAA, 42 CFR Part 2, IRB).

Preferred:
* Experience with FHIR, HL7, Medicaid/Medicare claims, and/or SDOH datasets.
* Databricks, Snowflake, Redshift, Synapse.
* Event streaming (Kafka, Kinesis, Event Hubs).
* Feature store experience.
* Observability tooling (Grafana, Prometheus, OpenTelemetry).
* Experience optimizing BI datasets for Power BI.

Logistical Requirements
* At this time, we will only accept candidates who are presently eligible to work in the United States and will not require sponsorship.
* Our organization requires that all work, for the duration of your employment, be completed in the continental U.S. unless required by contract.
* If you're near one of our offices (Arlington, VA; Silver Spring, MD; or Novi, MI), you'll join us in person one day every other month (6 times per year) for a fun, purpose-driven Collaboration Day. These days are filled with creative energy, meaningful connection, and team brainstorming!
* Must be able to work during Eastern Time unless approved by your manager.
* Employees working remotely must have a dedicated, ergonomically appropriate workspace free from distractions, with a mobile device that allows for productive and efficient conduct of business.

Altarum is a nonprofit organization focused on improving the health of individuals with fewer financial resources and populations disenfranchised by the health care system. We work primarily on behalf of federal and state governments to design and implement solutions that achieve measurable results. We combine our expertise in public health and health care delivery with technology development and implementation, practice transformation, training and technical assistance, quality improvement, data analytics, and applied research and evaluation. Our innovative solutions and proven processes lead to better value and health for all.
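To make the vectorization work described above concrete, here is a minimal sketch of the embedding-and-retrieval step behind a RAG pipeline, assuming the sentence-transformers package; the model name and sample texts are illustrative only, not Altarum assets.

    # Minimal embedding + nearest-neighbor retrieval sketch for a RAG pipeline.
    # The model name and documents are illustrative assumptions.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    docs = [
        "Clinic wait times improved after the pilot program.",
        "Respondents reported transportation as a barrier to care.",
    ]
    model = SentenceTransformer("all-MiniLM-L6-v2")
    doc_vecs = model.encode(docs, normalize_embeddings=True)

    query_vec = model.encode(["What barriers did respondents report?"],
                             normalize_embeddings=True)
    scores = doc_vecs @ query_vec.T        # cosine similarity (vectors normalized)
    print(docs[int(scores.argmax())])      # best-matching passage for the query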
Altarum is an equal opportunity employer that provides employment and opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, or any other characteristic protected by applicable law.
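The posting also centers on an MLflow model registry; here is a minimal sketch of logging and registering a model run, with a toy dataset and hypothetical experiment and registry names.

    # Minimal MLflow sketch: track a training run and register the resulting model.
    # The experiment and registry names are hypothetical placeholders.
    import mlflow
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    mlflow.set_experiment("risk-scoring")
    with mlflow.start_run():
        model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        mlflow.log_param("n_estimators", 200)
        mlflow.log_metric("train_accuracy", model.score(X, y))
        mlflow.sklearn.log_model(model, "model", registered_model_name="risk_scorer")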
    $72k-98k yearly est. Auto-Apply 50d ago
  • Data Engineer- AWS/Snowflake

    Egen 4.2company rating

    Remote job

Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. If this describes you, we want you on our team.

Want to learn more about life at Egen? Check out these resources in addition to the job description: -> Meet Egen -> Life at Egen -> Culture and Values at Egen -> Career Development at Egen

NOTE: This is a 6-month contract.

About the job:
* Migrate data and analytics workloads from BigQuery to Snowflake
* Support GCP to AWS data platform migration
* Develop and optimize ETL/ELT pipelines using Python and SQL
* Build analytics-ready datasets for reporting and dashboards
* Support BI tools such as Looker, Amazon QuickSight, or Tableau
* Ensure data quality, performance, and reliability
* Collaborate with data architects, analytics, and DevOps teams

About you:
* 3-5 years of experience as a Data Engineer
* Strong SQL skills (complex queries, optimization)
* Strong Python experience for data processing
* Experience with Snowflake
* Experience with BigQuery
* Cloud experience on GCP and/or AWS
* Experience supporting BI tools (Looker, QuickSight, Tableau)

Nice to have:
* Experience with data migration projects
* Knowledge of dbt, Airflow, or similar orchestration tools
* Experience in multi-cloud environments
* Familiarity with data modeling and analytics use cases

EEO and Accommodations: Egen is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Egen will also consider qualified applicants with criminal histories, consistent with legal requirements. Egen welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
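Given the BigQuery-to-Snowflake migration at the heart of this role, here is a minimal sketch of copying one table via pandas. The project, table, and connection parameters are placeholders, and a production migration would batch, stage, and validate rather than pull whole tables into memory.

    # Minimal BigQuery -> Snowflake table copy via pandas.
    # All identifiers and credentials below are placeholder assumptions.
    from google.cloud import bigquery
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    bq = bigquery.Client(project="example-project")
    df = bq.query("SELECT * FROM analytics.orders").to_dataframe()

    conn = snowflake.connector.connect(
        account="example_account", user="example_user", password="...",
        warehouse="LOAD_WH", database="ANALYTICS", schema="PUBLIC",
    )
    # write_pandas bulk-loads the DataFrame into the target table via stage + COPY.
    write_pandas(conn, df, table_name="ORDERS", auto_create_table=True)
    conn.close()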
    $83k-113k yearly est. Auto-Apply 21d ago
  • Data Engineer I

    Cox Enterprises 4.4company rating

    Remote job

Company: Cox Automotive - USA
Job Family Group: Engineering / Product Development
Job Profile: Data Engineer I
Management Level: Individual Contributor
Flexible Work Option: Hybrid - Ability to work remotely part of the week
Travel %: Yes, 5% of the time
Work Shift: Day

Compensation: Compensation includes a base salary of $73,800.00 - $110,800.00. The base salary may vary within the anticipated base pay range based on factors such as the ultimate location of the position and the selected candidate's knowledge, skills, and abilities. Position may be eligible for additional compensation that may include an incentive program.

Job Description
Cox Automotive seeks a Data Engineer to join the AI & Data Delivery Platform organization in Austin, TX.

Our Teams: Cox Automotive is the market-leading provider of technology services to the automotive industry. Our engineering organization takes pride in building innovative products that make our customers successful. We are passionate about building great software while having fun doing it. The result is an award-winning culture where everyone is approachable, ideas are judged on merits, and empowered people drive transformative technology.

Job Summary: Are you passionate about data? Join Cox Automotive as a Data Engineer I! You'll design high-performance queries, work with diverse data sources, and drive data solutions. Collaborate with data delivery teams to build applications using Python and maintain data pipelines, models, and queries in cloud environments using technologies like Snowflake, NoSQL databases, ETL tools, and AWS services (ECS, Lambdas, SQS, DynamoDB, Step Functions, RDS). Contribute throughout the SDLC, from design to CI/CD operations. Make a significant impact in the automotive industry with your data engineering skills!

WHAT YOU'LL DO
Role Overview:
* Collaborate with Product and Engineering to understand data needs and deliver high-quality data solutions.
* Develop and manage scalable data pipelines and architectures.
* Engage with your agile delivery team to build a culture of passion, trust, and creativity.
* Learn and leverage the automotive digital marketing domain to change how the world buys, sells, and uses cars.

WHO YOU ARE
Required Experience:
* Applicants must currently be authorized to work in the United States for any employer without current or future sponsorship. No OPT, CPT, STEM/OPT or visa sponsorship now or in future.
* BA/BS or higher degree in a related discipline with a focus in Data Engineering or Software Engineering
* 1 to 3 years of related professional work experience in Data Engineering
* Must live within a commutable distance of Cox Automotive's Austin, TX office and work in a hybrid-onsite setting
* Experience with cloud-based application development, support, and DevOps (ideally AWS)
* Object-oriented design experience, including applied use of design patterns.
* Experience with programming languages such as Python
* Ability to work independently; to design, develop, and deploy solutions; and to deliver projects on time with moderate direction.
* Ability to collaborate with other developers to build the best solution for the customer.
* Database development skills, including using database technologies and logical and physical data modeling.
* Please include links to any side projects you've worked on in your GitHub profile

Preferred AI-Related Skills:
* Exposure to AI/ML workflows and data preparation for model training
* Experience with AI-assisted development tools (e.g., GitHub Copilot)
* Familiarity with vector databases, LLM integration, or prompt engineering
* Understanding of MLOps practices and cloud-based AI infrastructure
* Commitment to ethical AI practices and data governance

Drug Testing
To be employed in this role, you'll need to clear a pre-employment drug test. Cox Automotive does not currently administer a pre-employment drug test for marijuana for this position. However, we are a drug-free workplace, so the possession, use, or being under the influence of drugs illegal under federal or state law during work hours, on company property, and/or in company vehicles is prohibited.

Benefits
The Company offers eligible employees the flexibility to take as much vacation with pay as they deem consistent with their duties, the company's needs, and its obligations; seven paid holidays throughout the calendar year; and up to 160 hours of paid wellness annually for their own wellness or that of family members. Employees are also eligible for additional paid time off in the form of bereavement leave, time off to vote, jury duty leave, volunteer time off, military leave, and parental leave.

About Us
Through groundbreaking technology and a commitment to stellar experiences for drivers and dealers alike, Cox Automotive employees are transforming the way the world buys, owns, sells - or simply uses - cars. Cox Automotive employees get to work on iconic consumer brands like Autotrader and Kelley Blue Book and industry-leading dealer-facing companies like vAuto and Manheim, all while enjoying the people-centered atmosphere that is central to our life at Cox. Benefits of working at Cox may include health care insurance (medical, dental, vision), retirement planning (401(k)), and paid days off (sick leave, parental leave, flexible vacation/wellness days, and/or PTO). For more details on what benefits you may be offered, visit our benefits page. Cox is an Equal Employment Opportunity employer - All qualified applicants/employees will receive consideration for employment without regard to that individual's age, race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender, gender identity, physical or mental disability, veteran status, genetic information, ethnicity, citizenship, or any other characteristic protected by law. Cox provides reasonable accommodations when requested by a qualified applicant or employee with disability, unless such accommodations would cause an undue hardship.
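As a taste of the AWS services this posting names (SQS, Lambda, DynamoDB), here is a minimal sketch of an SQS-triggered Lambda handler persisting events to DynamoDB. The table and field names are invented for illustration, not a Cox Automotive system.

    # Minimal AWS Lambda sketch: consume an SQS batch and persist each record.
    # The table name and event fields are hypothetical placeholders.
    import json
    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("vehicle_events")

    def handler(event, context):
        """Write each SQS message in the batch to DynamoDB."""
        records = event.get("Records", [])
        for record in records:
            body = json.loads(record["body"])
            table.put_item(Item={
                "vehicle_id": str(body["vehicle_id"]),
                "event_ts": str(body["event_ts"]),
                "payload": record["body"],  # keep the raw message for auditing
            })
        return {"processed": len(records)}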
    $73.8k-110.8k yearly Auto-Apply 7d ago
  • Big Data Engineer

    Hexaware Technologies, Inc. 4.2company rating

    Remote job

Job Description:
1. 3-5 years in data platform engineering
2. Experience with CI/CD, IaC (Terraform), and containerization with Docker/Kubernetes
3. Hands-on experience building backend applications such as APIs and services
4. Proven track record of building scalable data engineering pipelines using Python, SQL, and dbt Core/Cloud
5. Experience working with MWAA (Airflow) or a similar cloud-based data engineering orchestration tool
6. Experience working with cloud ecosystems like AWS, Azure, or GCP and modern data tools like Snowflake, Databricks, etc.
7. Strong problem-solving skills, as well as the ability to move in a fast-paced environment, are a plus.
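Since the posting pairs pipeline work with backend API development, here is a minimal sketch of the kind of service endpoint it alludes to, using FastAPI; the route and response shape are invented for illustration, not a Hexaware system.

    # Minimal FastAPI sketch of a backend endpoint exposing pipeline metadata.
    # The route and response model are hypothetical placeholders.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class PipelineStatus(BaseModel):
        name: str
        state: str

    @app.get("/pipelines/{name}", response_model=PipelineStatus)
    def pipeline_status(name: str) -> PipelineStatus:
        # A real service would look this up in a metadata store or orchestrator API.
        return PipelineStatus(name=name, state="running")

    # Run locally with: uvicorn app:app --reload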
    $78k-103k yearly est. Auto-Apply 56d ago

Learn more about Hadoop developer jobs

Browse computer and mathematical jobs