
Data Engineer jobs at Zelis

- 1081 jobs
  • NMC_000344 - SAN Engineer (US Citizen Only)

    New Millenium Consulting (3.7 company rating)

    Kansas City, MO

    One of our clients in Kansas City is urgently looking for a SAN Engineer.

    Scope: The customer needs support services to reconfigure a Dell Unity SAN, including installation of new drives, RAID configuration, encryption, and connection to a VMware cluster as traditional datastores. The customer wants to treat them as traditional VMware datastores, not vSAN. The array needs the new drives installed and RAID configured, and the entire array needs to be encrypted. They have 6 SSDs and 11 10,000 RPM traditional hard drives.

    Note from the client: There are currently 4 x 600 GB drives in the unit. They have 6 x 960 GB drives (p/n SDFSU76EXB02T) and 11 x 1.8 TB drives (p/n 1XJ233-031) that need to be added to the storage unit and configured. They believe they want to attach the array to their 3-node ESXi cluster for allocation as a vSAN datastore or datastores.

    Must-Have skills: Experience installing and configuring SANs. Experience with Dell Unity SAN. Expertise with datastores. VMware configuration experience is a plus.
    $73k-102k yearly est. 1d ago
  • Senior Full-Stack Golang Developer

    Altea Healthcare (3.4 company rating)

    Houston, TX

    Job Title: Senior Full-Stack Golang Developer. Company: Aarista / Altea Healthcare IT. Job Type: Full-Time. Compensation Range: $110,000-$140,000 USD depending on experience.

    Our mission is to improve outcomes for chronic care patients who are dependent on multiple daily medications. Our proprietary and vertically integrated EMR technology solutions enable providers to enhance medication adherence through improved access, an owned physician network, and information. We are looking for a Senior Full-Stack Developer and Lead. This person will play a key role on the core development team that is supporting and building our next-generation suite of products, a Revenue Cycle Management (RCM) system. As a member of our core development team, this person will contribute significantly to designing and implementing various product features. In addition to bringing their experience building on the Microsoft stack, this role will also require learning and implementing solutions using other technologies on an as-needed basis. We are an exciting healthcare startup, so we need someone who is agile, since changes are expected.

    Your Role: Support, design, and develop RCM software covering the full stack: Golang, React (TypeScript), MongoDB, Azure Databricks, and Azure Data Lake. Brainstorm with your team to conceptualize and build new features. Work with our Azure-based infrastructure and help us leverage cloud technologies to ensure we can scale in line with customer adoption. Partner with business analysts and other developers in order to fully understand product requirements and implement solutions that meet them. Provide technical leadership including architecture design, coding, code review, practices, and skills development.

    You: You thrive in a team environment but can also work independently. You are passionate about using your technical knowledge and skills to solve real business problems and are motivated by understanding the value that your work adds. A self-starter who can manage their own workload and an ever-growing task list. A team player and leader. Able to solve potential roadblocks that could impact patient care and the strategic and technical goals of the business. Very proficient with server-side Golang. Proficient with front-end React, TypeScript, and JavaScript. Knowledge of Azure Databricks and Azure Data Lake. Working knowledge of relational databases such as SQL Server and Azure SQL. You are passionate about creating innovative and exciting new technology and want to provide end users with the best possible experience. Experience with the software development lifecycle (SDLC), including system requirements collection, architecture, design, development, testing, maintenance, and enhancement across a variety of technologies.

    Skills Required: Golang; React front end (TypeScript and JavaScript); MongoDB; Azure Databricks; Azure Data Lake; solid web services experience (RESTful and SOAP). Nice to have: MS SQL, Azure SQL (SQL Server); data modeling, UML, and design patterns; Azure experience.

    Job Type: Full-time. Pay: $110,000-$140,000 USD depending on experience. Schedule: Full Time.
    $110k-140k yearly 3d ago
  • Staff Data Engineer - Data Architect

    Headspace (4.7 company rating)

    Remote

    About the Staff Data Engineer, Data Architect at Headspace: At Headspace, our mission is to transform mental healthcare to improve the health and happiness of the world. Core to this mission is our ability to responsibly and ethically leverage data to provide personalized care to each of our members, meeting them where they are on the mental health continuum. We're looking for an experienced Data Architect who can also operate hands-on as a Staff Data Engineer. You will design and evolve our domain-based Enterprise Data Model (EDM), lead Master Data Management (MDM) initiatives, and build production-grade data pipelines in Python / PySpark. The ideal candidate is equally comfortable whiteboarding conceptual models, building and reviewing ETL jobs, and coaching engineering teams on data architecture best practices.

    Location: We are currently hiring this role in San Francisco (hybrid), Los Angeles (remote), New York City (remote) and Seattle (remote). Candidates must permanently reside in the US full-time and be based in these cities.

    What you will do:
    Lead the Development of Scalable Data Infrastructure: Drive the architecture and implementation of cutting-edge PySpark data pipelines to ingest and transform diverse datasets into the organization's data lake in a fault-tolerant, robust system.
    Set Design Patterns: Drive the creation and enforcement of standard conventions in code, architecture, schema design, and table design.
    Architect World-Class Data Platforms: Design and lead the evolution of secure, compliant, and privacy-forward data warehousing platforms to support the unique demands of the healthcare industry.
    Strategic Collaboration for Business Insights: Partner with analytics, product, and engineering leaders to ensure the data ecosystem provides actionable and reliable insights into critical business metrics.
    Champion Data-Driven Leadership: Mentor other members of the DE and broader data team, particularly around dbt architecture and query performance. Foster a data-first culture that prioritizes excellence, innovation, and collaboration across teams.
    Influence Organizational Strategy: Act as a technical thought leader, shaping the company's data strategy and influencing cross-functional roadmaps with data-centric solutions.

    What you will bring:
    7+ years in data engineering / architecture, with 2+ years leading EDM/MDM programs and a proven track record of leading high-impact initiatives at scale.
    Proven ability to create and maintain domain-based enterprise data models (canonical, hub-and-spoke, data-product-oriented).
    Deep expertise with data-modeling tools (erwin, ER/Studio, PowerDesigner, or equivalent) and modeling techniques (3NF, Dimensional, Data Vault, Anchor).
    Production experience writing performant Python and PySpark code on distributed compute (Spark 3+, Delta Lake).
    Strong SQL skills across columnar and relational engines (e.g., Snowflake, Redshift, Databricks SQL, Postgres).
    Solid grasp of data-governance practices: lineage, glossaries, PII/PHI controls, and data-quality frameworks.
    Ability to articulate architecture choices to both executive stakeholders and hands-on engineers.
    Deep experience designing and optimizing real-time and batch ETL pipelines (preferably within dbt), employing best practices for scalability and reliability.
    Systems thinker who can balance near-term delivery with long-term architecture vision.
    Comfortable in highly collaborative, agile environments; able to mentor cross-functional teams.
    Excellent written and verbal communication; able to translate complex data topics into plain language.
    Bias for automation, documentation, and continuous improvement.

    Nice-To-Haves:
    Hands-on with the Databricks platform (Unity Catalog, Delta Live Tables, MLflow).
    dbt Core for transformation, tests, and metadata; dbt Semantic Layer experience is a plus.
    Exposure to event streaming (Kafka, EventHub) and CDC tools.
    Experience integrating with commercial MDM suites or building custom match-merge solutions.
    Familiarity with cloud data-platform services on AWS (Terraform).
    Background in data-privacy standards (GDPR, CCPA, HIPAA) and differential-privacy or tokenization techniques.

    Pay & Benefits: The anticipated new hire base salary range for this full-time position is $140,400-$224,250 + equity + benefits. Our salary ranges are based on the job, level, and location, and reflect the lowest to highest geographic markets where we are hiring for this role within the United States. Within this range, individual compensation is determined by a candidate's location as well as a range of factors including but not limited to: unique relevant experience, job-related skills, and education or training. Your recruiter will provide more details on the specific salary range for your location during the hiring process. At Headspace, base salary is but one component of our Total Rewards package. We're proud of our robust package inclusive of: base salary, stock awards, comprehensive healthcare coverage, monthly wellness stipend, retirement savings match, lifetime Headspace membership, generous parental leave, and more. Additional details about our Total Rewards package will be provided during the recruitment process.

    About Headspace: Headspace exists to provide every person access to lifelong mental health support. We combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that's effective, personalized, and truly accessible whenever and wherever they need it. At Headspace, our values aren't just what we believe, they're how we work, grow, and make an impact together. We live them daily: Make the Mission Matter, Iterate to Great, Own the Outcome, and Connect with Courage. These values shape our decisions, guide our collaborations, and define our culture. They're our shared commitment to building a more connected, human-centered team, one that's redefining how mental health care supports people today and for generations to come.

    Why You'll Love Working Here:
    A mission that matters, with impact you can see and feel.
    A culture that's collaborative, inclusive, and grounded in our values.
    The chance to shape what mental health care looks like next.
    Competitive pay and benefits that support your whole self.

    How we feel about Diversity, Equity, Inclusion and Belonging: Headspace is committed to bringing together humans from different backgrounds and perspectives, providing employees with a safe and welcoming work environment free of discrimination and harassment. We strive to create a diverse & inclusive environment where everyone can thrive, feel a sense of belonging, and do impactful work together.
    As an equal opportunity employer, we prohibit any unlawful discrimination against a job applicant on the basis of their race, color, religion, gender, gender identity, gender expression, sexual orientation, national origin, family or parental status, disability*, age, veteran status, or any other status protected by the laws or regulations in the locations where we operate. We respect the laws enforced by the EEOC and are dedicated to going above and beyond in fostering diversity across our workplace. *Applicants with disabilities may be entitled to reasonable accommodation under the terms of the Americans with Disabilities Act and certain state or local laws. A reasonable accommodation is a change in the way things are normally done which will ensure an equal employment opportunity without imposing undue hardship on Headspace. Please inform our Talent team by filling out this form if you need any assistance completing any forms or to otherwise participate in the application or interview process. Headspace participates in the E-Verify Program.

    Privacy Statement: All member records are protected according to our Privacy Policy. Further, while employees of Headspace (formerly Ginger) cannot access Headspace products/services, they will be offered benefits according to the company's benefit plan. To ensure we are adhering to best practice and ethical guidelines in the field of mental health, we take care to avoid dual relationships. A dual relationship occurs when a mental health care provider has a second, significantly different relationship with their client in addition to the traditional client-therapist relationship, including, for example, a managerial relationship. As such, Headspace requests that individuals who have received coaching or clinical services at Headspace wait until their care with Headspace is complete before applying for a position. If someone with a Headspace account is hired for a position, please note their account will be deactivated and they will not be able to use Headspace services for the duration of their employment. Further, if Headspace cannot find a role that resolves an ethical issue associated with a dual relationship, Headspace may need to take steps to ensure ethical obligations are being adhered to, including a delayed start date or a potential leave of absence. Such steps would be taken to protect both the former member, as well as any relevant individuals from their care team, from impairment, risk of exploitation, or harm. For how we will use the personal information you provide as part of the application process, please see: ****************************************** #LI-Hybrid
    $140.4k-224.3k yearly 20d ago
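
The Headspace listing above centers on building PySpark pipelines that ingest and conform data into a lake. Purely as an illustration of that kind of batch job, and not drawn from the posting itself, here is a minimal sketch; the bucket paths, column names, and event schema are hypothetical.

```python
# Illustrative sketch only: a minimal PySpark batch job of the kind the listing
# describes (ingest raw events, conform them, write to a data lake).
# Paths, column names, and the "events" schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_ingest_sketch").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")          # hypothetical source path

conformed = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))        # normalize timestamps
       .withColumn("event_date", F.to_date("event_ts"))           # derive a partition column
       .dropDuplicates(["event_id"])                              # basic idempotency guard
       .filter(F.col("member_id").isNotNull())                    # drop unusable rows
)

(conformed.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/conformed/events/"))            # hypothetical lake location
```
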
  • Staff Data Engineer - Healthcare Data Infrastructure

    Qualified Disability Specialists (3.9 company rating)

    Remote

    Transform healthcare with us. At Qualified Health, we're redefining what's possible with Generative AI in healthcare. Our infrastructure provides the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. This is more than just a job. It's an opportunity to build the future of AI in healthcare, solve complex challenges, and make a lasting impact on patient care. If you're ambitious, innovative, and ready to move fast, we'd love to have you on board. Join us in shaping the future of healthcare.

    Job Summary: Qualified Health is seeking exceptional data engineers to join our growing team. You'll work at the intersection of healthcare data, cloud infrastructure, and AI, building robust, scalable data pipelines that power our platform. This role requires deep technical expertise in data engineering fundamentals, with opportunities to contribute across customer integrations, data quality and transformation, and platform automation depending on your strengths and interests. You'll collaborate with health systems and internal teams to deliver reliable healthcare data solutions using cutting-edge technologies.

    Key Responsibilities:
    Design, implement, and maintain production-grade data pipelines that process complex healthcare data at scale.
    Build ETL/ELT solutions using modern cloud technologies and distributed computing frameworks.
    Work with healthcare data standards including FHIR, HL7v2, and EHR-specific formats.
    Ensure data quality, reliability, and compliance through comprehensive validation and monitoring.
    Collaborate with cross-functional teams, customers, and stakeholders to deliver data solutions.
    Debug and optimize pipeline performance in production environments.
    Contribute to technical architecture decisions and platform evolution.
    Document implementations and communicate technical concepts to varied audiences.

    Required Qualifications:
    6+ years of data engineering and/or related experience building and deploying production data pipelines.
    Healthcare data experience working with clinical data standards (FHIR, HL7v2, CDA, X12) or EHR systems.
    Strong SQL proficiency with ability to write complex queries, perform data modeling, and optimize performance.
    Python expertise with modern development practices and frameworks.
    Cloud platform experience (Azure, AWS, or GCP) with hands-on use of core data services.
    ETL/ELT implementation using modern tools such as Databricks, Data Factory, Airflow, dbt, or similar platforms.
    Proven problem-solving ability to independently debug complex data issues and drive solutions.
    Excellent communication skills with ability to explain technical concepts clearly and document work thoroughly.
    Bachelor's degree in Computer Science, Engineering, or related field.

    Desirable Skills:
    Healthcare & Integration Expertise
    Direct experience establishing clinical data feeds from Epic, Cerner, eClinicalWorks, or other major EHR systems.
    Knowledge of EHR-specific data formats and integration mechanisms (Epic Clarity, Chronicles, etc.).
    Experience with healthcare integration patterns including FHIR APIs, SFTP, Delta Sharing, Fabric External Shares, Snowflake Reader Accounts.
    Customer-facing experience managing technical implementations and timelines.
    Understanding of HIPAA and HITRUST compliance requirements in data infrastructure.

    Data Quality & Transformation
    Healthcare data science or analytics background with understanding of clinical data models.
    Experience designing sophisticated, configurable transformation logic and validation frameworks.
    Expertise in data profiling, quality assessment, and building rule-based quality engines.
    Strong instincts for identifying data quality issues and patterns in complex healthcare datasets.
    Experience building data pipelines for AI/ML applications.

    Platform Engineering & Automation
    Advanced experience with PySpark and distributed computing frameworks.
    Infrastructure as Code (Terraform, CloudFormation, Pulumi, or similar).
    CI/CD pipeline development using GitHub Actions, Azure DevOps, GitLab, or similar.
    Experience building reusable data engineering abstractions, frameworks, and developer tooling.
    Production monitoring, alerting, and observability solutions (DataDog, Grafana, Prometheus, or similar).
    Track record establishing operational excellence standards for data systems.

    Technical Depth
    Deep expertise in Azure cloud services (Databricks, Data Factory, Fabric, ADLS2, networking).
    Experience with modern data warehouse solutions (Snowflake, Databricks, Fabric).
    Real-time data processing and streaming architectures.
    Advanced Python patterns for type safety and modern development.
    Relevant certifications (Azure Data Engineer, Databricks, Snowflake, etc.).
    Master's degree in Computer Science, Engineering, or related field.
    Strong candidates will excel in multiple areas above, demonstrating both breadth and depth in their expertise.

    Technical Environment: Our data infrastructure is built on modern cloud technologies including Azure Databricks + Data Factory (plus Fabric and Snowflake integrations), PySpark for distributed data processing, GitHub Actions + Terraform for CI/CD and Infrastructure as Code, and Python with type-safe patterns and modern frameworks.

    Impact & Growth Opportunity: As a Senior Data Engineer at Qualified Health, you'll build the data foundation that powers our AI platform's ability to deliver insights to major health systems. You'll have the opportunity to shape our technical architecture and directly impact healthcare delivery, working with cutting-edge technologies in a rapidly growing company. This position offers significant visibility and growth potential as we scale our platform across the healthcare ecosystem.

    Why Join Qualified Health? This is an opportunity to join a fast-growing company and a world-class team that is poised to change the healthcare industry. We are a passionate, mission-driven team that is building a category-defining product. We are backed by premier investors and are looking for founding team members who are excited to do the best work of their careers. Our employees are integral to achieving our goals, so we are proud to offer competitive salaries with equity packages, robust medical/dental/vision insurance, flexible working hours, hybrid work options, and an inclusive environment that fosters creativity and innovation.

    Our Commitment to Diversity: Qualified Health is an equal opportunity employer. We believe that a diverse and inclusive workplace is essential to our success, and we are committed to building a team that reflects the world we live in.
    We encourage applications from all qualified individuals, regardless of race, color, religion, gender, sexual orientation, gender identity or expression, age, national origin, marital status, disability, or veteran status.

    Pay & Benefits: The pay range for this role is between $140,000 and $220,000, and will depend on your skills, qualifications, experience, and location. This role is also eligible for equity and benefits. Join our mission to revolutionize healthcare with AI. To apply, please send your resume through the application below.
    $140k-220k yearly 60d+ ago
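
The Qualified Health listing above involves normalizing FHIR resources into tabular form. As a rough, standard-library-only sketch of that kind of flattening step, and not the company's actual pipeline, the snippet below pulls a few common fields out of Patient resources in a FHIR R4 Bundle; the input file name and field selection are hypothetical, and a real pipeline would add validation and PHI controls.

```python
# Illustrative sketch only: flatten Patient resources in a FHIR R4 Bundle into rows.
# Uses only the standard library; the input file and chosen fields are hypothetical.
import json

def patient_rows(bundle: dict) -> list[dict]:
    """Extract a few common fields from Patient resources in a FHIR Bundle."""
    rows = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Patient":
            continue                                  # skip non-Patient resources
        name = (resource.get("name") or [{}])[0]      # first HumanName, if any
        rows.append({
            "patient_id": resource.get("id"),
            "family_name": name.get("family"),
            "given_name": " ".join(name.get("given", [])),
            "birth_date": resource.get("birthDate"),
            "gender": resource.get("gender"),
        })
    return rows

if __name__ == "__main__":
    with open("patient_bundle.json") as f:            # hypothetical input file
        bundle = json.load(f)
    for row in patient_rows(bundle):
        print(row)
```
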
  • Forward Deployed Data Engineer

    Qventus (4.1 company rating)

    Remote

    On this journey for over 12 years, Qventus is leading the transformation of healthcare. We enable hospitals to focus on what matters most: patient care. Our innovative solutions harness the power of machine learning, generative AI, and behavioral science to deliver exceptional outcomes and empower care teams to anticipate and resolve issues before they arise. Our success in scaling rapidly across the globe is backed by some of the world's leading investors. At Qventus, you will have the opportunity to work with an exceptional, mission-driven team across the globe, and the ability to directly impact the lives of patients. We're inspired to work with healthcare leaders on our founding vision and unlock world-class medicine through world-class operations. #LI-JB1

    The Role: Forward Deployed Data Engineers at Qventus collaborate directly with clients to identify their most critical data challenges and design scalable, high-performance pipelines and architectures to solve them. Our customers depend on Qventus' data infrastructure for mission-critical healthcare operations, and projects often start with broad, high-impact questions like, “How can we unify real-time surgical, staffing, and patient flow data into a single source of truth?” or “What's the most efficient way to process and serve operational data for instant decision-making?” As a Data Engineer, you'll combine technical expertise in large-scale data systems with a deep understanding of operational needs to create solutions that bridge the gap between raw data and actionable insights. You'll work closely with data scientists, software engineers, and product teams to ensure our data pipelines are robust, efficient, and ready to support advanced analytics, AI models, and production-grade applications. You'll operate in small, agile teams with significant autonomy, taking projects from initial scoping and design through to deployment and ongoing optimization. A typical day might involve architecting cloud-based ETL workflows, optimizing query performance on multi-terabyte datasets, integrating disparate hospital data systems, or collaborating with client IT teams to ensure seamless adoption.

    Key Responsibilities: Design, build, and maintain scalable data pipelines and architectures to support analytics, machine learning, and operational applications. Collaborate with cross-functional teams to translate complex operational needs into reliable, well-modeled datasets. Integrate and normalize data from multiple structured and unstructured healthcare sources (EHRs, scheduling systems, operational databases, etc.). Optimize query performance and data processing for speed, scalability, and cost efficiency. Implement best practices for data quality, governance, and security in compliance with healthcare regulations (e.g., HIPAA). Support deployment, monitoring, and troubleshooting of production data systems.

    What We're Looking For: Proven experience as a data engineer or in a similar role, with a track record of building and maintaining large-scale data infrastructure. Strong proficiency in SQL and Python for data processing and pipeline development. Experience with cloud data platforms and services such as AWS (RDS, Redshift, Lambda, S3), GCP, or Azure. Knowledge of both relational and non-relational databases (PostgreSQL, MySQL, MongoDB, etc.). Familiarity with modern data workflow orchestration tools (Airflow, DBT, Dagster, etc.).
    Ability to work closely with both technical and non-technical stakeholders to gather requirements and deliver solutions. Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.

    Bonus Points: Experience working with healthcare data and integrating EHR, scheduling, or operational systems. Familiarity with real-time data processing frameworks (Kafka, Kinesis, Spark Streaming, Flink). Knowledge of data warehousing solutions like Snowflake or BigQuery. Hands-on experience with Databricks or similar data lakehouse platforms. Strong understanding of data privacy, compliance, and security in regulated environments. Experience mentoring peers or contributing to cross-functional technical strategy.

    Compensation for this role is based on market data and takes into account a variety of factors, including location, skills, qualifications, and prior relevant experience. Salary is just one part of the total rewards package at Qventus. We also offer a range of benefits and perks, including Open Paid Time Off, paid parental leave, professional development, wellness and technology stipends, a generous employee referral bonus, and employee stock option awards. Salary Range: $140,000-$220,000 USD

    Qventus values diversity in its workforce and proudly upholds the principles of Equal Opportunity Employment. We welcome all qualified applicants and ensure fair consideration for employment without discrimination based on any legally protected characteristics, including, but not limited to: veteran status, uniformed service member status, race, color, religion, sex, sexual orientation, gender identity, age, pregnancy (including childbirth, lactation and related medical conditions), national origin or ancestry, citizenship or immigration status, physical or mental disability, genetic information (including testing and characteristics) or any other category protected by federal, state or local law (collectively, "protected characteristics"). Our commitment to equal opportunity employment applies to all persons involved in our operations and prohibits unlawful discrimination by any employee, including supervisors and co-workers. Qventus participates in the E-Verify program as required by law and is committed to providing reasonable accommodations to individuals with disabilities in compliance with the Americans with Disabilities Act (ADA). In compliance with the California Consumer Privacy Act (CCPA), Qventus provides transparency into how applicant data is processed during the application process. Candidate information will be treated in accordance with our candidate privacy notice.

    *Benefits and perks are subject to plan documents and may change at the company's discretion.
    *Employment is contingent upon the satisfactory completion of our pre-employment background investigation and drug test.
    $140k-220k yearly 60d+ ago
  • Data Engineer - Las Vegas, USA

    Photon Group (4.3 company rating)

    Remote

    Greetings everyone!

    Who are we? For the past 20 years, we have powered many Digital Experiences for the Fortune 500. Since 1999, we have grown from a few people to more than 4000 team members across the globe who are engaged in various Digital Modernization initiatives. For a brief 1-minute video about us, you can check *****************************

    What will you do? What are we looking for?

    Job Description: Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python, and PySpark to ensure the efficient and reliable delivery of data. Design and implement custom connectors to facilitate the ingestion of diverse data sources into our platform, including structured and unstructured data from various document formats. Collaborate closely with cross-functional teams to gather requirements, understand data needs, and translate them into technical solutions. Implement DataOps principles and best practices to ensure robust data operations and efficient data delivery. Design and implement data CI/CD pipelines to enable automated and efficient data integration, transformation, and deployment processes. Monitor and troubleshoot data pipelines, proactively identifying and resolving issues related to data ingestion, transformation, and loading. Conduct data validation and testing to ensure the accuracy, consistency, and compliance of data. Stay up-to-date with emerging technologies and best practices in data engineering. Document data workflows, processes, and technical specifications to facilitate knowledge sharing and ensure data governance.

    Experience: Bachelor's degree in Computer Science, Engineering, or a related field. 8+ years' experience in data engineering, ELT development, and data modeling. Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management. Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms. Demonstrated experience in developing custom connectors for data ingestion from various sources. Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance. Experience implementing DataOps principles and practices, including data CI/CD pipelines. Excellent problem-solving and troubleshooting skills, with a strong attention to detail. Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams. Familiarity with data visualization tools such as Apache Superset and dashboard development. Understanding of distributed systems and working with large-scale datasets. Familiarity with data governance frameworks and practices. Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka). Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes. Experience with Agile development methodologies and working in cross-functional Agile teams. Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment. Excellent analytical and problem-solving skills, with a keen attention to detail. Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
    Required Skills: Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum.
    Good to have: Linux, OpenShift, Kubernetes, Superset.

    Compensation, Benefits and Duration: Minimum Compensation: USD 33,000. Maximum Compensation: USD 118,000. Compensation is based on actual experience and qualifications of the candidate. The above is a reasonable and good-faith estimate for the role. Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees. This position is available for independent contractors. No applications will be considered if received more than 120 days after the date of this post.
    $95k-138k yearly est. 11d ago
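
The Photon listing above asks for workflow orchestration with Apache Airflow. A minimal DAG of the extract-transform-load shape it describes might look like the sketch below; the DAG id, schedule, and task bodies are hypothetical placeholders rather than anything from the posting.

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG with three chained tasks.
# The DAG id, schedule, and task implementations are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")       # placeholder for a real extraction step

def transform():
    print("cleaning and conforming")   # placeholder for PySpark/SQL transforms

def load():
    print("loading warehouse tables")  # placeholder for the load step

with DAG(
    dag_id="daily_elt_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # run extract, then transform, then load
```
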
  • Sr Data Engineer - Tampa

    Photon Group (4.3 company rating)

    Remote

    Responsibilities
    Big Data - Spark, Hive, Java, CDP; 6+ years, with 3 years of development experience on Big Data.
    Analyze data requirements and identify disparate data sources required for consolidation and distribution. Document functional specifications and coordinate delivery of the same with the technology team. Review logical and conceptual data models in alignment with business requirements. Work with stakeholders to understand and gather requirements and produce business specifications. Validate solution implementations and ensure they meet business and functional requirements. Provide production deployment support and investigate data quality issues. Work with various technology leads to ensure gaps in data completeness or accuracy are bridged.

    Qualifications
    Subject matter expertise in the financial industry - wholesale loans/lending business, Capital Markets, Finance, or Risk Reporting. Strong hands-on experience with databases and SQL is required. Excellent documentation and analytical skills to produce process flow diagrams, business modelling, and functional design. Proficiency in MS Office (Word, Excel, Visio, PowerPoint) with extensive experience using Excel for data analysis. Experience with data tracing/lineage efforts. Knowledge of logical and physical data models.

    Compensation, Benefits and Duration: Minimum Compensation: USD 42,000. Maximum Compensation: USD 148,000. Compensation is based on actual experience and qualifications of the candidate. The above is a reasonable and good-faith estimate for the role. Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees. This position is not available for independent contractors. No applications will be considered if received more than 120 days after the date of this post.
    $95k-138k yearly est. 60d+ ago
  • Senior Data Engineer

    Brigham and Women's Hospital (4.6 company rating)

    Somerville, MA

    Site: Mass General Brigham Incorporated

    Mass General Brigham relies on a wide range of professionals, including doctors, nurses, business people, tech experts, researchers, and systems analysts, to advance our mission. As a not-for-profit, we support patient care, research, teaching, and community service, striving to provide exceptional care. We believe that high-performing teams drive groundbreaking medical discoveries and invite all applicants to join us and experience what it means to be part of Mass General Brigham.

    The Senior Data Engineer will help MGB Digital establish a data foundation for a suite of new analytic products that will be used across all PHSO domains. A solid data foundation will provide the backbone to enable data consistency and growth potential for these new products. This senior data engineer will help achieve this objective by delivering the underlying data assets and pipelines for the data foundation.

    Job Summary

    The Opportunity: The Payer Data Integration team performs data and content management for the Payer source mart within the Enterprise Data Warehouse (EDW). The primary content in the Payer source mart is information about Mass General Brigham (MGB) patients who have a primary care relationship with an MGB provider. Health plans deliver monthly updates containing changes in patient demographics, enrollment status in the health plan's products, and associated medical and pharmacy claims. The Payer Data Integration team executes a suite of ETL/ELT jobs to perform data ingestion and staging of data in the EDW. The team then executes extensive QC activities to evaluate and certify incoming data before promoting updates to Production. The Population Health Service Organization (PHSO) is the primary stakeholder for the Payer source mart and provides direction and prioritization for new and revised functionality. The Payer source mart is a critical data source for other EDW databases, enterprise reporting, program development, and general analytics performed by PHSO and other business organizations across MGB. The Payer Data Integration team employs high standards and best practices to deliver timely, quality content updates to support the needs of these consumers. The Payer Data Integration team leverages an Agile development methodology and is composed of two scrum teams. PHSO Team 1 supports the development and implementation of major PHSO business initiatives. The team develops and revises new EDW data assets and works with PHSO Team 2 for product installation and support. PHSO Team 2 is responsible for the monthly operation and general maintenance of the Payer source mart and other PHSO data assets and products.

    The Senior Data Engineer is responsible for designing, developing, and maintaining the data architecture and infrastructure within an organization. This position plays a crucial role in managing large-scale data systems and ensuring the efficient flow, storage, and accessibility of data for various stakeholders, such as data analysts, data scientists, and business users.

    Essential Functions
    * Design, develop, and implement data pipelines and ETL/ELT code to support business requirements.
    * Work on cross-functional teams delivering enterprise solutions for internal and external clients.
    * Assume ownership for delivering code revisions and enhancements from design through development and production installation.
    * Maintain and optimize various components of the data pipeline architecture.
    * Become subject matter expert for internal and external data products.
    * Ensure design solutions can scale and meet technical standards and performance benchmarks.
    * Identify inefficient processes and develop recommendations and design solutions.
    * Lead code review sessions to validate technical solutions and facilitate knowledge sharing.

    Qualifications
    * Bachelor's Degree in a related field of study required; we can consider and review experience in lieu of a degree.
    * Experience in data engineering, with a focus on building and maintaining data infrastructure and pipelines.
    * 5-7 years of data warehousing development in large reporting environments required.
    * 3-5 years of experience developing data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Data Streams) required.
    * Hands-on development experience with ETL/ELT tools, such as dbt, Fivetran, or Informatica, required.
    * Experience working in an Agile software development environment required.

    Knowledge, Skills and Abilities
    * Working knowledge of cloud computing platforms such as AWS, GCP, or Azure.
    * Experience with enterprise database solutions in cloud or on-premise environments.
    * Adherence to sound engineering principles and practices when designing technical solutions.

    Additional Job Details (if applicable)
    * M-F Eastern business hours required.
    * Hybrid onsite flexible working model required for weekly onsite work at Assembly Row / local MGB sites.
    * Business needs and team needs determine in-office work.
    * Remote working days require a stable, secure, quiet, compliant work area.

    Remote Type: Hybrid
    Work Location: 399 Revolution Drive
    Scheduled Weekly Hours: 40
    Employee Type: Regular
    Work Shift: Day (United States of America)
    Pay Range: $92,102.40 - $134,056.00/Annual, Grade 7

    At Mass General Brigham, we believe in recognizing and rewarding the unique value each team member brings to our organization. Our approach to determining base pay is comprehensive, and any offer extended will take into account your skills, relevant experience if applicable, education, certifications and other essential factors. The base pay information provided offers an estimate based on the minimum job qualifications; however, it does not encompass all elements contributing to your total compensation package. In addition to competitive base pay, we offer comprehensive benefits, career advancement opportunities, differentials, premiums and bonuses as applicable and recognition programs designed to celebrate your contributions and support your professional growth. We invite you to apply, and our Talent Acquisition team will provide an overview of your potential compensation and benefits package.

    EEO Statement: Mass General Brigham Incorporated is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religious creed, national origin, sex, age, gender identity, disability, sexual orientation, military service, genetic information, and/or other status protected under law. We will ensure that all individuals with a disability are provided a reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
    To ensure reasonable accommodation for individuals protected by Section 503 of the Rehabilitation Act of 1973, the Vietnam Era Veterans' Readjustment Assistance Act of 1974, and Title I of the Americans with Disabilities Act of 1990, applicants who require accommodation in the job application process may contact Human Resources at **************.

    Mass General Brigham Competency Framework: At Mass General Brigham, our competency framework defines what effective leadership "looks like" by specifying which behaviors are most critical for successful performance at each job level. The framework comprises ten competencies (half People-Focused, half Performance-Focused), each defined by observable and measurable skills and behaviors that contribute to workplace effectiveness and career success. These competencies are used to evaluate performance, make hiring decisions, identify development needs, mobilize employees across our system, and establish a strong talent pipeline.
    $92.1k-134.1k yearly 2d ago
  • Principal Data Engineer

    Eye Care Partners (4.6 company rating)

    Ballwin, MO

    We are seeking an experienced professional who will serve as the Principal Data Engineer on our Data Platforms & Insights team. The Principal Data Engineer serves as a senior technical leader within the Data Platforms & Insights team, responsible for architecting, developing, and maintaining scalable data solutions that support enterprise-wide analytics, reporting, and data management initiatives. This role drives the design and implementation of robust data pipelines, ensures data quality and governance, and enables self-service analytics through a "Data as a Service" model. The Principal Data Engineer collaborates closely with cross-functional teams, business stakeholders, and third-party service providers to deliver high-impact data solutions, while also mentoring and supervising Data Engineers to uphold engineering standards and best practices.

    ESSENTIAL DUTIES AND RESPONSIBILITIES
    * Design, develop, and maintain scalable and efficient data pipelines using ETL tools and programming languages
    * Develop integration solutions leveraging APIs to enable seamless communication between systems
    * Analyze data elements from various systems, data flows, dependencies, and relationships, and assist in designing conceptual, physical, and logical data models
    * Implement data solutions across on-prem and cloud environments, ensuring performance, reliability, and scalability
    * Ensure all data pipelines follow established data governance rules for data quality and completeness
    * Maintain and evolve existing monitoring, logging, and alerting frameworks for proactively managing and troubleshooting data pipelines
    * Manage source code repositories and deployment processes using modern tools
    * Utilize Infrastructure as Code (IaC) tools to automate and manage infrastructure provisioning
    * Work within an Agile development framework to understand and transform business requirements into scalable and manageable solutions
    * Work with various business and technical stakeholders and assist with data-related technical needs and issues
    * Partner with leadership to define and evolve the long-term data architecture and engineering strategy, ensuring alignment with business goals
    * Present solutions and options to leadership, project teams, and other stakeholders, adapting style to both technical and non-technical audiences
    * Establish and enforce documentation standards for data pipelines, schemas, and infrastructure
    * Ensure data engineers and other technical teams adhere to documented design and development patterns and standards
    * Conduct code reviews and provide guidance to other developers, fostering growth and development within the team
    * Proactively monitor and resolve ongoing production issues in data pipelines, databases, and infrastructure
    * Educate the organization on the latest trends and technologies in data engineering, APIs, and streaming data
    * Lead the team on establishing industry best practices in data engineering to ensure high-quality deliverables
    * Adhere to all safety policies and procedures in performing job duties and responsibilities while supporting a culture of high quality and great customer service
    * Perform other duties that may be necessary or in the best interest of the organization
    QUALIFICATIONS
    * Demonstrated ability to work efficiently and effectively in a fast-paced, matrixed environment, and ability to execute despite ambiguity
    * Previous experience with a healthcare company preferred
    * Enjoys learning new technologies and systems
    * Exhibits a positive attitude and is flexible in accepting work assignments and priorities
    * Interpersonal skills to support customer service, functional, and teammate support needs
    * Knowledge of state and federal regulations for this position; general understanding of HIPAA guidelines

    SUPERVISORY RESPONSIBILITIES
    * Directly supervises Data Engineers on the Data Platforms & Insights team
    * Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws
    * Responsibilities include interviewing, hiring, and training employees; planning, assigning, and directing work; appraising performance; rewarding and disciplining employees; and addressing complaints and resolving problems

    EDUCATION AND/OR EXPERIENCE
    * Minimum Required: B.S. or B.A., preferably in a STEM (Science, Technology, Engineering, Math) field
    * Minimum Required: 10+ years of hands-on experience in the design, development, and implementation of data solutions

    LICENSES AND CREDENTIALS
    * Minimum Required: None

    SYSTEMS AND TECHNOLOGY
    * Proficient in Microsoft Excel, Word, PowerPoint, Outlook
    * Experience working with the following:
    * Snowflake development and support
    * Advanced SQL knowledge with strong query-writing skills
    * Object-oriented/object function scripting languages: Python, Java, Scala, etc.
    * AWS cloud services: EC2, EMR, RDS, DMS
    * Relational databases such as SQL Server and object-relational databases such as PostgreSQL
    * Data analysis, ETL, and workflow automation
    * Multiple ETL/ELT tools and cloud-based data hubs such as Fivetran
    * Stream-processing systems: Kafka, Spark Streaming, etc.
    * Source code management and deployment tools (e.g., Git, Jenkins, dbt, Docker)
    * Infrastructure as Code (IaC) tools (e.g., Terraform, Ansible, CloudFormation)
    * Enterprise MDM solutions

    LOCATION
    * This position is located in St. Louis, Missouri and offers a hybrid work schedule. Candidates living in Alabama, Arizona, Florida, Georgia, Illinois, Indiana, Kansas, Kentucky, Michigan, Minnesota, Missouri, New Jersey, North Carolina, Ohio, Oklahoma, Pennsylvania, Texas and Virginia may also be considered for remote work.

    If you need assistance with this application, please contact **************. Please do not contact the office directly - only resumes submitted through this website will be considered.

    EyeCare Partners is an equal opportunity/affirmative action employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
    $77k-96k yearly est. 40d ago
  • Sr Data Engineer

    Health Care Service Corporation (4.1 company rating)

    Richardson, TX

    At HCSC, our employees are the cornerstone of our business and the foundation to our success. We empower employees with curated development plans that foster growth and promote rewarding, fulfilling careers. Join HCSC and be part of a purpose-driven company that will invest in your professional development.

    **Job Summary**
    This position is responsible for acting as a concierge to the Divisional Analytics Teams (DATs). It will use its deep knowledge and understanding of business needs to help the DATs define and execute a data strategy. It will coordinate the development and delivery of data assets and serve as a single point of contact for issues/escalations to business stakeholders.

    Required Job Qualifications:
    * Bachelor's degree and 8 years of data and analytics experience OR 12 years of data and analytics experience
    * Experience in data processing environments
    * Experience with Teradata & Hadoop
    * Experience connecting business requirements to data mining objectives and measuring business benefit
    * Knowledge of statistical modeling/predictive modeling/machine learning/data mining/optimization and big data concepts
    * Analytical and quantitative problem-solving skills
    * Experience with Data Warehouse development and data consumption
    * Experience with Business Intelligence tools
    * Knowledge of application development platforms/languages
    * Knowledge of DevOps / continuous deployment / integration processes
    * Ability to understand market needs and to translate business vision into data asset development
    * Ability to execute strategic vision with tactical efficiency
    * Experience interacting with corporate leadership
    * Ability to manage demands from multiple stakeholders while optimizing resources

    Preferred Job Qualifications:
    * Bachelor's Degree in Computer Science, MIS or related field
    * Healthcare data experience

    **Are you being referred to one of our roles? If so, ask your connection at HCSC about our Employee Referral process!**

    **Pay Transparency Statement:**
    At Health Care Service Corporation, you will be part of an organization committed to offering meaningful benefits to our employees to support their life outside of work. From health and wellness benefits, 401(k) savings plan, pension plan, paid time off, paid parental leave, disability insurance, supplemental life insurance, employee assistance program, paid holidays, tuition reimbursement, plus other incentives, we offer a robust total rewards package for employees. Learn more about our benefit offerings by visiting *************************************. The compensation offered will vary depending on your job-related skills, education, knowledge, and experience. This role aligns with an annual incentive bonus plan subject to the terms and conditions of the plan.

    **HCSC Employment Statement:**
    We are an Equal Opportunity Employment employer dedicated to providing a welcoming environment where the unique differences of our employees are respected and valued. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other legally protected characteristics.

    **Base Pay Range**
    $130,800.00 - $242,800.00
    Exact compensation may vary based on skills, experience, and location.

    **Join our talent community and receive the latest HCSC news and content, and be first in line for new job opportunities.**
    For more than 80 years, HCSC has been dedicated to expanding access to high-quality, cost-effective health care and equipping our members with information and tools to make the best health care decisions for themselves and their families. As an industry leader, HCSC also has been helping to make the health care system work better for all Americans. To remain a leader, we offer compelling careers that encourage resourcefulness, strategic thought and empower you to make a difference in the lives of our members and their communities. Today, with the industry at an important crossroad, HCSC is reimagining health care and looking for original thinkers who aren't afraid to make innovative contributions. We are an Equal Opportunity Employment employer dedicated to workforce diversity and a drug-free and smoke-free workplace. Learn more about HCSC, our commitment to our members and the opportunity you'll have to improve health care delivery in an open, collaborative environment. HCSC is committed to diversity in the workplace and to providing equal opportunity to employees and applicants. If you are an individual with a disability or a disabled veteran and need an accommodation or assistance in either using the Careers website or completing the application process, you can call us at ************** to request reasonable accommodations. Please note that only **requests for accommodations in the application process** will be returned. All applications, including resumes, must be submitted through HCSC's Career website on-line application process. If you have general questions regarding the status of an existing application, navigate to "candidate home" to view your job submissions. Blue Cross and Blue Shield of Illinois, Blue Cross and Blue Shield of Montana, Blue Cross and Blue Shield of New Mexico, Blue Cross and Blue Shield of Oklahoma, and Blue Cross and Blue Shield of Texas, Divisions of Health Care Service Corporation, a Mutual Legal Reserve Company, and Independent Licensee of the Blue Cross and Blue Shield Association © Copyright 2025 Health Care Service Corporation. All Rights Reserved.
    $130.8k-242.8k yearly 32d ago
  • Principal Data Engineer - ML Platforms

    Altarum (4.5 company rating)

    Remote

    Altarum | Data & AI Center of Excellence (CoE)

    Altarum is building the future of data and AI infrastructure for public health - and we're looking for a Principal Data Engineer - ML Platforms to help lead the way. In this cornerstone role, you will design, build, and operationalize the modern data and ML platform capabilities that power analytics, evaluation, AI modeling, and interoperability across all Altarum divisions. If you want to architect impactful systems, enable data science at scale, and help ensure public health and Medicaid programs operate with secure, explainable, and trustworthy AI - this role is for you.

    What You'll Work On: This role blends deep engineering with applied ML enablement: ML Platform Engineering (modern lakehouse architecture, pipelines, MLOps lifecycle); applied ML enablement (risk scoring, forecasting, Medicaid analytics); NLP/Generative AI support (RAG, vectorization, health communications); causal ML operationalization (evaluation modeling workflows); and Responsible/Trusted AI engineering (model cards, fairness, compliance). Your work ensures that Altarum's public health and Medicaid programs run on secure, scalable, reusable, and explainable data and AI infrastructure.

    What You'll Do

    Platform Architecture & Delivery: Design and operate modern, cloud-agnostic lakehouse architecture using object storage, SQL/ELT engines, and dbt. Build CI/CD pipelines for data, dbt, and model delivery (GitHub Actions, GitLab, Azure DevOps). Implement MLOps systems: MLflow (or equivalent), feature stores, model registry, drift detection, automated testing. Engineer solutions in AWS and AWS GovCloud today, with portability to Azure Gov or GCP. Use Infrastructure-as-Code (Terraform, CloudFormation, Bicep) to automate secure deployments.

    Pipelines & Interoperability: Build scalable ingestion and normalization pipelines for healthcare and public health datasets, including FHIR R4 / US Core (strongly preferred), HL7 v2 (strongly preferred), Medicaid/Medicare claims & encounters (strongly preferred), SDOH & geospatial data (preferred), and survey, mixed-methods, and qualitative data. Create reusable connectors, dbt packages, and data contracts for cross-division use. Publish clean, conformed, metrics-ready tables for Analytics Engineering and BI teams. Support Population Health in turning evaluation and statistical models into pipelines.

    Data Quality, Reliability & Cost Management: Define SLOs and alerting; instrument lineage & metadata; ensure ≥95% of data tests pass. Perform performance and cost tuning (partitioning, storage tiers, autoscaling) with guardrails and dashboards.

    Applied ML Enablement: Build production-grade pipelines for risk prediction, forecasting, cost/utilization models, and burden estimation. Develop ML-ready feature engineering workflows and support time-series/outbreak detection models. Integrate ML assets into standardized deployment workflows.

    Generative AI Enablement: Build ingestion and vectorization pipelines for surveys, interviews, and unstructured text. Support RAG systems for synthesis, evaluation, and public health guidance. Enable Palladian Partners with secure, controlled-generation environments.

    Causal ML & Evaluation Engineering: Translate R/Stata/SAS evaluation code into reusable pipelines. Build templates for causal inference workflows (DID, AIPW, CEM, synthetic controls). Support operationalization of ARA's applied research methods at scale.

    Responsible AI, Security & Compliance: Implement Model Card Protocol (MCP) and fairness/explainability tooling (SHAP, LIME).
Ensure compliance with HIPAA, 42 CFR Part 2, IRB/DUA constraints, and NIST AI RMF standards. Enforce privacy-by-design: tokenization, encryption, least-privilege IAM, and VPC isolation. Reuse, Shared-Services, and Enablement Develop runbooks, architecture diagrams, repo templates, and accelerator code. Pair with data scientists, analysts, and SMEs to build organizational capability. Provide technical guidance for proposals and client engagements. Your First 90 Days - You will make a meaningful impact fast. Expected outcomes include: Platform skeleton operational: repo templates, CI/CD, dbt project, MLflow registry, tests. Two pipelines in production (e.g., FHIR → analytics and claims normalization). One end-to-end CoE lighthouse MVP delivered (ingestion → model → metrics → BI). Completed playbooks for GovCloud deployment, identity/secrets, rollback, and cost control. Success Metrics (KPIs) Pipeline reliability meeting SLA/SLO targets. ≥95% data tests passing across pipelines. MVP dataset onboarding ≤ 4 weeks. Reuse of platform assets across ≥3 divisions. Cost optimization and budget adherence. What You'll Bring 7-10+ years in data engineering, ML platform engineering, or cloud data architecture. Expert in Python, SQL, dbt, and orchestration tools (Airflow, Glue, Step Functions). Deep experience with AWS + AWS GovCloud. CI/CD and IaC experience (Terraform, CloudFormation). Familiarity with MLOps tools (MLflow, Sagemaker, Azure ML, Vertex AI). Ability to operate in regulated environments (HIPAA, 42 CFR Part 2, IRB). Preferred: Experience with FHIR, HL7, Medicaid/Medicare claims, and/or SDOH datasets. Databricks, Snowflake, Redshift, Synapse. Event streaming (Kafka, Kinesis, Event Hubs). Feature store experience. Observability tooling (Grafana, Prometheus, OpenTelemetry). Experience optimizing BI datasets for Power BI. Logistical Requirements At this time, we will only accept candidates who are presently eligible to work in the United States and will not require sponsorship. Our organization requires that all work, for the duration of your employment, must be completed in the continental U.S. unless required by contract. If you're near one of our offices (Arlington, VA; Silver Spring, MD; or Novi, MI), you'll join us in person one day every other month (6 times per year) for a fun, purpose-driven Collaboration Day. These days are filled with creative energy, meaningful connection, and team brainstorming! Must be able to work during Eastern Time unless approved by your manager. Employees working remotely must have a dedicated, ergonomically appropriate workspace free from distractions with a mobile device that allows for productive and efficient conduct of business. Altarum is a nonprofit organization focused on improving the health of individuals with fewer financial resources and populations disenfranchised by the health care system. We work primarily on behalf of federal and state governments to design and implement solutions that achieve measurable results. We combine our expertise in public health and health care delivery with technology development and implementation, practice transformation, training and technical assistance, quality improvement, data analytics, and applied research and evaluation. Our innovative solutions and proven processes lead to better value and health for all. 
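To give a concrete flavor of the FHIR R4 ingestion and normalization work this posting describes, here is a minimal, dependency-free sketch that flattens a FHIR Observation resource into a tabular record. The sample resource and the target column names are illustrative assumptions, not Altarum's actual data model.

```python
# Flatten a FHIR R4 Observation into a row suitable for a staging/analytics table.
# The sample resource and output columns are illustrative only.
from typing import Any

sample_observation: dict[str, Any] = {
    "resourceType": "Observation",
    "id": "obs-001",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "4548-4",
                         "display": "Hemoglobin A1c"}]},
    "subject": {"reference": "Patient/pat-123"},
    "effectiveDateTime": "2024-03-01T09:30:00Z",
    "valueQuantity": {"value": 6.8, "unit": "%"},
}

def flatten_observation(resource: dict[str, Any]) -> dict[str, Any]:
    """Project the fields an analytics table typically needs from an Observation."""
    coding = (resource.get("code", {}).get("coding") or [{}])[0]
    value = resource.get("valueQuantity", {})
    return {
        "observation_id": resource.get("id"),
        "patient_ref": resource.get("subject", {}).get("reference"),
        "loinc_code": coding.get("code"),
        "display": coding.get("display"),
        "value": value.get("value"),
        "unit": value.get("unit"),
        "effective_time": resource.get("effectiveDateTime"),
        "status": resource.get("status"),
    }

print(flatten_observation(sample_observation))
```

In a production pipeline this projection would feed conformed, metrics-ready tables; a purpose-built FHIR library and the program's own data contracts would replace the hand-rolled mapping.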
Altarum is an equal opportunity employer that provides employment and opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, or any other characteristic protected by applicable law.
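The MLOps duties above name MLflow, a model registry, and automated testing. Below is a minimal sketch of that workflow: train a model, log a metric, and register the result. The experiment and model names, the SQLite backend, and the synthetic data are all assumptions made so the example is self-contained, not a description of Altarum's platform.

```python
# Minimal MLflow run + model-registry sketch. Names, data, and the local SQLite
# tracking backend are illustrative assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# A local SQLite backend is enough to exercise the model registry in a demo setting.
mlflow.set_tracking_uri("sqlite:///mlflow.db")
mlflow.set_experiment("medicaid-risk-scoring-demo")  # hypothetical experiment name

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-rf"):
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_metric("test_auc", float(auc))
    # Registering the model makes it visible in the registry for review, promotion,
    # and later drift monitoring.
    mlflow.sklearn.log_model(model, "model", registered_model_name="risk-scoring-baseline")
```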
    $72k-98k yearly est. 14d ago
  • Data Engineer Senior

    Brigham and Women's Hospital 4.6company rating

    Somerville, MA jobs

    Site: Mass General Brigham Incorporated Mass General Brigham relies on a wide range of professionals, including doctors, nurses, business people, tech experts, researchers, and systems analysts to advance our mission. As a not-for-profit, we support patient care, research, teaching, and community service, striving to provide exceptional care. We believe that high-performing teams drive groundbreaking medical discoveries and invite all applicants to join us and experience what it means to be part of Mass General Brigham. Job Summary Summary Responsible for designing, developing, and maintaining the data architecture and infrastructure within an organization. This position plays a crucial role in managing large-scale data systems and ensuring the efficient flow, storage, and accessibility of data for various stakeholders, such as data analysts, data scientists, and business users. Essential Functions * Design, develop, and implement data pipelines and ETL/ELT code to support business requirements. * Work on cross-functional teams delivering enterprise solutions for internal and external clients. * Assume ownership for delivering code revisions and enhancements from design through development and production installation. * Maintain and optimize various components of the data pipeline architecture. * Become subject matter expert for internal and external data products. * Ensure design solutions can scale and meet technical standards and performance benchmarks. * Identify inefficient processes and develop recommendations and design solutions. * Lead code review sessions to validate technical solutions and facilitate knowledge sharing. Qualifications Education Bachelor's Degree Related Field of Study required Can this role accept experience in lieu of a degree? Yes Principal Responsibilities * Works with cross-functional teams to understand functional product requirements and deliver on strategic data and analytics initiatives. * Design, build, test and maintain architectures within the 'Quality Data Hub' of our cloud-based enterprise data warehouse. * Build ETL/ELT ingestions for OCMO related data and handle related monitoring and support duties. * Accrues advanced knowledge of the OCMO product domains and can quickly apply development strategies to meet the stakeholder requirements for the data products. * Define and implement the processes and tooling through which enterprise curated data models are built and maintained. * Partners with solutions architects, analytics engineers, and data visualization developers to understand data extraction and transformation needs and builds a platform and related processes suited to them. * Develop and enforce change management and versioning processes for code promotion. * In collaboration with lead analytic staff, architect and enforce thorough quality assurance, testing and peer review procedures to ensure the accuracy, reliability, and validity of end products. * Helps identify potential bottlenecks within the development lifecycle and propose high-level strategies to overcome them. * Triage, troubleshoot and resolve data issues from end users and internal team members. * Build and foster relationships with senior leadership/physicians and key program stakeholders to understand multifaceted business problems and develop analytical solutions to complex issues to satisfy reporting and analytical needs. 
* Supports other team members with promotion to production, ensures a consistent process across the team; informs other data engineers and domain team leads of new changes to promotion process. * Guide and train data engineers by familiarizing them with the data products they will use and providing them with relevant information about OCMO domains. * Follows MGB Digital guidelines and standard practices to leverage existing tools and methods to efficiently develop modeling solutions, ensure that work is not duplicated, and is appropriately transitioned between teams. * Uses the Mass General Brigham values to govern decisions, actions, and behaviors. These values guide how we get our work done: Patients, Affordability, Accountability & Service Commitment, Decisiveness, Innovation & Thoughtful Risk; and how we treat each other: Diversity & Inclusion, Integrity & Respect, Learning, Continuous Improvement & Personal Growth, Teamwork & Collaboration. * Other duties and responsibilities as assigned. Knowledge, Skills and Abilities * Prior experience working with healthcare data is strongly preferred. * Prior experience working with dbt is strongly preferred. * Ability to function effectively and independently in a fast-paced environment, organize and prioritize work independently, and meet tight deadlines. * Ability to manage multiple projects simultaneously, set priorities, and collaborate with team members and others throughout the organization. * Possess strong interpersonal skills to effectively communicate with cross functional teams including staff at all levels of the organization. * Willing to contribute to and foster a team player culture where all are encouraged and willing to share information accurately. * Able and motivated to mentor/train junior staff members. * Knowledge of agile principles and experience working within an agile team is preferred. * Practical problem-solving abilities, i.e. the ability to formulate hypotheses, test options and move forward in a fast-paced environment. * Excellent interpersonal skills, including strong customer service orientation and the ability to translate complex technical concepts to non-technical audiences. * Advanced SQL DML skills required, with comparable experience in writing data functions (e.g. TSQL procs, Snowflake UDFs, etc.). Experience * Bachelor's or master's degree in computer science, informatics, statistics, or related field and an interest in healthcare and the use of technology to support clinical care. * 5+ years' experience with data modelling, ETL/ELT development, or similar role working with complex SQL queries and data extraction/transformation. * Experience with cloud data warehousing environments such as Microsoft Azure and Snowflake are a plus * Experience with ETL/data modelling tools such as dbt, Informatica, and Ab Initio are a plus. * Must possess a strong background in data warehousing projects. * Must be able to identify, triage, and resolve or dispatch issues. * Must possess strong data analysis skills and be able to perform data analysis using SQL, SAS, or similar query languages. * Must possess strong oral and written communication skills. * Must be capable of working independently with limited to no supervision. * Must be willing to contribute to and foster a team player culture where all are encouraged and willing to share information accurately. 
Additional Job Details (if applicable) Working Model Requirements * Hybrid with onsite work required in office, candidate must be flexible for weekly or monthly business needs * M-F Eastern business hours required * On remote workdays, employees must use a stable, secure, and compliant workstation in a quiet environment. Teams video is required and must be accessed using MGB-provided equipment. Remote Type Hybrid Work Location 399 Revolution Drive Scheduled Weekly Hours 40 Employee Type Regular Work Shift Day (United States of America) Pay Range $92,102.40 - $134,056.00/Annual Grade 7 At Mass General Brigham, we believe in recognizing and rewarding the unique value each team member brings to our organization. Our approach to determining base pay is comprehensive, and any offer extended will take into account your skills, relevant experience if applicable, education, certifications and other essential factors. The base pay information provided offers an estimate based on the minimum job qualifications; however, it does not encompass all elements contributing to your total compensation package. In addition to competitive base pay, we offer comprehensive benefits, career advancement opportunities, differentials, premiums and bonuses as applicable and recognition programs designed to celebrate your contributions and support your professional growth. We invite you to apply, and our Talent Acquisition team will provide an overview of your potential compensation and benefits package. EEO Statement: Mass General Brigham Incorporated is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religious creed, national origin, sex, age, gender identity, disability, sexual orientation, military service, genetic information, and/or other status protected under law. We will ensure that all individuals with a disability are provided a reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. To ensure reasonable accommodation for individuals protected by Section 503 of the Rehabilitation Act of 1973, the Vietnam Veteran's Readjustment Act of 1974, and Title I of the Americans with Disabilities Act of 1990, applicants who require accommodation in the job application process may contact Human Resources at **************. Mass General Brigham Competency Framework At Mass General Brigham, our competency framework defines what effective leadership "looks like" by specifying which behaviors are most critical for successful performance at each job level. The framework is comprised of ten competencies (half People-Focused, half Performance-Focused) and are defined by observable and measurable skills and behaviors that contribute to workplace effectiveness and career success. These competencies are used to evaluate performance, make hiring decisions, identify development needs, mobilize employees across our system, and establish a strong talent pipeline.
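The posting above emphasizes advanced SQL DML and ETL/ELT ingestion work. Below is a self-contained sketch of one common pattern in that space: an idempotent, MERGE-style upsert load. SQLite is used only so the example runs anywhere; the table, columns, and data are hypothetical, and the actual environment described in the posting (Azure, Snowflake) would use MERGE or its own upsert syntax.

```python
# Idempotent, MERGE-style ELT load sketch using SQLite's upsert syntax.
# Table names and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_patient (
        patient_id  TEXT PRIMARY KEY,
        home_zip    TEXT,
        updated_at  TEXT
    );
""")

incoming = [
    ("pat-001", "02145", "2024-05-01"),
    ("pat-002", "02139", "2024-05-01"),
]

# Insert new patients; update existing rows only when the incoming record is newer,
# so re-running the load does not corrupt the dimension.
conn.executemany(
    """
    INSERT INTO dim_patient (patient_id, home_zip, updated_at)
    VALUES (?, ?, ?)
    ON CONFLICT(patient_id) DO UPDATE SET
        home_zip   = excluded.home_zip,
        updated_at = excluded.updated_at
    WHERE excluded.updated_at > dim_patient.updated_at
    """,
    incoming,
)
conn.commit()
print(conn.execute("SELECT * FROM dim_patient ORDER BY patient_id").fetchall())
```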
    $92.1k-134.1k yearly 46d ago
  • Data Engineer

    Certified Laboratories 4.2company rating

    San Antonio, TX jobs

    About the Role We are seeking an experienced Data Engineer / Database Administrator with a strong background in the healthcare or laboratory industry to design, build, and maintain secure, compliant, and high-performance data systems. This role requires expertise in the Microsoft technology stack, including Azure, Synapse Analytics, and SQL Server, as well as a deep understanding of data governance and interoperability in healthcare environments. You will play a key role in developing and optimizing data infrastructure that supports clinical operations, reporting, analytics, and regulatory compliance. Key Responsibilities Data Engineering * Design, build, and maintain robust data pipelines using Azure Data Factory, Synapse Pipelines, and related services to ingest and transform large healthcare datasets. * Develop scalable data warehouse and data lake solutions on Azure Synapse to support analytics, machine learning, and business intelligence initiatives. * Create and maintain data models optimized for performance, reliability, and scalability. * Collaborate with laboratory information systems (LIS), electronic health records (EHR), and other healthcare data sources to ensure seamless integration and interoperability. * Implement data quality, validation, and auditing processes to ensure integrity across all data assets. Database Administration * Administer and optimize Microsoft SQL Server databases across on-premise and cloud environments. * Design and tune indexes, stored procedures, and queries for optimal performance. * Manage database security, roles, and access controls to maintain compliance with HIPAA, HITECH, and organizational standards. * Oversee backup, recovery, and disaster recovery strategies for mission-critical healthcare systems. * Monitor database performance and availability using tools like Azure Monitor, SQL Profiler, and Log Analytics. Collaboration & Strategy * Partner with data scientists, analysts, clinicians, and application developers to support reporting and analytics use cases. * Define and enforce data architecture standards and documentation within the organization. * Stay up to date with Microsoft's evolving data ecosystem, recommending tools and practices that enhance scalability, performance, and compliance. Qualifications * Bachelor's degree in Computer Science, Information Systems, or related field (Master's preferred). * 5+ years of experience as a Data Engineer, Database Administrator, or similar role, preferably in a healthcare or laboratory environment. * Strong proficiency in T-SQL, SSIS, and Azure Synapse Analytics. * Proven experience with Azure Data Factory, Azure Data Lake Storage, Azure SQL Database, and Power BI. * In-depth understanding of data modeling, ETL development, and data warehouse design within Microsoft ecosystems. * Solid grasp of data security, encryption, and compliance standards in healthcare (HIPAA, PHI, HL7, FHIR). * Experience with CI/CD for data solutions using tools such as Azure DevOps or GitHub Actions. * Excellent problem-solving skills, attention to detail, and ability to communicate complex technical concepts to non-technical stakeholders. Preferred Skills * Experience integrating data from LIS, EHR, or LIMS platforms. * Familiarity with Python or .NET for data automation and custom integrations. * Knowledge of Power BI administration and data governance frameworks (Purview, Collibra, etc.). * Exposure to machine learning pipelines in Azure ML or Databricks environments. #LI-Remote
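As a concrete illustration of the LIS/EHR interoperability work this posting calls out (HL7, FHIR), here is a minimal, dependency-free sketch that flattens the OBX (observation) segments of an HL7 v2 result message into rows suitable for a staging table. The sample message and field positions follow the standard OBX layout, but a production pipeline would use a dedicated HL7 library and the site's actual message profiles.

```python
# Flatten HL7 v2 OBX segments into dicts suitable for a staging table.
# The sample ORU message is illustrative only.
SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|LIS|LAB|EHR|HOSP|202401150830||ORU^R01|MSG0001|P|2.5.1",
    "PID|1||123456^^^HOSP^MR||DOE^JANE",
    "OBR|1|||80048^Basic metabolic panel^CPT",
    "OBX|1|NM|2345-7^Glucose^LN||98|mg/dL|70-99|N|||F",
    "OBX|2|NM|2160-0^Creatinine^LN||1.1|mg/dL|0.6-1.2|N|||F",
])

def parse_obx_rows(message: str) -> list[dict]:
    """Return one dict per OBX (observation) segment in the message."""
    rows = []
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        # OBX-3 is the coded observation identifier (code^name^system).
        code, name, system = (fields[3].split("^") + ["", "", ""])[:3]
        rows.append({
            "code": code, "name": name, "system": system,
            "value": fields[5], "units": fields[6], "ref_range": fields[7],
        })
    return rows

for row in parse_obx_rows(SAMPLE_HL7):
    print(row)
```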
    $81k-115k yearly est. 13d ago
  • Data Engineer

    Brigham and Women's Hospital 4.6company rating

    Somerville, MA jobs

    Site: Mass General Brigham Incorporated Mass General Brigham relies on a wide range of professionals, including doctors, nurses, business people, tech experts, researchers, and systems analysts to advance our mission. As a not-for-profit, we support patient care, research, teaching, and community service, striving to provide exceptional care. We believe that high-performing teams drive groundbreaking medical discoveries and invite all applicants to join us and experience what it means to be part of Mass General Brigham. Job Summary The Data Engineer will be a member of the Psychiatry Neuroimaging Laboratory (PNL), an active neuroimaging research group at Brigham and Women's Hospital focused on understanding brain abnormalities and their role in neuropsychiatric disorders using state-of-the-art neuroimaging techniques. The team consists of a dynamic, interdisciplinary and international group investigating the role of brain abnormalities in a variety of brain disorders. This research group is also actively developing new technology to characterize brain structure and function, which has led to the design of state-of-the-art image analysis pipelines capable of robustly processing hundreds of neuroimaging datasets. The group has been continuously funded through grant support from the National Institute of Mental Health, the National Institute of Neurological Disorders and Stroke, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the Department of Defense, the Veterans Medical Administration, and a variety of other public and private foundations. Qualifications Relevant activities include, but are not limited to the following: * Maintain and enhance existing image processing pipelines. * Design new image processing pipelines, with an emphasis on version tracking, data provenance, and high performance computing. * Develop neuroinformatics tools to track data provenance and project management. * Test and evaluate a range of neuroimaging packages to determine their suitability for research goals. * Regular, direct interaction with neuroscientists from within and outside the lab to assist them with neuroimaging data analysis using a range of methods including FSL, SPM, 3DSlicer, MNE and other specialized tools. * Design, implement, test, maintain and support applications to capture, manage, archive and monitor multi-site, multi-modal study data. Applications may include but are not limited to study monitoring systems, data management systems, workflow execution and monitoring systems, interactive viewers, and reporting tools. * Support web application deployment and server configuration. * Support data engineering efforts, including database and API design, data extraction/transformation/load, and data aggregation/integration, graphical user interface design. * Containerize (Docker/Singularity) and deploy software on local high performance computing platforms and cloud computing infrastructure (AWS/Azure). Qualifications: Required: * Bachelor's Degree in Computer Science, Biomedical Engineering, Bioinformatics, Electrical Engineering, Data Science, or a related field. * Excellent programming skills in Python, Bash, MATLAB. * Superior Linux/Unix skills and comfort with command line programs - the ability to get new programs and packages running, overcoming hurdles as they arise, is particularly helpful. * Familiarity with standard software evolution method-version controlling (Git), pull requests, code reviews, issue and release management. 
* Ability to work in an interdisciplinary, diverse, and international team in a highly collaborative and intellectually challenging environment. * Excellent oral and written communication skills. Preferred: * Basic knowledge of neuroscience and neuroanatomy. * Understanding of structural, diffusion, and functional Magnetic Resonance Imaging and electroencephalography (EEG) * Master's degree in Computer Science, Biomedical Engineering, Bioinformatics, Electrical Engineering, Data Science, or a related field. * Familiarity with C/C++ programming. * Experience in neuroimaging software FSL, FreeSurfer, 3DSlicer, DIPY, Nipype, MNE, Neurodocker, REDCap, XNAT. * Experience with database management systems (e.g., SQL, PostgreSQL, MongoDB, CouchDB). * Experience with at least one web framework for building web applications (e.g., Plotly, WordPress, HTML/CSS, React). Additional Job Details (if applicable) Physical Requirements * Standing Occasionally (3-33%) * Walking Occasionally (3-33%) * Sitting Constantly (67-100%) * Lifting Occasionally (3-33%) 20lbs - 35lbs * Carrying Occasionally (3-33%) 20lbs - 35lbs * Pushing Rarely (Less than 2%) * Pulling Rarely (Less than 2%) * Climbing Rarely (Less than 2%) * Balancing Occasionally (3-33%) * Stooping Occasionally (3-33%) * Kneeling Rarely (Less than 2%) * Crouching Rarely (Less than 2%) * Crawling Rarely (Less than 2%) * Reaching Occasionally (3-33%) * Gross Manipulation (Handling) Constantly (67-100%) * Fine Manipulation (Fingering) Frequently (34-66%) * Feeling Constantly (67-100%) * Foot Use Rarely (Less than 2%) * Vision - Far Constantly (67-100%) * Vision - Near Constantly (67-100%) * Talking Constantly (67-100%) * Hearing Constantly (67-100%) Remote Type Onsite Work Location 399 Revolution Drive Scheduled Weekly Hours 40 Employee Type Regular Work Shift Day (United States of America) Pay Range $73,798.40 - $107,400.80/Annual Grade 6 At Mass General Brigham, we believe in recognizing and rewarding the unique value each team member brings to our organization. Our approach to determining base pay is comprehensive, and any offer extended will take into account your skills, relevant experience if applicable, education, certifications and other essential factors. The base pay information provided offers an estimate based on the minimum job qualifications; however, it does not encompass all elements contributing to your total compensation package. In addition to competitive base pay, we offer comprehensive benefits, career advancement opportunities, differentials, premiums and bonuses as applicable and recognition programs designed to celebrate your contributions and support your professional growth. We invite you to apply, and our Talent Acquisition team will provide an overview of your potential compensation and benefits package. EEO Statement: Mass General Brigham Incorporated is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religious creed, national origin, sex, age, gender identity, disability, sexual orientation, military service, genetic information, and/or other status protected under law. We will ensure that all individuals with a disability are provided a reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. 
To ensure reasonable accommodation for individuals protected by Section 503 of the Rehabilitation Act of 1973, the Vietnam Veteran's Readjustment Act of 1974, and Title I of the Americans with Disabilities Act of 1990, applicants who require accommodation in the job application process may contact Human Resources at **************. Mass General Brigham Competency Framework At Mass General Brigham, our competency framework defines what effective leadership "looks like" by specifying which behaviors are most critical for successful performance at each job level. The framework is comprised of ten competencies (half People-Focused, half Performance-Focused) and are defined by observable and measurable skills and behaviors that contribute to workplace effectiveness and career success. These competencies are used to evaluate performance, make hiring decisions, identify development needs, mobilize employees across our system, and establish a strong talent pipeline.
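The PNL posting above centers on command-line neuroimaging tools, pipeline design, and data provenance. Below is an illustrative sketch of one way such a step might be wrapped: calling FSL's `bet` brain-extraction tool via subprocess and writing a small provenance sidecar next to the output. The paths are hypothetical and FSL is assumed to be installed and on PATH; the lab's actual pipelines may use Nipype or similar frameworks instead.

```python
# Wrap a command-line neuroimaging step (FSL `bet`) with a simple provenance record.
# Paths are hypothetical; FSL must be installed and on PATH.
import json
import shutil
import subprocess
from datetime import datetime, timezone
from pathlib import Path

def run_brain_extraction(t1_image: Path, out_dir: Path, frac: float = 0.5) -> Path:
    """Run `bet` on a T1-weighted image and write a JSON provenance sidecar."""
    if shutil.which("bet") is None:
        raise RuntimeError("FSL `bet` was not found on PATH")
    out_dir.mkdir(parents=True, exist_ok=True)
    output = out_dir / t1_image.name.replace(".nii.gz", "_brain.nii.gz")
    cmd = ["bet", str(t1_image), str(output), "-f", str(frac)]
    subprocess.run(cmd, check=True)

    provenance = {
        "tool": "fsl-bet",
        "command": cmd,
        "inputs": [str(t1_image)],
        "outputs": [str(output)],
        "ran_at": datetime.now(timezone.utc).isoformat(),
    }
    prov_path = Path(str(output) + ".prov.json")
    prov_path.write_text(json.dumps(provenance, indent=2))
    return output

# Example call with hypothetical paths:
# run_brain_extraction(Path("/data/sub-01/anat/sub-01_T1w.nii.gz"), Path("/data/derivatives/bet"))
```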
    $73.8k-107.4k yearly 6d ago
  • Data & BI Engineer

    L R S 4.3company rating

    Springfield, MO jobs

    LRS is seeking a skilled Data & BI Engineer to design, build, and maintain scalable data solutions that power analytics and reporting across the organization. This hybrid role combines data architecture and engineering (integration, pipelines, modeling) with BI development (dashboards, visualizations, and insights). The ideal candidate is comfortable working across the full data stack and collaborating with business stakeholders to deliver actionable intelligence. This is an in-office position based out of our headquarters in Springfield, Illinois. Requirements 5+ years of experience in data engineering and data analytics. Proficiency in SQL Server and T-SQL. Experience with data modeling (star/snowflake schemas) and ETL/ELT processes. Experience with BI tools. Strong understanding of data governance, security, and performance optimization. Excellent communication and stakeholder engagement skills. The following will make you a stronger candidate Experience working with data warehouses. Experience with Power BI, DAX, and Power Query. Familiarity with Microsoft Fabric. Key Responsibilities Design and implement data pipelines using modern ETL/ELT tools. Develop and maintain semantic data models optimized for reporting and analytics. Build compelling Power BI dashboards and reports with functional, user-friendly visuals. Collaborate with business units to understand data needs and translate them into technical solutions. Ensure data quality, integrity, and governance across systems. Optimize performance of data solutions and BI assets. Support data integration across cloud and on-premises systems. Document architecture, data flows, and reporting logic. Success Factors The successful candidate will demonstrate expertise across the data stack, delivering reliable, high-quality data solutions and actionable insights. Success in this role will be measured by your ability to collaborate with business stakeholders, optimize data-driven processes, and drive impactful analytics initiatives. Organization Structure The LRS IT team consists of a Chief Information Officer, Director of IT, Director of Applications, Director of Information Security, and teams for networking, infrastructure, cloud, communications, end-user services, and applications. The team is based in Springfield, IL and manages the global operations at LRS. You will report to the Chief Information Officer. LRS is an equal opportunity employer. Applicants for employment will receive consideration without unlawful discrimination based on race, color, religion, creed, national origin, sex, age, disability, marital status, gender identity, domestic partner status, sexual orientation, genetic information, citizenship status or protected veteran status. Salary Range: $90,000-$130,000. This salary range represents the low and high end for this position. The salary will vary depending on factors including experience and skills. The range listed is just one component of LRS' total employee compensation, as we have a generous benefits package.
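To make the star-schema modeling in this posting concrete, here is a compact sketch: one fact table keyed to two dimensions, queried the way a BI semantic model or report would aggregate it. SQLite keeps the example self-contained; the tables, data, and names are illustrative, and the production environment named in the posting would be SQL Server or Microsoft Fabric with Power BI on top.

```python
# A tiny star schema (one fact, two dimensions) and the kind of aggregate query a
# BI report would ultimately issue. Tables and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );

    INSERT INTO dim_date    VALUES (20240101, '2024-01'), (20240201, '2024-02');
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales  VALUES
        (20240101, 1, 10, 250.0),
        (20240101, 2,  4, 180.0),
        (20240201, 1,  7, 175.0);
""")

rows = conn.execute("""
    SELECT d.calendar_month, p.product_name, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.calendar_month, p.product_name
    ORDER BY d.calendar_month, p.product_name
""").fetchall()
print(rows)
```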
    $90k-130k yearly 60d+ ago
  • Data Platform Engineer

    Monogram Health 3.7company rating

    Brentwood, TN jobs

    Data Platform Engineer The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions. Responsibilities * Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms. * Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks. * Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark. * Develop and manage data models, data warehousing solutions, and data integration architectures in Azure. * Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems. * Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration. * Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases. * Ensure data quality, governance, and security across the data lifecycle. * Collaborate with product managers by estimating technical tasks and deliverables. * Uphold the mission and values of Monogram Health in all aspects of your role and activities. Position Requirements * A bachelor's degree in computer science, data science, software engineering or related field. * Minimum of five (5) years in designing and hands-on development in cloud-based analytics solutions, which includes a minimum of three (3) years' hands on work with big data frameworks and tools, such as Apache Kafka and Spark. * Expert level knowledge of Python or other scripting languages required. * Proficiency in SQL and other data query languages. * Understanding of data modeling and schema design principles * Ability to work with large datasets and perform data analysis * Designing and building data integration pipelines using API's and Streaming ingestion methods is desirable. * Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC). * Thorough understanding of Azure Cloud Infrastructure offerings. * Demonstrated problem-solving and troubleshooting skills. * Team player with demonstrated written and communication skills. Benefits * Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts * Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources * Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave * Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. 
Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders. Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
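The posting above combines Kafka, Spark, and Databricks cluster work. Here is a hedged sketch of what that streaming ingestion can look like: reading JSON events from a Kafka topic with Spark Structured Streaming and appending them to a Delta table. The broker, topic, schema, and paths are hypothetical, and running it requires a Spark environment with the Kafka connector and Delta Lake available (for example, Databricks).

```python
# Structured Streaming sketch: Kafka -> parsed JSON -> Delta table (append).
# Broker, topic, schema, and storage paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("vitals-ingest").getOrCreate()

event_schema = StructType([
    StructField("patient_id", StringType()),
    StructField("measure", StringType()),
    StructField("value", StringType()),
    StructField("recorded_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "patient-vitals")             # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/patient_vitals")  # hypothetical path
    .outputMode("append")
    .start("/mnt/delta/patient_vitals")                               # hypothetical path
)
# query.awaitTermination()  # keep the stream running in a real job
```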
    $75k-103k yearly est. 47d ago
  • Data Scientist

    Signature Science, LLC 4.4company rating

    New Jersey jobs

    The primary purpose of this position is to serve as the data scientist with a split portfolio between the Atlantic City office and the Austin chemistry group.
    Essential Duties and Responsibilities:
    Performs data analytics, specifically data clean-up, data processing, predictive modeling, chemometric statistical modeling and analysis, multivariate data analysis, machine learning, and/or data mining, as related to scientific data.
    Applies technical skills to plan and execute assigned project work including development of computational models, programming of detection algorithms, and machine learning.
    Maintains operational capabilities of computation assets as needed by project requirements.
    Leads meetings with company clients by preparing and presenting meeting materials.
    Appropriately annotates project-developed computer code through comments and user manuals.
    Presents technical results through the drafting of technical reports.
    Presents experimental results and recommended actions at internal project meetings.
    Supports business development efforts as needed by drafting technical sections of proposals, providing proposal review, assessing levels of effort required to complete proposed work, and brainstorming technical solutions to client problems.
    Other duties as assigned.
    Required Knowledge, Skills & Abilities:
    Ability to plan a sequence of experiments to answer complicated technical questions
    Ability to lead a group of co-workers in execution of a task
    Software programming proficiency with Java, C, R, Python, and/or MATLAB
    Working knowledge of statistics as it applies to scientific data
    Ability to communicate technical information to non-technical audiences
    Team player with a positive attitude
    Department of Homeland Security suitability
    Department of Defense Secret clearance
    Working knowledge of software development practices including Agile development and Git version control
    Sufficient business knowledge to support proposal efforts
    Education/Experience: Incumbent professional should have a Ph.D. or master's degree in a physical science (preferably chemistry), statistics, or data science and significant experience in computer programming, computational modeling, or software development.
    Certificates and Licenses: None
    Clearance: The ability to obtain a Secret clearance and Department of Homeland Security suitability is required for this position.
    Supervisory Responsibilities: The incumbent professional may oversee junior-level staff members performing tasks.
    Working Conditions/Equipment: The incumbent professional is expected to work and/or be available during regular business hours. He/she should also generally be available via e-mail or phone during non-business hours as needed to address critical issues or emergencies. He/she may be required to travel on behalf of the company up to 25%.
    The above job description is not intended to be an all-inclusive list of duties and standards of the position. Incumbents will follow any other instructions and perform any other related duties, as assigned by their supervisor.
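To illustrate the chemometric and multivariate modeling this posting mentions, here is a minimal sketch that fits a partial least squares (PLS) regression to simulated "spectra" and reports cross-validated performance. The data are synthetic and the component counts arbitrary; real work would use instrument spectra and domain-driven preprocessing.

```python
# PLS regression on simulated spectra: a toy stand-in for chemometric modeling.
# All data here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 300

# Simulate spectra driven by two latent chemical components plus noise.
latent = rng.normal(size=(n_samples, 2))
loadings = rng.normal(size=(2, n_wavelengths))
spectra = latent @ loadings + rng.normal(scale=0.2, size=(n_samples, n_wavelengths))
concentration = latent @ np.array([1.5, -0.8]) + rng.normal(scale=0.1, size=n_samples)

pls = PLSRegression(n_components=2)
scores = cross_val_score(pls, spectra, concentration, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```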
    $72k-99k yearly est. 24d ago

Learn more about Zelis jobs