Senior Software Engineer - Full Stack & DevOps
Huntington Beach, CA jobs
We're seeking a Senior Software Engineer who thrives at the intersection of application development and DevOps. You'll design, build, and deploy scalable SaaS solutions for Medicare and Medicaid health plans, while also contributing to the automation, reliability, and security of our development lifecycle. This role is central to delivering high-quality features for our Compliance, Appeals & Grievances, and Universe Scrubber products.
Key Responsibilities:
· Application Development
Design and implement backend services, APIs, and user interfaces using modern frameworks and cloud-native architecture. Ensure performance, scalability, and maintainability across the stack.
· DevOps Integration
Collaborate with infrastructure and DevOps teams to build and maintain CI/CD pipelines, automate deployments, and optimize environment provisioning across development, QA, and production.
· Cloud-Native Engineering
Develop and deploy applications on AWS, leveraging services like Lambda, ECS, RDS, and S3 (a minimal illustrative sketch follows this list). Ensure solutions are secure, resilient, and compliant with healthcare regulations.
· Quality & Compliance
Write clean, testable code and participate in peer reviews, unit testing, and performance tuning. Ensure all software adheres to CMS, HIPAA, and internal compliance standards.
· AI-Enabled Features
Support integration of AI/ML capabilities into product workflows, such as intelligent routing of grievances or automated compliance checks.
· Mentorship & Collaboration
Provide technical guidance to junior engineers and collaborate with cross-functional teams to translate healthcare business needs into technical solutions.
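To make the cloud-native responsibility above more concrete, here is a minimal, illustrative Python sketch of an AWS Lambda handler that fetches a document from S3 and runs a placeholder check. The event shape, bucket, and "compliance check" below are hypothetical stand-ins, not details of the actual products.

```python
# Illustrative only: a minimal Lambda handler that pulls a document from S3
# and runs a placeholder compliance check. Bucket, key fields, and the check
# itself are hypothetical stand-ins.
import json

import boto3

s3 = boto3.client("s3")  # AWS SDK client; credentials come from the Lambda execution role


def lambda_handler(event, context):
    # Assume the triggering event carries the bucket and object key (hypothetical shape).
    bucket = event["bucket"]
    key = event["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read().decode("utf-8")

    # Placeholder "compliance check": flag documents missing a required marker.
    compliant = "MEMBER_ID" in body

    return {
        "statusCode": 200,
        "body": json.dumps({"key": key, "compliant": compliant}),
    }
```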
Qualifications:
Bachelor's degree in computer science or related field
5+ years of experience in software development, with exposure to DevOps practices
Proficiency in languages such as Java, Python, or C#, and experience with cloud platforms (preferably AWS)
Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions), infrastructure-as-code (e.g., Terraform, Ansible), and containerization (e.g., Docker, Kubernetes)
Understanding of healthcare data formats (EDI, HL7, FHIR) and regulatory frameworks
Senior Full-Stack Golang Developer
Houston, TX jobs
Job Title: Senior Full-Stack Golang Developer
Company: Aarista/ Altea Healthcare IT
Job Type: Full-Time
Compensation Range: $110,000-$140,000 USD depending on experience
Our mission is to improve outcomes for chronic care patients who depend on multiple daily medications. Our proprietary, vertically integrated EMR technology solutions enable providers to enhance medication adherence through improved access, an owned physician network, and information.
We are looking for a Senior Full-Stack Developer and Lead. This person will play a key role on the core development team supporting and building our next-generation suite of products, including our Revenue Cycle Management (RCM) system. As a member of the core development team, this person will contribute significantly to designing and implementing product features. In addition to bringing experience building on the Microsoft stack, this role will also require learning and implementing solutions using other technologies on an as-needed basis. We are an exciting healthcare startup, so we need someone who is agile, since change is expected.
Your Role
Support, design, and develop RCM software covering the full stack: Golang, React (TypeScript), MongoDB, Azure Databricks, and Azure Data Lake
Brainstorm with your team to conceptualize and build new features.
Apply your experience with Azure-based infrastructure and help us leverage cloud technologies so we can scale in line with customer adoption.
Partner with business analysts and other developers to fully understand product requirements and implement solutions that meet them.
Provide technical leadership, including architecture design, coding, code reviews, and practice and skills development.
You
You thrive in a team environment but can also work independently.
You are passionate about using your technical knowledge and skills to solve real business problems and are motivated by understanding the value that your work adds.
A self-starter who can manage their own workload and an ever-growing task list.
A team player and leader.
Problem-solving of potential roadblocks that could impact the patient care, strategic, and technical goals of the business.
Very proficient with server-side development in Golang.
Proficient with front-end development using React, TypeScript, and JavaScript.
Knowledge of Azure Databricks and Azure Data Lake.
Working knowledge of relational databases such as SQL Server and Azure SQL.
You are passionate about creating innovative and exciting new technology and want to provide end users with the best possible experience.
Have experience with the software development lifecycle (SDLC), including requirements gathering, architecture, design, development, testing, maintenance, and enhancement across a variety of technologies.
Skills
Required Experience:
Golang
React front end - TypeScript and JavaScript
MongoDB
Azure Databricks
Azure Data Lake
Solid web services: RESTful and SOAP
Nice to have:
MS SQL, Azure SQL (SQL Server)
Data modeling, UML and Design Patterns
Azure experience
Job Types: Full-time
Pay: $110,000-$140,000 USD depending on experience
Schedule: Full Time
Senior Frontend Developer
Irving, TX jobs
We are looking for an experienced Frontend Developer (ReactJS) to join our Digital Engineering Division.
Who are we?
For the past 20 years, we have powered many Digital Experiences for the Fortune 100. Since 1999, we have grown from a few people to more than 6,000 team members across the globe engaged in various Digital Modernization initiatives.
For a brief 1 minute video about us, you can check *****************************
What are we looking for?
Develop, test, and deploy responsive web applications using React.js and modern JavaScript frameworks.
Collaborate with UI/UX designers and backend developers to deliver seamless user experiences.
Optimize applications for performance, scalability, and maintainability.
Build reusable components and front-end libraries for future use.
Integrate RESTful APIs and work closely with backend teams to ensure efficient data flow.
Debug, troubleshoot, and resolve production issues.
Stay updated with emerging technologies and industry trends to improve development practices
Senior Data Engineer
Chicago, IL jobs
Midtown is seeking a Senior Data Engineer to join our world-class team at our Chicago headquarters.
The team is based in our HQ office in Chicago (3611 N Kedzie Ave.) and supports all club locations.
The role is hybrid: primarily work-from-home, with a requirement to come into the Chicago office two days per week (Monday/Tuesday).
The position is based in the Chicago area and will involve very limited travel to Midtown club locations
About Our Company
We work at Midtown to inspire people to transform their lives, and we do our job well. Our members stay longer than any other major athletic club chain in North America because we are committed to providing resort-like environments, personal attention, and strong communities at every one of our clubs. We believe all three of those pillars start with attracting and growing rock star talent at every level of our organization.
Who We Want
We are looking for people that share our core values: kind individuals who want to win together, see things as the glass half full, are passionate about helping others, and strive to always be better than yesterday.
The Position
The Senior Data Engineer is a key leader in advancing Midtown's enterprise data architecture and driving innovation across reporting, machine learning, and AI initiatives. They will design and implement scalable, secure, and high-performing data pipelines and infrastructure that support a wide range of data-driven use cases. Collaborating closely with analysts and business stakeholders, they will ensure the availability and usability of data for analytics and modeling, while contributing to the organization's long-term data strategy. This role is instrumental in enabling actionable insights and fostering a modern, resilient data ecosystem.
The primary responsibilities are:
Architect and evolve data infrastructure to support enterprise reporting, ML, and AI use cases.
Design, build, and maintain data pipelines that ingest, transform, and deliver data from diverse sources (see the sketch after this list).
Implement best practices for data governance, security, and compliance.
Continuously improve data engineering processes, including automation, orchestration, and monitoring.
Mentor and guide engineering peers on architecture standards and development practices.
Collaborate with analysts and stakeholders to ensure data usability and accessibility.
Stay current with industry trends in cloud platforms, AI/ML tooling, and the modern data stack.
Support reporting and visualization efforts through semantic layers and curated datasets.
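As a hedged illustration of the pipeline work described above, the sketch below shows a minimal PySpark job that ingests a raw file, applies a simple transformation, and writes a curated dataset. All paths, column names, and the Databricks-style /mnt locations are hypothetical placeholders.

```python
# Illustrative sketch only: ingest a raw CSV, apply a simple transformation,
# and write a curated Parquet dataset. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("member_visits_curation").getOrCreate()

# Ingest: raw club visit data (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/raw/club_visits.csv")

# Transform: normalize the timestamp, derive a visit_date column, drop duplicates.
curated = (
    raw.withColumn("visit_ts", F.to_timestamp("visit_ts"))
       .withColumn("visit_date", F.to_date("visit_ts"))
       .dropDuplicates(["member_id", "visit_ts"])
)

# Deliver: write a curated dataset partitioned for downstream reporting.
curated.write.mode("overwrite").partitionBy("visit_date").parquet("/mnt/curated/club_visits")
```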
Qualifications & Experience:
Bachelor's degree in Engineering, Information Technology, or equivalent experience.
5+ years of experience designing and implementing data solutions on Microsoft Azure (e.g., Azure Data Factory, Databricks, Power Platform).
Strong proficiency in SQL, Python, and cloud-native data tools.
Experience with data modeling, ETL/ELT pipelines, and modern data warehousing.
Familiarity with ML/AI workflows (e.g., feature engineering, model deployment, data versioning) is a plus.
Experience with BI tools such as Power BI; report writing experience is a plus.
Working knowledge of agile development methodologies.
Proficiency with collaboration tools (e.g., Azure DevOps, JIRA, MS Teams).
Strong communication, problem-solving, and multitasking skills.
Eager to work with multiple teams and projects at the same time.
Associate Benefits
Team Members of the Midtown team receive:
Complimentary club membership
Discounts on Midtown products and services
Access to hundreds of free courses for professional development
Health insurance for eligible full-time associates (30+ hours a week)
Salary Range
$130,000 - $150,000. Actual compensation will depend on experience and/or additional skills you bring to the table.
Benefits
Please refer to the link here for a copy of benefits and perks offered by Midtown for our full and part time associates. You may also visit: **********************************************
This job description is intended to describe the general requirements for the position. It is not a complete statement of duties, responsibilities or requirements. Other duties not listed here may be assigned as necessary to ensure the proper operations of the department.
MIDTOWN is an Equal Opportunity Employer.
Principal Data Engineer
Ballwin, MO jobs
We are seeking an experienced professional who will serve as the Principal Data Engineer on our Data Platforms & Insights team. The Principal Data Engineer serves as a senior technical leader within the Data Platforms & Insights team, responsible for architecting, developing, and maintaining scalable data solutions that support enterprise-wide analytics, reporting, and data management initiatives. This role drives the design and implementation of robust data pipelines, ensures data quality and governance, and enables self-service analytics through a "Data as a Service" model. The Principal Data Engineer collaborates closely with cross-functional teams, business stakeholders, and third-party service providers to deliver high-impact data solutions, while also mentoring and supervising Data Engineers to uphold engineering standards and best practices.
ESSENTIAL DUTIES AND RESPONSIBILITIES
* Design, develop, and maintain scalable and efficient data pipelines using ETL tools and programming languages
* Develop integration solutions leveraging APIs to enable seamless communication between systems (a minimal sketch follows this list).
* Analyze data elements, data flows, dependencies, and relationships across various systems, and assist in designing conceptual, physical, and logical data models
* Implement data solutions across on-prem and cloud environments, ensuring performance, reliability, and scalability
* Ensure all data pipelines follow established data governance rules for data quality and completeness
* Maintain and evolve existing monitoring, logging, and alerting frameworks for proactively managing and troubleshooting data pipelines
* Manage source code repositories and deployment processes using modern tools
* Utilize Infrastructure as Code (IaC) tools to automate and manage infrastructure provisioning
* Work within Agile development framework to understand and transform business requirements into scalable and manageable solutions
* Work with various business and technical stakeholders and assist with data-related technical needs and issues
* Partner with leadership to define and evolve the long-term data architecture and engineering strategy, ensuring alignment with business goals
* Present solutions and options to leadership, project teams and other stakeholders adapting style to both technical and non-technical audiences
* Establish and enforce documentation standards for data pipelines, schemas, and infrastructure
* Ensures data engineers and other technical teams adhere to documented design and development patterns and standards
* Conduct code reviews and provide guidance to other developers, fostering growth and development within the team
* Proactively monitor and resolve ongoing production issues in data pipelines, databases, and infrastructure
* Educate organization on latest trends and technologies in data engineering, APIs, and streaming data
* Lead team on establishing industry best practices in data engineering to ensure high-quality deliverables
* Adheres to all safety policies and procedures in performing job duties and responsibilities while supporting a culture of high quality and great customer service.
* Performs other duties that may be necessary or in the best interest of the organization.
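To illustrate the API integration and monitoring duties above, here is a minimal, hedged Python sketch that pulls records from a hypothetical REST endpoint and lands them in a relational staging table, with basic logging. sqlite3 stands in for the real warehouse target; the endpoint, table, and record shape are assumptions.

```python
# Illustrative sketch only: pull records from a hypothetical REST API and land
# them in a staging table, with logging so the load can be monitored.
import logging
import sqlite3

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("patient_feed_load")

API_URL = "https://api.example.com/v1/patients"  # hypothetical endpoint


def load_patients(db_path: str = "staging.db") -> int:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()  # assume a list of {"id": ..., "name": ...} objects

    conn = sqlite3.connect(db_path)
    with conn:  # commits on success
        conn.execute("CREATE TABLE IF NOT EXISTS stg_patients (id TEXT PRIMARY KEY, name TEXT)")
        conn.executemany(
            "INSERT OR REPLACE INTO stg_patients (id, name) VALUES (?, ?)",
            [(r["id"], r["name"]) for r in records],
        )
    conn.close()
    log.info("Loaded %d patient records", len(records))
    return len(records)


if __name__ == "__main__":
    load_patients()
```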
QUALIFICATIONS
* Demonstrated ability to work efficiently and effectively in a fast-paced, matrixed environment, and ability to execute despite ambiguity
* Previous experience with a Healthcare company preferred
* Enjoys learning new technologies and systems
* Exhibits a positive attitude and is flexible in accepting work assignments and priorities
* Interpersonal skills to support customer service, functional, and teammate support needs
* Knowledge of state and federal regulations for this position; general understanding of HIPAA guidelines
SUPERVISORY RESPONSIBILITIES
* Directly supervises Data Engineers on the Data Platforms & Insights team
* Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws.
* Responsibilities include interviewing, hiring, and training employees, planning, assigning, and directing work; appraising performance, rewarding and disciplining employees, addressing complaints and resolving problems.
EDUCATION AND/OR EXPERIENCE
* Minimum Required: B.S. or B.A., preferably in a STEM (Science, Technology, Engineering, Math) field
* Minimum Required: 10+ years of hands-on experience in the design, development, and implementation of data solutions
LICENSES AND CREDENTIALS
* Minimum Required: None
SYSTEMS AND TECHNOLOGY
* Proficient in Microsoft Excel, Word, PowerPoint, Outlook
* Experience working with the following:
* Snowflake development and support
* Advanced SQL knowledge with strong query writing skills
* Object-oriented/object function scripting languages: Python, Java, Scala, etc.
* AWS cloud services: EC2, EMR, RDS, DMS
* Relational databases such as SQL Server and object relational databases such as PostgreSQL
* Data analysis, ETL, and workflow automation
* Multiple ETL/ELT tools and cloud-based data hubs such as Fivetran
* Stream-processing systems: Kafka, Spark Streaming, etc. (see the sketch after this list)
* Source code management and deployment tools (e.g., Git, Jenkins, dbt, Docker).
* Infrastructure as Code (IaC) tools (e.g., Terraform, Ansible, CloudFormation)
* Enterprise MDM solutions
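As a brief, illustrative companion to the stream-processing item above, the sketch below consumes messages from a Kafka topic with the kafka-python package and applies a trivial completeness check. The topic name, broker address, and message shape are hypothetical.

```python
# Illustrative sketch only: consume a stream of events and apply a trivial
# quality check before handing records downstream.
import json

from kafka import KafkaConsumer  # assumes the kafka-python package is installed

consumer = KafkaConsumer(
    "claims-events",                      # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Basic completeness check before the record moves downstream.
    if not event.get("claim_id"):
        print("dropping event with missing claim_id:", event)
        continue
    print("processing claim", event["claim_id"])
```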
LOCATION
* This position is located in St Louis, Missouri and offers a hybrid work schedule. Candidates living in Alabama, Arizona, Florida, Georgia, Illinois, Indiana, Kansas, Kentucky, Michigan, Minnesota, Missouri, New Jersey, N. Carolina, Ohio, Oklahoma, Pennsylvania, Texas and Virginia may also be considered for remote work.
If you need assistance with this application, please contact **************. Please do not contact the office directly - only resumes submitted through this website will be considered.
EyeCare Partners is an equal opportunity/affirmative action employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.
Consultant, Quality Improvement & Data Management
Hutchinson, MN jobs
Hutchinson Health is seeking a skilled Quality Improvement & Data Management Consultant to lead moderate to complex projects aimed at enhancing performance and supporting regional and departmental strategic goals. In this role, you will provide expertise in quality improvement methods, data analysis, change management, and team facilitation within HealthPartners, primarily focusing on Hutchinson Health and Olivia Hospital and Clinics. The ideal candidate will have a Bachelor's degree in a relevant field, at least 3 years of healthcare quality improvement experience, and proficiency in Lean, Six Sigma, and PDSA methodologies. To be successful in this role, qualified individuals will possess strong leadership, multitasking, technology, and self-starting skills. Join us in driving continuous improvement and delivering high-quality care to the Central MN community.
This position will be based on-site primarily at Hutchinson Health and Olivia Hospital and Clinics, but will also include time at other HealthPartners locations depending on need.
Job Summary:
Provides quality improvement and data expertise acting as a consultant in performance improvement methods, systems thinking, change management, team facilitation, and data collection and analysis. Manages all aspects of mid-sized projects in support of regional or departmental strategic goals. Provides expertise and facilitates development of standardized approaches to create performance improvement plans, define appropriate tools, methodologies and metrics, analyze and interpret data, manage change and facilitate improvement teams. Mentors and coaches individuals and teams in improvement methods, project management, change management, group dynamics and planning methods. Actively partners with leaders to select and implement solutions and develop appropriate monitors and control plans to ensure implementation and hardwiring of improvement/change. Creates and presents project status updates to senior leadership. Identifies and removes barriers to project success or escalates to leadership when appropriate.
Essential Duties and Responsibilities:
Acts as quality consultant, project manager and facilitator for mid-sized to complex projects that support the organization's mission, vision and strategic priorities.
Develops and supports a standardized performance improvement approach to influence the overall Central MN Performance Improvement culture.
Identifies and develops recommendations and material for educational and communication needs in the Quality Performance Improvement department and throughout the Central MN Region.
Establishes appropriate measurement and data monitoring approach to achieve desired results.
Supports local leaders in the identification of data sources/appropriate reports, including serving as a liaison to the HealthPartners system data teams when new report builds are required to evaluate a local improvement initiative.
Prepares charts, tables, and diagrams to assist others in conducting second-level analysis and/or in problem-solving (see the sketch after this list).
Partners with the Quality Director and other leaders to design reports and scorecards for local leaders/committees. Assists in ensuring that any quality metrics required by accrediting/regulatory bodies (e.g., Joint Commission) are available to appropriate stakeholders.
Performs all other related duties as assigned.
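As a small, hedged illustration of the measurement and charting work described above, the Python sketch below draws a simple run chart of a monthly quality metric against its median, a common PDSA-style visual. The metric and all data points are invented for demonstration.

```python
# Illustrative sketch only: a run chart of a made-up monthly quality metric
# with its median, the kind of visual often used in PDSA-style improvement work.
import statistics

import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug"]
fall_rate = [3.1, 2.8, 3.4, 2.6, 2.2, 2.4, 1.9, 2.0]  # hypothetical falls per 1,000 patient days

median = statistics.median(fall_rate)

plt.plot(months, fall_rate, marker="o", label="Falls per 1,000 patient days")
plt.axhline(median, linestyle="--", label=f"Median = {median:.1f}")
plt.title("Run chart: inpatient fall rate (illustrative data)")
plt.ylabel("Rate")
plt.legend()
plt.tight_layout()
plt.savefig("fall_rate_run_chart.png")
```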
Accountabilities for All Employees:
Adheres to the Hutchinson Health Employee Values.
Maintains confidentiality of the organization and patients.
Reports any health/medical errors.
Observes all Environment of Care policies and reports safety risks or hazards immediately.
Education, Training or Degree Required:
Bachelor's degree required (BA/BS), preferably in business, nursing, operations management, industrial engineering, health care, statistics, or a related discipline.
3 years of clinical or quality improvement experience in the healthcare industry; Master's-level coursework may substitute for years of experience.
Previous project management/quality improvement/data management experience.
License/Registration/Certification: (will be primary source verified by Human Resources)
Green Belt certification, Lean or Six Sigma training and certification, or similar preferred
Experience and Skills: (indicate preferred or required)
Required:
Demonstrated experience in quality improvement methods (Lean, Six Sigma, and PDSA (Plan, Do, Study, Act) processes, A3 thinking), measurement definition and analysis, team facilitation and project management.
Proficiency with Microsoft Office applications, including Excel, Word, and PowerPoint, and various project management tools, including flowcharting.
Knowledge of The Joint Commission (TJC) and Centers for Medicare & Medicaid Services (CMS) standards.
Exceptional organizational capabilities and prioritization skills.
Proficient in preparing, leading and facilitating meetings, bringing teams to decisions in facilitating improvement sessions and/or workgroups.
Proficient in tracking and reporting project or initiative progress.
Strong change management, interpersonal communication, and negotiation/conflict management skills.
Preferred:
System thinking/Change management coursework or experience
Experience working in a matrix organization
Experience with Epic
Previous experience in a licensed clinical position helpful
Data Engineer
Orange, CA jobs
Data Engineer
External Description:
Data Engineer
Alignment Healthcare was founded with a mission to revolutionize health care with a serving heart culture. Through its unique integrated care delivery models, deep physician partnerships and use of proprietary technologies, Alignment is committed to transforming health care one person at a time.
By becoming a part of the Alignment Healthcare team, you will provide members with the quality of care they truly need and deserve. We believe that great work comes from people who are inspired to be their best. We have built a team of talented and experienced people who are passionate about transforming the lives of the seniors we serve. In this fast-growing company, you will find ample room for growth and innovation alongside the Alignment community.
Position Summary:
Alignment Healthcare is a data- and technology-driven healthcare company focused on partnering with health systems, health plans, and provider groups to provide care delivery that is preventive, convenient, coordinated, and that results in improved clinical outcomes for seniors.
We are experiencing rapid growth (backed by top private equity firms), and our Data Services and BI team is looking for the best and brightest leaders. Data drives the way we make decisions. We love our customers, and understanding them better makes it possible to provide the best clinical outcomes and care experience.
This position will play a key role in building and operating a cloud-based data platform and its pipelines using big data technologies.
As a Data Engineer, you will develop a new data engineering platform that leverages a new cloud architecture, and will extend or migrate our existing data pipelines to this architecture as needed. You will also assist with integrating the SQL data warehouse platform as our primary processing platform to create the curated enterprise data model for the company to leverage. You will be part of a team building the next-generation data platform and driving the adoption of new technologies and practices in existing implementations. You will be responsible for designing and implementing complex ETL pipelines in the cloud data platform and other solutions to support the rapidly growing and dynamic business demand for data, and for delivering data as a service that will have an immediate influence on day-to-day decision making.
General Duties/Responsibilities:
(May include but are not limited to)
Interfacing with business customers, gathering requirements and developing new datasets in data platform
Building and migrating complex ETL/EDI pipelines from on-premise systems to the cloud and Hadoop/Spark so the system can grow elastically (an illustrative EDI parsing sketch follows this list)
Identifying data quality issues and addressing them immediately to provide a great user experience
Extracting and combining data from various heterogeneous data sources
Designing, implementing and supporting a platform that can provide ad-hoc access to large datasets
Modelling data and metadata to support machine learning and AI
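To ground the EDI pipeline duties above, here is a toy, illustrative Python sketch that walks the segments of an X12 837 transaction and collects the claim identifier (CLM01) from each CLM segment. The sample transaction is simplified and hypothetical; a production pipeline would rely on a dedicated X12 library and handle envelopes, declared delimiters, and validation.

```python
# Illustrative sketch only: a toy parser that walks X12 segments and pulls the
# claim identifier (CLM01) from each CLM segment. Real EDI pipelines would use
# a proper X12 library and handle ISA/GS envelopes and declared delimiters.
SAMPLE_837 = "ST*837*0001~BHT*0019*00*123*20240101*1200*CH~CLM*ABC123*250***11:B:1~SE*4*0001~"


def claim_ids(x12_text: str, segment_sep: str = "~", element_sep: str = "*"):
    ids = []
    for segment in x12_text.split(segment_sep):
        elements = segment.split(element_sep)
        if elements and elements[0] == "CLM" and len(elements) > 1:
            ids.append(elements[1])  # CLM01: claim submitter's identifier
    return ids


print(claim_ids(SAMPLE_837))  # ['ABC123']
```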
Minimum Requirements:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Minimum Experience:
3+ years of relevant experience in cloud-based data engineering.
Demonstrated ability in data modeling, ETL development, EDI development, and data warehousing.
Data warehousing experience with SQL Server, Oracle, Redshift, Teradata, etc.
Experience with big data technologies (NoSQL databases, Hadoop, Hive, HBase, Pig, Spark, Elasticsearch, etc.)
Experience using Python, .NET, Java, and/or other data engineering languages
Knowledge and experience of SQL Server and SSIS.
Experience with X12 (837, 834, 278) and HL7 transactions
Education/Licensure:
Bachelors or Masters in Computer Science, Engineering, Mathematics, Statistics, or related field
Other:
Excellent communication, analytical and collaborative problem-solving skills
Requires Spark and Scala
Preferred
Healthcare domain and data experience
Healthcare EDI experience is a plus
API development experience is a plus
Industry experience as a Data Engineer or related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist) with a track record of manipulating, processing, and extracting value from large datasets.
Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
Experience building data products incrementally and integrating and managing datasets from multiple sources
Experience leading large-scale data warehousing and analytics projects, including using Azure or AWS technologies - SQL Server, Redshift, S3, EC2, Data-pipeline, Data Lake, Data Factory and other big data technologies
Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
Linux/UNIX experience, including using it to process large data sets.
Experience with Azure, AWS or GCP is a plus
Microsoft Azure Certification is a plus
Demonstrable track record dealing well with ambiguity, prioritizing needs, and delivering results in an agile, dynamic startup environment
Problem solving skills and Ability to meet deadlines are a must
Python and Java are a plus
Experience with X12 (837, 834, 278) and HL7 transactions preferred
Work Environment
The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Essential Physical Functions:
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
While performing the duties of this job, the employee is regularly required to talk or hear. The employee regularly is required to stand, walk, sit, use hand to finger, handle or feel objects, tools, or controls; and reach with hands and arms.
The employee frequently lifts and/or moves up to 10 pounds. Specific vision abilities required by this job include close vision and the ability to adjust focus.
City: Orange
State: California
Location City: Orange
Schedule: Full Time
Location State: California
Community / Marketing Title: Data Engineer
Company Profile:
Alignment Healthcare was founded with a mission to revolutionize health care with a serving heart culture. Through its unique integrated care delivery models, deep physician partnerships and use of proprietary technologies, Alignment is committed to transforming health care one person at a time.
By becoming a part of the Alignment Healthcare team, you will provide members with the quality of care they truly need and deserve. We believe that great work comes from people who are inspired to be their best. We have built a team of talented and experienced people who are passionate about transforming the lives of the seniors we serve. In this fast-growing company, you will find ample room for growth and innovation alongside the Alignment community.
EEO Employer Verbiage:
On August 17, 2021, Alignment implemented a policy requiring all new hires to receive the COVID-19 vaccine. Proof of vaccination will be required as a condition of employment subject to applicable laws concerning exemptions/accommodations. This policy is part of Alignment's ongoing efforts to ensure the safety and well-being of our staff and community, and to support public health efforts. Alignment Healthcare, LLC is proud to practice Equal Employment Opportunity and Affirmative Action. We are looking for diversity in qualified candidates for employment: Minority/Female/Disable/Protected Veteran. If you require any reasonable accommodation under the Americans with Disabilities Act (ADA) in completing the online application, interviewing, completing any pre-employment testing or otherwise participating in the employee selection process, please contact ******************.
Data Engineer Architect
Anaheim, CA jobs
Architect, Data Engineering
External Description:
Architect, Data Engineering
Alignment Healthcare was founded with a mission to revolutionize health care with a serving heart culture. Through its unique integrated care delivery models, deep physician partnerships and use of proprietary technologies, Alignment is committed to transforming health care one person at a time.
By becoming a part of the Alignment Healthcare team, you will provide members with the quality of care they truly need and deserve. We believe that great work comes from people who are inspired to be their best. We have built a team of talented and experienced people who are passionate about transforming the lives of the seniors we serve. In this fast-growing company, you will find ample room for growth and innovation alongside the Alignment community.
Position Summary:
Alignment Healthcare is a data- and technology-driven healthcare company focused on partnering with health systems, health plans, and provider groups to provide care delivery that is preventive, convenient, coordinated, and that results in improved clinical outcomes for seniors.
We are experiencing rapid growth (backed by top private equity firms), and our Data and Analytics (D&A) team is looking for the best and brightest leaders. Data drives the way we make decisions. We love our customers, and understanding them better makes it possible to provide the best clinical outcomes and care experience.
As Architect of Data Engineering, you will play a key role in architecting data engineering solutions and technology to uncover deep insights from data using big data technologies and advanced statistical analysis, processing very large data sets with cloud-based data pipelines (real-time and batch) on Microsoft Azure, and using a variety of analytic tools and visualizations to deliver actionable healthcare insights and solutions. You will also assist with integrating the SQL data warehouse platform as our primary processing platform to create the curated enterprise data model for the company to leverage.
General Duties/Responsibilities:
(May include but are not limited to)
Architects, develops, and implements Data Engineering strategy to support organizational initiatives.
Leads the architecture function of Data Engineering (Pipeline) team
Provides leadership, mentorship to internal data Engineering staff.
Lead the effort to build and migrate complex ETL/EDI pipelines from on-premise systems to the cloud and Hadoop/Spark/Kafka so the system can grow elastically (a minimal streaming sketch follows this list)
Ensuring data quality throughout all stages of acquisition and processing, including data collection, ground truth generation, normalization and transformation
Extracting and combining data from various heterogeneous data sources
Designing, implementing and supporting a platform that can provide ad-hoc access to large datasets
Modelling data and metadata to support machine learning and AI
Build statistical models, apply machine learning techniques, analyze very large data sets and construct metrics using these modeling techniques.
Guiding critical business decisions by highlighting opportunities, identifying correlations, defining experiments and figuring out cause and effect relationships.
Deploying models in operational software to power insights and actions.
Develop experimental and analytic plans for data modeling processes, including the use of strong baselines and the ability to accurately determine cause-and-effect relations
Provide leadership to Data Engineering teams.
Communicate with business partners, software vendors, and internal departments.
Participate in project meetings, providing input to project plans and providing status updates.
Responsible for the global standardization of Data Engineering processes and process improvement and efficiency
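As a hedged sketch of the streaming pipeline work described above, the snippet below uses Spark Structured Streaming to read events from a Kafka topic and maintain a running count per key. The topic, broker address, and job name are hypothetical, and the spark-sql-kafka connector package is assumed to be available.

```python
# Illustrative sketch only: read events from Kafka with Structured Streaming
# and keep a running count per key, printed to the console.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("eligibility_stream_demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "eligibility-events")   # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast both to strings and count per key.
counts = (
    events.select(
        F.col("key").cast("string").alias("key"),
        F.col("value").cast("string").alias("value"),
    )
    .groupBy("key")
    .count()
)

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```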
Minimum Requirements:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Minimum Experience:
6+ years relevant experience in Data Engineering.
3+ years of architecture experience
Experience with machine learning algorithms, including deep neural networks, natural language processing, kernel methods, dimensionality reduction, ensemble methods, hidden Markov models and graph algorithms.
Data Warehousing Experience with SQL Server, Oracle, Redshift, Teradata, etc.
Experience with Big Data Technologies (NoSQL databases, Hadoop, Hive, Hbase, Pig, Spark, Elasticsearch, Databricks etc.)
Experience with real-time data processing and API platforms.
Experience in using Python, Java and/or other data engineering languages
Experience with data visualization and presentation, turning complex analysis into insight.
Healthcare domain and data experience
Education/Licensure:
Masters in Computer Science, Engineering, Mathematics, Statistics, or related field
Other:
Demonstrated ability in data modeling, ETL development, API development, and data warehousing.
Knowledge and experience of SQL and SSIS.
Excellent communication, analytical and collaborative problem-solving skills
Preferred:
PhD or Master's in computer science, Electrical Engineering, Mathematics, Statistics, or related field
Industry experience as a Data Engineering Leader or related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Engineer) with a track record of manipulating, processing, and extracting value from large datasets.
Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
Experience building data products incrementally and integrating and managing datasets from multiple sources
Experience leading large-scale data warehousing and analytics projects, including using Azure or AWS technologies - SQL Server, Redshift, S3, EC2, Data-pipeline, Data Lake, Data Factory and other big data technologies
Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
Linux/UNIX experience, including using it to process large data sets.
Experience with Azure, AWS or GCP is a plus
Microsoft Azure data architecture Certification is a plus
Demonstrable track record dealing well with ambiguity, prioritizing needs, and delivering results in an agile, dynamic startup environment
Knowledge of scripting for automation (e.g. Python, Perl, Scala etc.)
Experience with NoSQL, Spark, Hadoop, Elasticsearch etc.
Work Environment
The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Essential Physical Functions:
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
While performing the duties of this job, the employee is regularly required to talk or hear. The employee regularly is required to stand, walk, sit, use hand to finger, handle or feel objects, tools, or controls; and reach with hands and arms.
The employee frequently lifts and/or moves up to 10 pounds. Specific vision abilities required by this job include close vision and the ability to adjust focus.
City: Anaheim
State: California
Location City: Anaheim
Schedule: Full Time
Location State: California
Community / Marketing Title: Data Engineer Architect
Company Profile:
Alignment Healthcare was founded with a mission to revolutionize health care with a serving heart culture. Through its unique integrated care delivery models, deep physician partnerships and use of proprietary technologies, Alignment is committed to transforming health care one person at a time.
By becoming a part of the Alignment Healthcare team, you will provide members with the quality of care they truly need and deserve. We believe that great work comes from people who are inspired to be their best. We have built a team of talented and experienced people who are passionate about transforming the lives of the seniors we serve. In this fast-growing company, you will find ample room for growth and innovation alongside the Alignment community.
EEO Employer Verbiage:
On August 17, 2021, Alignment implemented a policy requiring all new hires to receive the COVID-19 vaccine. Proof of vaccination will be required as a condition of employment subject to applicable laws concerning exemptions/accommodations. This policy is part of Alignment's ongoing efforts to ensure the safety and well-being of our staff and community, and to support public health efforts. Alignment Healthcare, LLC is proud to practice Equal Employment Opportunity and Affirmative Action. We are looking for diversity in qualified candidates for employment: Minority/Female/Disable/Protected Veteran. If you require any reasonable accommodation under the Americans with Disabilities Act (ADA) in completing the online application, interviewing, completing any pre-employment testing or otherwise participating in the employee selection process, please contact ******************.
Data Engineer Principal
Ann, MN jobs
HealthPartners is hiring a Data Engineer Principal. Our mission is to provide simple and affordable healthcare. HealthPartners teams use data to improve patient and member experience, improve health, and reduce the per capita cost of health care. We are seeking a dynamic and technically skilled data engineer to lead the development and delivery of training programs that empower our 275-person department, including 21 leaders, to thrive in a modern data environment. This role bridges the gap between technical complexity and team capability, ensuring our workforce is confident and competent in using tools like Databricks and Python, while also understanding data governance principles. HealthPartners data engineers are responsible for building, managing, and optimizing the data pipelines that facilitate data movement in service of these goals by implementing and testing methods (or building systems) that improve data reliability and quality. They champion and embrace leading practices in the field, and develop processes to effectively store, manage, and deliver data.
ACCOUNTABILITIES:
* All team members must champion and model our values of partnership, curiosity, compassion, integrity, and excellence, and must contribute to a culture of continuous learning.
* Work with stakeholders, data scientists and analysts to frame problems, clean and integrate data, and determine the best way to provision that data on demand
* Collaborate with other engineers to design technology solutions that achieve measurable results at scale
* Provide technical thought leadership for efficient data pipeline processes, warehouse architecture and business intelligence functions.
* Assist in the development of short, medium, and long-term plans to achieve strategic objectives.
* Guide, mentor, influence, and adopt a cloud-first modern data architectural direction, and consistently adopt the associated standards and best practices
* Develop, maintain, and distribute up-to-date training documentation and learning resources.
* Design and facilitate engaging learning labs, combining presentations with hands-on exercises.
* Create and deliver advanced training tailored for leadership, enabling them to support their teams effectively in a data-driven environment.
* Act as a technical translator and coach, helping team members understand and apply concepts related to data lakes, data governance, and data engineering.
* Collaborate with SMEs, engineers, and leadership to identify learning needs and close knowledge gaps.
* Serve as a solutions architect when needed, guiding technical setup and best practices in the absence of deep technical leadership.
* Champion a culture of continuous learning and technical enablement across the department.
REQUIRED SKILLS/ QUALIFICATIONS:
Bachelor's degree in computer science, data or social science, operations research, statistics, applied mathematics, econometrics, or a related quantitative field AND 4+ years of experience in business analytics, data science, software development, data modeling, and/or data engineering work; OR a Master's degree in Computer Science, Math, or Software is acceptable
* 10+ years of programming experience with command of at least two of the following programming languages: SQL, Python, Java, R, or Spark
* Expert proficiency in SQL; experience with Oracle, PostgreSQL, MySQL, or Microsoft SQL Server
* 10+ years of experience using a combination of tools such as Azure Data Factory, Synapse, Data Explorer, App Insights, Power BI, and Databricks
* 3+ years of experience using Azure CosmosDB and Azure Data Lake Storage
* Must be motivated, self-driven, curious, and creative
* Proven ability to lead projects and teams with minimal guidance
* Must be skilled communicator, and demonstrate an ability to work with end users and business leaders
* Demonstrate the ability to support and complement the work of a diverse development and/or operations team
* Lead a collaborative, cross-functional team; actively participate in sprint reviews
* Expertise and a minimum of five years' experience in the following domains: data management, software engineering, and I&O (infrastructure and operations)
PREFERRED QUALIFICATIONS:
* Knowledge of health care operations
* Exposure to agile/scrum
* Ability to work in a hybrid cloud environment consisting of on premise and public cloud infrastructure. An ideal candidate will have experience with one or more of the following skill sets
* Expert in Relational databases like Oracle, SQL server
* Expert in Optimizing and tuning SQL/Oracle queries, stored procedures and triggers
* Expert in Python (NumPy, pandas, Matplotlib, etc.) and Jupyter notebooks for exploratory data analysis, machine learning, and process automation (a short illustrative exercise follows this list)
* Expert in areas of CI/CD, Continuous testing, and site reliability engineering.
* Expert in Microsoft Azure applications such as Azure Data Factory, Synapse, Purview, Databricks /Spark, Power BI, PowerApps.
* Expert in event streaming tools like NiFi, Kafka and Flink
* Expert in data processing tools like Apache Sqoop, Spark, and Hive
* Expert in Document or NoSQL datastores, particularly MongoDB
* Expert in Power BI data models using advanced Power Query and DAX
* Expert in AI/ML Ops
* Interest and desire to contribute to emerging practices around DataOps (CI/CD, IaC, configuration management, etc.)
* Collaborate effectively with product management, program management, engineers, and stakeholders.
* Excellent analytical and critical thinking skills
* Ability to influence without authority and thrive in an ambiguous environment.
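As an illustration of the kind of hands-on exercise a learning lab might include for the exploratory-analysis item above, the sketch below profiles a small extract with pandas and plots a monthly trend with matplotlib. The file path and column names are hypothetical placeholders.

```python
# Illustrative sketch only: a short exploratory-analysis exercise over a small
# sample extract. The CSV path and column names are placeholders.
import matplotlib.pyplot as plt
import pandas as pd

# Load a small sample extract (placeholder path).
visits = pd.read_csv("sample_member_visits.csv", parse_dates=["visit_date"])

# Quick profile: shape, null counts, and basic statistics.
print(visits.shape)
print(visits.isna().sum())
print(visits.describe(include="all"))

# Simple trend: visits per month.
monthly = visits.groupby(visits["visit_date"].dt.to_period("M")).size()
monthly.plot(kind="bar", title="Visits per month (illustrative)")
plt.tight_layout()
plt.savefig("visits_per_month.png")
```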
Data Engineer
Chicago, IL jobs
Looking for more than just an assignment? We're looking for you! This isn't just another assignment, but a real opportunity and a challenge for the right person. LRS Consulting Services is seeking a Data Engineer for a Direct Hire opportunity with our client in downtown Chicago, IL! LRS Consulting Services has been delivering the highest quality consultants to our clients since 1979. We've built a solid reputation for dealing with our clients and our consultants with honesty, integrity, and respect. We work hard every day to maintain that reputation, and we're very interested in candidates who can help us. If you're that candidate, this opportunity is made for you!
Job Title: Data Engineer
Location: Chicago, IL (4 days on-site)
We are seeking an experienced candidate with a strong background in data engineering to oversee and elevate our data architecture, database management, data quality, and governance practices. This role requires a highly technical professional with a strategic mindset who can manage complex database environments, lead data governance initiatives, streamline technical debt, and integrate multiple data systems.
You will play a critical role in shaping the future of our data infrastructure, ensuring the accuracy, integrity, and usability of our core data assets. This is an excellent opportunity for a technically proficient and strategically minded database professional to lead impactful initiatives and work across multiple data domains.
Primary Responsibilities:
Design, build, and maintain scalable data pipelines and ETL/ELT processes across various platforms (including SQL Server, Fabric, Databricks, Salesforce) to support analytics and operational needs.
Lead data cleanup initiatives with a focus on maintaining and improving data quality, including establishing data governance frameworks and guardrails and prioritizing historical data cleanup efforts.
Support master data management initiatives by enforcing data standards and consistency across data sources.
Develop and own database schemas across a variety of platforms (incl. SQL Server, Fabric, Databricks, Salesforce).
Create and maintain comprehensive data dictionaries and metadata documentation that clearly define fields, data types, relationships, and business context (a minimal sketch follows this list).
Collaborate with stakeholders across business enablement and front office functions (i.e., bankers, valuation experts) to gather requirements and define roadmap.
Provide strategic guidance on database design to ensure scalability, efficiency, and alignment with business objectives.
Identify and address technical debt in data infrastructure by modernizing the tech stack and applying best practices for performance, scalability, and maintainability.
Push the firm to stay on top of tech trends by experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the data team
Collaborate with development teams to maintain, enhance, and optimize tools for database management and operational efficiency.
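Purely as an illustration of the data-dictionary responsibility above, the sketch below derives a simple field-level dictionary from a pandas DataFrame; the column names are hypothetical, and a real implementation would profile extracts from SQL Server, Fabric, Databricks, or Salesforce.

    # Illustrative sketch: generate a simple data dictionary from a DataFrame.
    # The example frame and its columns are hypothetical placeholders.
    import pandas as pd

    def build_data_dictionary(df: pd.DataFrame) -> pd.DataFrame:
        """Return one row per column with type, completeness, and sample values."""
        rows = []
        for col in df.columns:
            series = df[col]
            rows.append({
                "field": col,
                "dtype": str(series.dtype),
                "null_pct": round(series.isna().mean() * 100, 2),
                "distinct_values": series.nunique(dropna=True),
                "sample_values": series.dropna().unique()[:3].tolist(),
            })
        return pd.DataFrame(rows)

    if __name__ == "__main__":
        example = pd.DataFrame({
            "deal_id": [1, 2, 3],
            "counterparty": ["Acme Credit", "Globex Capital", None],
            "notional_usd": [1_000_000.0, 250_000.0, 75_000.0],
        })
        print(build_data_dictionary(example))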
Qualifications:
10+ years of professional experience in data engineering, database administration, or related roles with a focus on building scalable data solutions.
Proven expertise in designing, implementing, and optimizing data pipelines, data warehouses, and data integration processes.
Experience integrating disparate data systems and managing complex environments with multiple data sources and databases.
Experience with modern data platforms and technologies such as Fabric, Databricks, Snowflake, or similar cloud-based data solutions.
Expert-level SQL and relational database design skills, including performance tuning, schema design, and database optimization
Proficiency in Python or other scripting languages for data automation, transformation, and workflow orchestration.
Proven ability to work with structured, semi-structured, and unstructured data.
Solid understanding of master data management (MDM) concepts and practical experience implementing data governance frameworks.
Demonstrated ability to collaborate effectively across technical teams, including application developers, IT infrastructure, and data consumers.
Ability to maintain thorough data documentation and create scalable, maintainable database environments.
Strong skills in maintaining detailed metadata, data dictionaries, and documentation to ensure data assets are understandable, accurate, and maintainable.
Commitment to best practices for data quality, scalability, and maintainability in a fast-moving technical environment.
The ideal candidate will have exposure to the financial industry, preferably private credit & equity markets, and related data platforms (e.g., FactSet, Bloomberg, CapIQ, PitchBook)
The base range for this salaried position is $150,000 - $175,000, depending on experience; pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training.
LRS is an equal opportunity employer. Applicants for employment will receive consideration without unlawful discrimination based on race, color, religion, creed, national origin, sex, age, disability, marital status, gender identity, domestic partner status, sexual orientation, genetic information, citizenship status or protected veteran status.
#LI-DS1
Data & BI Engineer
Springfield, MO jobs
LRS is seeking a skilled Data & BI Engineer to design, build, and maintain scalable data solutions that power analytics and reporting across the organization. This hybrid role combines data architecture and engineering (integration, pipelines, modeling) with BI development (dashboards, visualizations, and insights). The ideal candidate is comfortable working across the full data stack and collaborating with business stakeholders to deliver actionable intelligence.
This is an in-office position based out of our headquarters in Springfield, Illinois.
Requirements
5+ years of experience in data engineering and data analytics.
Proficiency in SQL Server and T-SQL.
Experience with data modeling (star/snowflake schemas) and ETL/ELT processes.
Experience with BI tools.
Strong understanding of data governance, security, and performance optimization.
Excellent communication and stakeholder engagement skills.
The following will make you a stronger candidate:
Experience working with data warehouses.
Experience with Power BI, DAX, and Power Query.
Familiarity with Microsoft Fabric.
Key Responsibilities
Design and implement data pipelines using modern ETL/ELT tools.
Develop and maintain semantic data models optimized for reporting and analytics (see the sketch after this list).
Build compelling Power BI dashboards and reports with functional, user-friendly visuals.
Collaborate with business units to understand data needs and translate them into technical solutions.
Ensure data quality, integrity, and governance across systems.
Optimize performance of data solutions and BI assets.
Support data integration across cloud and on-premises systems.
Document architecture, data flows, and reporting logic.
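To make the semantic-modeling responsibility above more concrete, here is a hedged sketch of deriving a star schema (one dimension plus one fact table) from a flat extract with pandas; the table and column names are invented for illustration and are not tied to any LRS system.

    # Hedged sketch: split a flat extract into a customer dimension and an orders fact table.
    import pandas as pd

    flat = pd.DataFrame({
        "order_id": [101, 102, 103],
        "customer_name": ["Acme", "Acme", "Globex"],
        "customer_region": ["Midwest", "Midwest", "South"],
        "amount": [250.0, 75.5, 410.0],
    })

    # Dimension: one row per distinct customer, with a surrogate key.
    dim_customer = (
        flat[["customer_name", "customer_region"]]
        .drop_duplicates()
        .reset_index(drop=True)
    )
    dim_customer["customer_key"] = dim_customer.index + 1

    # Fact: measures plus the foreign key to the dimension.
    fact_orders = flat.merge(dim_customer, on=["customer_name", "customer_region"])
    fact_orders = fact_orders[["order_id", "customer_key", "amount"]]

    print(dim_customer)
    print(fact_orders)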
Success Factors
The successful candidate will demonstrate expertise across the data stack, delivering reliable, high-quality data solutions and actionable insights. Success in this role will be measured by your ability to collaborate with business stakeholders, optimize data-driven processes, and drive impactful analytics initiatives.
Organization Structure
The LRS IT team consists of a Chief Information Officer, Director of IT, Director of Applications, Director of Information Security, and teams for networking, infrastructure, cloud, communications, end-user services, and applications. The team is based in Springfield, IL and manages the global operations at LRS. You will report to the Chief Information Officer.
LRS is an equal opportunity employer. Applicants for employment will receive consideration without unlawful discrimination based on race, color, religion, creed, national origin, sex, age, disability, marital status, gender identity, domestic partner status, sexual orientation, genetic information, citizenship status or protected veteran status.
Salary Range: $90,000-$130,000. This salary range represents the low and high end for this position. The salary will vary depending on factors including experience and skills. The range listed is just one component of LRS' total employee compensation, as we have a generous benefits package.
Corporate Data - Data Engineer 135-2017
Tulsa, OK jobs
The Data Engineer will be responsible for expanding, optimizing and monitoring our data and data pipeline architecture, as well as optimizing data flow and collection across organizational teams. The Data Engineer will support our software engineers, database architects and data analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
KEY RESPONSIBILITIES:
Create and maintain optimal data pipeline architecture to support our next generation of products and data initiatives.
Assemble large, complex data sets that meet functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
Develop ETL solutions using SSIS and other standardized data management tools.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Perform other duties as required.
QUALIFICATIONS:
Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
Strong project management and organizational skills.
Ability to work independently, handle multiple tasks and projects simultaneously.
Successful completion of Health Care Sanctions background check.
EDUCATION/EXPERIENCE:
College degree or equivalent experience required.
Project management skills preferred.
Willingness to work in a high-tech, continually evolving, innovative environment.
Data Platform Engineer
Brentwood, TN jobs
Data Platform Engineer The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.
Responsibilities
* Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
* Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
* Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark (a brief sketch follows this list).
* Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
* Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
* Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
* Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
* Ensure data quality, governance, and security across the data lifecycle.
* Collaborate with product managers by estimating technical tasks and deliverables.
* Uphold the mission and values of Monogram Health in all aspects of your role and activities.
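As a rough, assumption-laden illustration of the Databricks/PySpark work described above, the sketch below runs a small aggregation job; the input path, column names, and Parquet output are placeholders (a Databricks job would more likely read from and write to Delta tables), not an actual Monogram Health pipeline.

    # Illustrative PySpark aggregation of the kind that might run as a Databricks notebook or job.
    # Paths and columns (service_date, provider_id, amount) are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_daily_rollup").getOrCreate()

    claims = spark.read.option("header", True).csv("/tmp/raw/claims.csv")

    daily_totals = (
        claims
        .withColumn("amount", F.col("amount").cast("double"))
        .groupBy("service_date", "provider_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("claim_count"))
    )

    # Parquet stands in here for Delta output on a real Databricks cluster.
    daily_totals.write.mode("overwrite").parquet("/tmp/curated/claims_daily")

    spark.stop()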
Position Requirements
* A bachelor's degree in computer science, data science, software engineering or related field.
* Minimum of five (5) years of designing and hands-on development of cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
* Expert-level knowledge of Python or other scripting languages required.
* Proficiency in SQL and other data query languages.
* Understanding of data modeling and schema design principles
* Ability to work with large datasets and perform data analysis
* Designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
* Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
* Thorough understanding of Azure Cloud Infrastructure offerings.
* Demonstrated problem-solving and troubleshooting skills.
* Team player with demonstrated written and communication skills.
Benefits
* Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
* Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
* Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
* Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts
Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home.
Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
Data Scientist
Berkeley, CA jobs
LifeLong Medical Care has an exciting opportunity for a Data Scientist to provide programming support to build analytic applications to support business decision making in the organization. This is a part time, 30 hour/week, benefit eligible position.
LifeLong Medical Care is a multi-site, Federally Qualified Health Center (FQHC) with a rich history of providing innovative healthcare and social services to a wonderfully diverse patient community. Our patient-centered health home is a dynamic place to work, practice, and grow. We have over 15 primary care health centers and deliver integrated services including psychosocial, referrals, chronic disease management, dental, health education, home visits, and much, much more.
Benefits
Compensation: $71k - $75k/year. We offer excellent benefits including: medical, dental, vision (including dependent and domestic partner coverage), generous leave benefits including ten paid holidays, Flexible Spending Accounts, 403(b) retirement savings plan.
Responsibilities
* Under the supervision of the Manager of Analytics, the Data Scientist is a senior and key member of the data analytics team, developing data insights through reporting and providing assistance to all data reporting tool users at LifeLong Medical Care, including documentation of report requirements and report implementations.
* The senior analyst is the core content expert for designated subjects as assigned by the Manager of Analytics or designee
* Maintains integrity of the data warehouse in their content areas or as assigned
* Develops and maintains internal reporting services platform using SSRS and Tableau. Supports Data Analysts and Junior Analysts in report development.
* Provides analytic support and data insights to one or multiple departments and develops a variety of complex ad hoc, production and/or trend reports to support business decisions and operational processes for internal and external clients.
* Collaboratively develops data strategy for core content area
* Arranges project requirements in programming sequence by analyzing requirements; preparing a workflow chart and diagram using knowledge of computer capabilities, subject matter, programming language, and logic.
* Communicates with clients and key stakeholders to develop and create specifications for analytical applications.
* Develops and maintains applications and databases by evaluating client needs; analyzing requirements; developing software systems.
* Performs additional duties in support of the team and immediate reporting need of other departments as assigned by supervisor.
* Protects operations by keeping information confidential and complying with HIPAA requirements.
Qualifications
* Commitment to the provision of primary care services for the underserved with demonstrated ability and sensitivity in working with a variety of people from low-income populations, with diverse educational, lifestyle, ethnic and cultural origins.
* Be creative and mature with a "can do," proactive attitude.
* Ability to effectively support, motivate and supervise staff, encourage and nurture development and growth, to build a strong and productive team.
* Strong organizational, administrative, multi-tasking, prioritization and problem-solving skills.
* Ability to work effectively under pressure in a positive, friendly manner and to be flexible and adaptive to change.
* Ability to take initiative, work independently and make sound judgments within established guidelines; understand and apply oral and written instructions; establish and maintain effective working relations with staff, clinical providers, managers and external agencies or organizations.
* Excellent interpersonal, verbal, and written skills and ability to effectively work with people from diverse backgrounds and be culturally sensitive.
* Work in a team-oriented environment with a number of professionals with different work styles and support needs.
* Conduct oneself in internal and external settings in a way that reflects positively on LifeLong Medical Care as an organization of professional, confident and sensitive staff.
* Ability to continuously scan the environment, identifying opportunities for improvement and intersections with other departments of LifeLong Medical Care and partner organizations.
Job Requirements
* Bachelor's degree (Master's preferred) in Computer Science or a related field, or an equivalent combination of education and/or experience.
* Minimum 10 years of experience in programming and data analysis involving duties listed above.
* Experience in Healthcare related field and/or data reporting related work and data visualization development
* Excellent skills in SQL scripting and knowledge of database development.
* Basic understanding of SSIS
* Proficiency in Microsoft Office, including Excel, PowerPoint, and Word.
Job Preferences
* Community Health Center experience.
* Microsoft Certified Solutions Associate (MCSA) in SQL database development.
Data Scientist
New Jersey jobs
The primary purpose of this position is to serve as the data scientist with a split portfolio between the Atlantic City office and the Austin chemistry group.
Essential Duties and Responsibilities:
Performs data analytics, specifically data clean-up, data processing, predictive modeling, chemometric statistical modeling and analysis, multivariate data analysis, machine learning, and/or data mining, as related to scientific data (a brief multivariate-analysis sketch follows this list).
Applies technical skills to plan and execute assigned project work including development of computational models, programming of detection algorithms, and machine learning.
Maintains operational capabilities of computation assets as needed by project requirements.
Leads meetings with company clients by preparing and presenting meeting materials.
Appropriately annotates project developed computer code through comments and user manuals.
Presents technical results through the drafting of technical reports.
Presents experimental results and recommended actions at internal project meetings.
Supports business development efforts as needed by drafting technical sections of proposals, providing proposal review, assessing levels of effort required to complete proposed work, and brainstorming technical solutions to client problems.
Other duties as assigned.
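As a hedged illustration of the multivariate and chemometric analysis mentioned in the duties above, the sketch below runs principal component analysis on standardized synthetic "spectra" with scikit-learn; the data are random placeholders, not real instrument output.

    # Minimal multivariate-analysis sketch: PCA on synthetic spectra-like data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(50, 200))   # 50 samples x 200 "wavelengths" (synthetic)

    X = StandardScaler().fit_transform(spectra)
    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)

    print("explained variance ratio:", pca.explained_variance_ratio_)
    print("first sample scores:", scores[0])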
Required Knowledge, Skills & Abilities:
Ability to plan sequence of experiments to answer complicated technical questions
Ability to lead group of co-workers in execution of a task
Software programming proficiency with Java, C, R, Python, and/or MATLAB
Working knowledge of statistics as it applies to scientific data
Ability to communicate technical information to non-technical audiences
Team player with a positive attitude
Department of Homeland Security Suitability
Department of Defense Secret Clearance
Working knowledge of software development practices including Agile development and Git version control
Sufficient business knowledge to support proposal efforts
Education/Experience:
Incumbent professional should have a Ph.D. or master's degree in a physical science (preferably chemistry), statistics, or data science and significant experience in computer programming, computational modeling, or software development.
Certificates and Licenses:
None
Clearance:
The ability to obtain a Secret clearance and Department of Homeland Security suitability is required for this position.
Supervisory Responsibilities:
The incumbent professional may oversee junior level staff members performing tasks.
Working Conditions/ Equipment:
The incumbent professional is expected to work and/or be available during regular business hours. He/she should also generally be available via e-mail or phone during non-business hours as needed to address critical issues or emergencies. He/she may be required to travel on behalf of the company up to 25%.
The above job description is not intended to be an all-inclusive list of duties and standards of the position. Incumbents will follow any other instructions and perform any other related duties, as assigned by their supervisor.
Principal Data Engineer - ML Platforms
Arlington, VA jobs
Altarum | Data & AI Center of Excellence (CoE)
Altarum is building the future of data and AI infrastructure for public health - and we're looking for a Principal Data Engineer - ML Platforms to help lead the way. In this cornerstone role, you will design, build, and operationalize the modern data and ML platform capabilities that power analytics, evaluation, AI modeling, and interoperability across all Altarum divisions.
If you want to architect impactful systems, enable data science at scale, and help ensure public health and Medicaid programs operate with secure, explainable, and trustworthy AI - this role is for you.
What You'll Work On
This role blends deep engineering with applied ML enablement:
ML Platform Engineering: modern lakehouse architecture, pipelines, MLOps lifecycle
Applied ML enablement: risk scoring, forecasting, Medicaid analytics
NLP/Generative AI support: RAG, vectorization, health communications
Causal ML operationalization: evaluation modeling workflows
Responsible/Trusted AI engineering: model cards, fairness, compliance
Your work ensures that Altarum's public health and Medicaid programs run on secure, scalable, reusable, and explainable data and AI infrastructure.
What You'll Do
Platform Architecture & Delivery
Design and operate modern, cloud-agnostic lakehouse architecture using object storage, SQL/ELT engines, and dbt.
Build CI/CD pipelines for data, dbt, and model delivery (GitHub Actions, GitLab, Azure DevOps).
Implement MLOps systems: MLflow (or equivalent), feature stores, model registry, drift detection, automated testing (a brief tracking sketch follows this section).
Engineer solutions in AWS and AWS GovCloud today, with portability to Azure Gov or GCP.
Use Infrastructure-as-Code (Terraform, CloudFormation, Bicep) to automate secure deployments.
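To make the MLOps item above concrete, here is a minimal sketch of experiment tracking with MLflow's Python API around a toy scikit-learn model; the experiment name, parameters, and model are illustrative assumptions, not Altarum's actual registry setup.

    # Hedged sketch: track a toy model run with MLflow (experiment name and params are placeholders).
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    mlflow.set_experiment("risk-scoring-sketch")
    with mlflow.start_run():
        model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("accuracy", acc)
        # Logged models can later be promoted through a model registry.
        mlflow.sklearn.log_model(model, "model")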
Pipelines & Interoperability
Build scalable ingestion and normalization pipelines for healthcare and public health datasets (a minimal FHIR flattening sketch follows this section), including:
FHIR R4 / US Core (strongly preferred)
HL7 v2 (strongly preferred)
Medicaid/Medicare claims & encounters (strongly preferred)
SDOH & geospatial data (preferred)
Survey, mixed-methods, and qualitative data
Create reusable connectors, dbt packages, and data contracts for cross-division use.
Publish clean, conformed, metrics-ready tables for Analytics Engineering and BI teams.
Support Population Health in turning evaluation and statistical models into pipelines.
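As a minimal illustration of FHIR-to-analytics flattening, the sketch below turns a hand-written FHIR R4 Patient resource into a single tabular row; a production pipeline would use a proper FHIR library, US Core profiles, and real validation rather than this manual mapping.

    # Hedged sketch: flatten a (hand-written, fictional) FHIR R4 Patient resource into a row.
    import json
    import pandas as pd

    patient_json = """
    {
      "resourceType": "Patient",
      "id": "example-1",
      "gender": "female",
      "birthDate": "1980-04-01",
      "name": [{"family": "Doe", "given": ["Jane"]}]
    }
    """

    resource = json.loads(patient_json)
    name = resource.get("name", [{}])[0]
    row = {
        "patient_id": resource.get("id"),
        "gender": resource.get("gender"),
        "birth_date": resource.get("birthDate"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
    }
    print(pd.DataFrame([row]))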
Data Quality, Reliability & Cost Management
Define SLOs and alerting; instrument lineage & metadata; ensure ≥95% of data tests pass.
Perform performance and cost tuning (partitioning, storage tiers, autoscaling) with guardrails and dashboards.
Applied ML Enablement
Build production-grade pipelines for risk prediction, forecasting, cost/utilization models, and burden estimation.
Develop ML-ready feature engineering workflows and support time-series/outbreak detection models.
Integrate ML assets into standardized deployment workflows.
Generative AI Enablement
Build ingestion and vectorization pipelines for surveys, interviews, and unstructured text.
Support RAG systems for synthesis, evaluation, and public health guidance (a retrieval sketch follows this section).
Enable Palladian Partners with secure, controlled-generation environments.
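To sketch the retrieval step behind a RAG workflow, the example below uses TF-IDF and cosine similarity as a stand-in for an embedding model and vector store; the documents and query are invented, and a controlled-generation model would consume the retrieved passage downstream.

    # Hedged sketch of RAG-style retrieval using TF-IDF in place of embeddings and a vector store.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    documents = [
        "Survey respondents reported barriers to transportation and food access.",
        "Interview notes describe medication adherence challenges for dialysis patients.",
        "Program guidance covers outreach protocols for rural health departments.",
    ]
    query = "What barriers did survey respondents mention?"

    vectorizer = TfidfVectorizer().fit(documents + [query])
    doc_vectors = vectorizer.transform(documents)
    query_vector = vectorizer.transform([query])

    scores = cosine_similarity(query_vector, doc_vectors)[0]
    best = int(scores.argmax())
    # The top passage would be placed into the prompt of a controlled-generation model.
    print(f"Top passage (score {scores[best]:.2f}): {documents[best]}")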
Causal ML & Evaluation Engineering
Translate R/Stata/SAS evaluation code into reusable pipelines.
Build templates for causal inference workflows (DID, AIPW, CEM, synthetic controls); a minimal DID sketch follows this section.
Support operationalization of ARA's applied research methods at scale.
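As a hedged template for one of the causal workflows named above, the sketch below estimates a difference-in-differences (DID) effect with statsmodels on synthetic panel data; the simulated effect size, variable names, and robust-error choice are placeholders rather than an ARA method.

    # Illustrative DID template: the coefficient on treated:post is the DID estimate.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 400
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),
        "post": rng.integers(0, 2, n),
    })
    # Simulate an outcome with a true DID effect of 2.0 in the treated*post cell.
    df["outcome"] = (
        5 + 1.5 * df["treated"] + 0.8 * df["post"]
        + 2.0 * df["treated"] * df["post"]
        + rng.normal(scale=1.0, size=n)
    )

    model = smf.ols("outcome ~ treated * post", data=df).fit(cov_type="HC1")
    print(model.summary().tables[1])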
Responsible AI, Security & Compliance
Implement Model Card Protocol (MCP) and fairness/explainability tooling (SHAP, LIME).
Ensure compliance with HIPAA, 42 CFR Part 2, IRB/DUA constraints, and NIST AI RMF standards.
Enforce privacy-by-design: tokenization, encryption, least-privilege IAM, and VPC isolation.
Reuse, Shared-Services, and Enablement
Develop runbooks, architecture diagrams, repo templates, and accelerator code.
Pair with data scientists, analysts, and SMEs to build organizational capability.
Provide technical guidance for proposals and client engagements.
Your First 90 Days - You will make a meaningful impact fast. Expected outcomes include:
Platform skeleton operational: repo templates, CI/CD, dbt project, MLflow registry, tests.
Two pipelines in production (e.g., FHIR → analytics and claims normalization).
One end-to-end CoE lighthouse MVP delivered (ingestion → model → metrics → BI).
Completed playbooks for GovCloud deployment, identity/secrets, rollback, and cost control.
Success Metrics (KPIs)
Pipeline reliability meeting SLA/SLO targets.
≥95% data tests passing across pipelines.
MVP dataset onboarding ≤ 4 weeks.
Reuse of platform assets across ≥3 divisions.
Cost optimization and budget adherence.
What You'll Bring
7-10+ years in data engineering, ML platform engineering, or cloud data architecture.
Expert in Python, SQL, dbt, and orchestration tools (Airflow, Glue, Step Functions).
Deep experience with AWS + AWS GovCloud.
CI/CD and IaC experience (Terraform, CloudFormation).
Familiarity with MLOps tools (MLflow, SageMaker, Azure ML, Vertex AI).
Ability to operate in regulated environments (HIPAA, 42 CFR Part 2, IRB).
Preferred:
Experience with FHIR, HL7, Medicaid/Medicare claims, and/or SDOH datasets.
Databricks, Snowflake, Redshift, Synapse.
Event streaming (Kafka, Kinesis, Event Hubs).
Feature store experience.
Observability tooling (Grafana, Prometheus, OpenTelemetry).
Experience optimizing BI datasets for Power BI.
Logistical Requirements
At this time, we will only accept candidates who are presently eligible to work in the United States and will not require sponsorship.
Our organization requires that all work, for the duration of your employment, must be completed in the continental U.S. unless required by contract.
If you're near one of our offices (Arlington, VA; Silver Spring, MD; or Novi, MI), you'll join us in person one day every other month (6 times per year) for a fun, purpose-driven Collaboration Day. These days are filled with creative energy, meaningful connection, and team brainstorming!
Must be able to work during Eastern Time unless approved by your manager.
Employees working remotely must have a dedicated, ergonomically appropriate workspace free from distractions with a mobile device that allows for productive and efficient conduct of business.
Altarum is a nonprofit organization focused on improving the health of individuals with fewer financial resources and populations disenfranchised by the health care system. We work primarily on behalf of federal and state governments to design and implement solutions that achieve measurable results. We combine our expertise in public health and health care delivery with technology development and implementation, practice transformation, training and technical assistance, quality improvement, data analytics, and applied research and evaluation. Our innovative solutions and proven processes lead to better value and health for all.
Altarum is an equal opportunity employer that provides employment and opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, or any other characteristic protected by applicable law.
Client Integration Data Engineer
Bozeman, MT jobs
Hart is seeking a motivated data engineer with 3 years of experience to join our Delivery team. This role focuses on developing, maintaining, and improving ETL processes that support EHR migration, archival, and interoperability solutions. The ideal candidate has hands-on experience in healthcare data engineering and is eager to expand their expertise in cloud, security, and compliance frameworks.
Responsibilities
Data Integration: Develop and maintain repeatable, yet customizable, processes for ingesting, transforming, and normalizing diverse structured and unstructured healthcare data from various client sources (e.g., EHRs, HL7, FHIR) using the Hart platform.
Standard Configuration & Validation: Lead the rapid configuration of our standardized data platform for new clients. Implement robust data quality checks and validation rules using code and established frameworks to ensure data accuracy, completeness, and consistency post-integration (a minimal sketch follows this list).
Collaboration & Client Alignment: Work closely with Project Managers, Solution Architects, and external client stakeholders within the Delivery team to translate client data requirements and unique integration challenges into efficient technical solutions.
Documentation & Knowledge Transfer: Maintain thorough and clear documentation for client-specific data mappings, transformation logic, and system configurations to ensure successful handoff and support by the broader operations team.
Data Security & Compliance: Strictly adhere to data protection standards to ensure compliance with healthcare regulations, including HIPAA, SOC 2, GDPR, and HITRUST frameworks across all client integration projects.
Support & Troubleshooting: Rapidly diagnose, troubleshoot, and resolve complex data-related issues that arise during the client integration phase, ensuring minimal disruption to critical project timelines.
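As an illustration of the validation work described above, here is a minimal pandas sketch of a few post-integration data quality rules; the column names and rules are hypothetical and do not reflect the Hart platform's actual validation framework.

    # Hedged sketch: simple data quality checks on an integrated patient extract.
    import pandas as pd

    def validate_patients(df: pd.DataFrame) -> list[str]:
        """Return human-readable failures for a few illustrative validation rules."""
        failures = []
        if df["mrn"].isna().any():
            failures.append("Null MRN values found")
        if df["mrn"].duplicated().any():
            failures.append("Duplicate MRNs found")
        bad_dob = pd.to_datetime(df["birth_date"], errors="coerce").isna()
        if bad_dob.any():
            failures.append(f"{int(bad_dob.sum())} unparseable birth date(s)")
        return failures

    if __name__ == "__main__":
        sample = pd.DataFrame({
            "mrn": ["A001", "A002", None],
            "birth_date": ["1980-01-01", "not-a-date", "1975-06-30"],
        })
        print(validate_patients(sample) or "All checks passed")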
Requirements
Qualifications
Bachelor's degree in computer science, information systems, or a related field. Relevant certifications and additional education are a plus.
Strong experience (3+ years) as a Data Engineer, Implementation Consultant, or similar role, with a focus on client delivery and modern ETL/ELT development.
Solid understanding of data transformation concepts, data modeling, and database design principles.
Proficiency in SQL programming, including query development, data manipulation, and validation. Ability to write efficient SQL to support integration and transformation tasks.
Strong expertise (3+ years) with Python for data engineering, specifically utilizing libraries and frameworks such as PySpark or the Pandas ecosystem for building scalable data transformation pipelines.
Experience (3+ years) with distributed data/computing tools (like Spark/PySpark) and cloud-based data services (AWS, Azure, or GCP).
Familiarity with healthcare data standards (e.g., HL7, FHIR) and healthcare-related regulatory requirements (e.g., HIPAA) is highly desirable.
Proven ability to execute repeatable technical processes and translate client requirements into structured configurations.
Strong analytical and problem-solving skills, with a detail-oriented mindset.
Excellent communication and collaboration skills, specifically for interacting with external client teams.
Salary Description: $75,000-$103,000