
Data Engineer jobs at Mayo Clinic

- 120 jobs
  • Data Scientist

    Mayo Clinic Health System 4.8 company rating

    Data engineer job at Mayo Clinic

Why Mayo Clinic
Mayo Clinic is top-ranked in more specialties than any other care provider according to U.S. News & World Report. As we work together to put the needs of the patient first, we are also dedicated to our employees, investing in competitive compensation and comprehensive benefit plans to take care of you and your family, now and in the future. And with continuing education and advancement opportunities at every turn, you can build a long, successful career with Mayo Clinic.

Benefits Highlights
* Medical: Multiple plan options.
* Dental: Delta Dental or reimbursement account for flexible coverage.
* Vision: Affordable plan with national network.
* Pre-Tax Savings: HSA and FSAs for eligible expenses.
* Retirement: Competitive retirement package to secure your future.

Responsibilities
Join a world-renowned institution where data science directly improves patient lives. At Mayo Clinic, Data Scientists turn complex, heterogeneous data into meaningful insights that enhance clinical care, accelerate scientific discovery, and strengthen the digital future of healthcare. This role partners closely with Data Science and Informatics faculty and serves as a technical thought leader, shaping the strategy, direction, and impact of data science across the enterprise.

As a Data Scientist at Mayo Clinic, you will:
* Transform data into insight and insight into action, spanning the full spectrum from problem formulation and data acquisition to modeling, deployment, and interpretation.
* Provide strategic direction for data science and AI in a specialized domain, such as cancer, surgery, healthcare delivery, marketing, or planning services.
* Collaborate with enterprise leaders to advance Mayo Clinic's digital and analytics strategy.
* Work side by side with Informatics and IT teams to build data- and intelligence-driven systems that address complex, high-priority challenges.
* Recommend best practices for data collection, integration, and retention, incorporating technical, operational, and business needs.
* Support scientific and operational initiatives under the guidance of a senior data scientist or with full independent direction.
* Design and develop scripts, tools, and software applications that enable data extraction, management, and analysis across the organization.
* Deliver enterprise-level consultative services, providing analysis and presenting findings to leadership and multidisciplinary stakeholders.
* In generative AI, lead and execute Gen AI and agentic approaches.

Key Responsibilities
* Partner with multidisciplinary teams to create innovative approaches for data-driven decision-making.
* Lead exploration of next-generation AI/ML approaches to solve complex analytical problems across diverse domains.
* Apply deep expertise in data science methods, data types, and scientific challenges to help shape new products, experiences, and technologies.
* Guide and mentor data science staff, ensuring high-quality analysis, unbiased recommendations, and alignment with strategic priorities.
* Develop analytics tools and solutions that can be used effectively by non-technical staff.
Qualifications
* PhD in a domain-relevant field (mathematics, computer science, statistics, physics, engineering, data science, health science, or a related discipline), plus
* At least four years of experience in data science, machine learning, AI, or informatics

What You Bring
* A blend of deep technical expertise and strong business acumen, with a proven ability to lead technical or quantitative teams.
* Demonstrated success developing predictive and prescriptive models using advanced statistical modeling, machine learning, or data mining.
* Experience applying problem-solving frameworks, planning methods, continuous improvement approaches, and project management techniques.
* Ability to independently manage multiple high-impact projects in a dynamic environment, staying current with healthcare trends and enterprise priorities.
* Exceptional interpersonal skills: presentation, negotiation, persuasion, and written communication.
* Strong time-management skills and the ability to prioritize, organize, and delegate effectively.
* Expertise in scientific computing, data management packages, data modeling, and data exploration tools.
* A consulting mindset: the ability to identify challenges, recommend solutions, deploy analytics tools, and support non-technical users.
* Demonstrated initiative in areas such as training, software development, education, and technical documentation.
* Proven ability to provide vision and strategic direction at the departmental, institutional, or enterprise level.
* Publications in high-impact journals preferred.

Exemption Status: Exempt
Compensation Detail: Compensation range is $165,276.80 - $247,998.40 / Salary. This vacancy is not eligible for sponsorship, and we will not sponsor or transfer visas for this position.
Benefits Eligible: Yes
Schedule: Full Time
Hours/Pay Period: 80
Schedule Details: Monday - Friday, 8am - 5pm CST. Some on-site travel may be required for retreats or meetings.
Weekend Schedule: As needed.
International Assignment: No

Site Description
Just as our reputation has spread beyond our Minnesota roots, so have our locations. Today, our employees are located at our three major campuses in Phoenix/Scottsdale, Arizona; Jacksonville, Florida; and Rochester, Minnesota, as well as at Mayo Clinic Health System campuses throughout Midwestern communities and at our international locations. Each Mayo Clinic location is a special place where our employees thrive in both their work and personal lives. Learn more about what each unique Mayo Clinic campus has to offer, and where your best fit is.

Equal Opportunity
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, protected veteran status or disability status. Learn more about "EOE is the Law." Mayo Clinic participates in E-Verify and may provide the Social Security Administration and, if necessary, the Department of Homeland Security with information from each new employee's Form I-9 to confirm work authorization.

Recruiter: Briana Priniski
    $71k-112k yearly est. 6d ago
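
The Mayo Clinic listing above centers on building and validating predictive models. As a rough, hedged illustration of that workflow (not Mayo Clinic's actual code or data), a minimal scikit-learn sketch might look like the following; the file name, feature columns, and label are invented placeholders:

```python
# Minimal sketch: train and validate a predictive model of the kind the
# listing describes. The CSV path, columns, and label are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("encounters.csv")                    # hypothetical extract
X = df[["age", "num_prior_visits", "lab_value"]]      # hypothetical features
y = df["readmitted_30d"]                              # hypothetical binary label

# Hold out a test set so overfitting can be detected before deployment.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Compare train vs. test AUC; a large gap is one simple overfitting signal.
train_auc = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
test_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"train AUC={train_auc:.3f}  test AUC={test_auc:.3f}")
```

The train-versus-test comparison is the kind of basic validation step the responsibilities imply before a model moves toward deployment.
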
  • Software Engineer - Pipeline Developer

    Mayo Clinic 4.8 company rating

    Data engineer job at Mayo Clinic

The Mayo Clinic Genomics and Emerging Systems Unit is recruiting a Software Engineer as a Next Generation Sequencing (NGS) pipeline development and validation programmer to provide application development and support for genomic analysis, tooling, and infrastructure, focused on automating and simplifying the data processes for both externally and internally generated genomics data, from inception through final bioinformatics analysis and delivery. This position will support the needs of the Clinical Genome Sequencing Laboratory (CGSL). This position will work closely with our Consultants and our Bioinformatics team, along with the Genetic Counselors, Lab Supervisors, Technicians, and Operators, to align on, develop, and deliver the tools necessary to implement a wide range of NGS tests, from small gene panels to whole genome sequencing to epigenome and microbiome tests.

This position will be responsible for the design, development, implementation, and maintenance of custom system software, and/or the installation and maintenance of purchased systems software, as well as the configuration and support of computational cluster hardware systems in a variety of genomics and bioinformatics areas. Active participation in Agile and DevOps methodologies is expected, utilizing tools such as GitHub, Azure DevOps (ADO), Azure Pipelines, and CI/CD frameworks. Responsibilities may also extend to deployment automation and configuration management using platforms like Terraform and Azure Pipelines, supporting both on-premises and cloud-hosted applications.

Evaluates alternative approaches and presents recommendations to teams and unit leadership. Regularly reviews applications and makes modifications and/or updates to ensure currency and functionality within the established environment. Produces and maintains documentation such as system requirements, designs, resource inventories, and plans. Provides technical and project leadership to other staff members, demonstrates initiative, and works independently as needed to accomplish responsibilities. Works effectively across departmental organizations, gaining consensus of stakeholders. May interface with vendor support service groups or other external support teams to ensure proper escalation during outages or periods of degraded system performance. Acts as a liaison to Mayo departments and vendors to adequately support the division's computing systems. Interfaces routinely with colleagues who may be located at any of the Mayo Group practices to perform job responsibilities requiring virtual collaboration and partnership. This position will be required to provide 24/7 on-call support in a team rotation.

Designs and builds back-end services that support our portfolio of data-centric clinical and analytic applications. These applications leverage cloud computing, big data, mobile, data science, data warehousing, and machine learning using state-of-the-art software development applications and frameworks. Our Software Engineers ensure that these cloud-based micro-services adhere to uptime and accuracy targets, are resilient, and scale as data volumes and traffic increase. They work closely with the data engineering, platform, and solutions teams to develop applications as required to benefit our practice and patients. Works closely with Product Owners, Product Managers, and Architects to translate requirements into code, developing services around data warehousing, big data, cloud computing, business intelligence, analytics, and machine learning.
Participate in DevOps, Agile, and continuous development and integration frameworks. Program in high-level languages such as Go, Python, and Java. Work on deployment automation/configuration management with tools including but not limited to ADO, Puppet, Chef, Ansible, Azure Pipelines, CloudFormation, and Terraform, following a DevOps model. Ensure all appropriate documentation of processes and source code is created and maintained. Communicate effectively with peers, leaders, and customers throughout the organization. Participate in expert-level troubleshooting and resolve problems through root cause analysis and data and system investigation. Continue to build knowledge of the organization, processes, and customers. Perform a range of mainly straightforward assignments, using prescribed guidelines or policies to analyze and resolve problems, with a moderate level of guidance and direction.

This is a full-time, remote position within the United States. Mayo Clinic will not sponsor or transfer visas for this position, including F1 OPT STEM.

Qualifications:
· Bachelor's degree in Computer Science/Engineering or a related field; or an Associate's degree in Computer Science/Engineering or a related field with an additional 2 years of experience as described below.
· Working knowledge and experience of software engineering, with internships plus a minimum of 1 year of experience, or 2 years of experience coding applications or services in a high-level language (C, C++, Golang, Java, C#, etc.).
· Demonstrated problem-solving and time management skills.
· Strong technical aptitude for designing and implementing software solutions.
· Experience with modern application development frameworks.
· Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
· Deep hands-on technical expertise; excellent verbal and written communication skills.
· Experience with Agile software development techniques.

Preferred qualifications for this position include:
· Ability to use a wide variety of open-source technologies and cloud-based services.
· Experience with Google and Azure cloud environments.
· Experience in databases, analytics, big data systems, or business intelligence products.
· Experience building high-performance, highly available, and scalable distributed systems.
· Experience developing software for healthcare-related industries.

Preferred Qualifications:
• Demonstrated ability to learn new scripting languages and tools quickly.
• Knowledge of bioinformatics applications, tools, and pipelines such as Illumina BCL conversion, genomic aligners, and variant callers such as GATK or Sentieon.
• Knowledge of Nextflow or other pipeline management software is a plus.
• Knowledge of Python, Bash, R, and other scripting and Linux/Unix tools.
• Previous knowledge of genetics software pipelines and tooling for the laboratories.
• Must possess excellent analytical, problem-solving, and technical design skills.
• Must have knowledge of system architecture and design principles, software development methodologies, and computer programming.
• Must be willing to work in an Agile team environment as well as being comfortable working independently with minimal supervision.
• Must be highly motivated and willing to learn and implement leading-edge technologies.
• The candidate must be comfortable interacting with researchers and consultants both internal and external to Mayo Clinic.
• Experience with clustered/distributed environments using Oracle Grid Engine is highly desirable.
• Familiarity with cloud integration CLI tooling such as awscli, gcloud, and the Azure CLI is a plus.
• Candidates with genomics domain expertise and some exposure to clinical or research laboratory work and equipment are highly desirable.
• A Bachelor's or Master's degree in Computer Science or a related field is preferred.
    $82k-134k yearly est. 3d ago
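
The pipeline-developer listing above revolves around automating NGS workflow stages (BCL conversion, aligners, variant callers, Nextflow). Purely as a hedged sketch, and not Mayo's actual tooling, the snippet below wraps a single stage in Python with logging, failure handling, and an output checksum; the `cp` command stands in for a real bioinformatics tool and the file names are hypothetical:

```python
# Minimal sketch of wrapping one pipeline stage: run an external tool,
# fail fast on errors, and record a checksum of the output for validation.
# The command and file names are hypothetical placeholders.
import hashlib
import logging
import subprocess
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")


def sha256(path: Path) -> str:
    """Checksum used to confirm an output file is complete and reproducible."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def run_stage(cmd: list, expected_output: Path) -> str:
    """Run one stage, raise on a non-zero exit code, and return the output checksum."""
    logging.info("running: %s", " ".join(cmd))
    subprocess.run(cmd, check=True)  # raises CalledProcessError on failure
    if not expected_output.exists():
        raise FileNotFoundError(f"stage did not produce {expected_output}")
    checksum = sha256(expected_output)
    logging.info("produced %s sha256=%s", expected_output, checksum)
    return checksum


if __name__ == "__main__":
    # Hypothetical stage: cp stands in for a real aligner or converter call.
    run_stage(["cp", "sample.fastq", "sample.aligned"], Path("sample.aligned"))
```
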
  • Data Scientist

    CDC Foundation 4.6 company rating

    Hawaii jobs

The Data Scientist will lead the development of datasets for data-driven visualizations and performance metrics to support public health decision-making within the Hawaii Department of Health (HDOH) as part of the Hawaii Data & Analytics Modernization Project. This role is embedded in the Workforce Acceleration Initiative (WAI), a federally funded CDC Foundation program aimed at enhancing public health agencies' technology and data systems. The Data Scientist will focus on making datasets accessible and designing and implementing PowerBI dashboards for over 40 HDOH programs. This comprises importing datasets to the DOH Enterprise Data Warehouse (EDW) in Azure, developing robust Extract, Transform, Load (ETL) processes, and delivering a Train-the-Trainer program to build staff capacity for data pipeline and dashboard maintenance. The role requires close collaboration with HDOH program staff, Subject Matter Experts (SMEs), and the Health Data & Informatics Office (HDIO) PowerBI and IT teams. The role ensures data is made accessible and dashboards are developed to align with Public Health Accreditation Board (PHAB) standards, Continuous Quality Improvement (CQI) initiatives, and HDOH program goals. The Data Scientist will integrate stakeholder engagement to address diverse programmatic needs and support the synthesis of a core data model within the data warehouse across HDOH programs. This position is eligible for a fully remote work arrangement for U.S.-based candidates, hired through the CDC Foundation, and assigned to HDOH's Health Data & Informatics Office.

Responsibilities · Data Management and ETL Process Development: o Assess, plan, and develop data pipelines from multiple operational systems for ingest into a common HDOH data warehouse, with appropriate governance controls, for use in analysis and reporting projects. o Assess and manage ETL processes to support data visualization for over 40 HDOH programs, ensuring reliable data integration and quality from source systems to the HDOH data warehouse for PowerBI dashboards. o Collaborate with HDIO's PowerBI and IT teams to design and implement ETL workflows, integrating disparate data sources into a unified core data model. o Develop and maintain standardized data collection, quality, and processing protocols to ensure data accuracy, consistency, and timeliness. o Produce updated weekly progress reports on ETL and data management activities, accessible to all relevant stakeholders, to ensure transparency and alignment. · PowerBI Dashboard Development: o Lead the design, development, and deployment of interactive PowerBI dashboards for over 40 HDOH programs, visualizing previously identified KPIs and performance metrics, alongside HDOH technical staff. o Ensure dashboards are user-friendly, with drill-down capabilities, and aligned with PHAB accreditation readiness, CQI initiatives, and HDOH program goals. o Conduct iterative design reviews and usability testing with HDOH staff to refine dashboard functionality and address program-specific needs. o Produce weekly progress reports on dashboard development, accessible to all relevant stakeholders, to document milestones and incorporate feedback. · Train-the-Trainer Program Development: o Develop and deliver a Train-the-Trainer program to equip HDOH staff with skills to create, maintain, and update data pipelines and PowerBI dashboards, including ETL details for troubleshooting.
o Design modular, flexible training sessions with recorded materials and user-friendly guides to accommodate varying schedules and skill levels. o Provide ongoing support through a helpdesk or peer mentoring system to reinforce learning and ensure long-term sustainability. o Produce updated weekly progress reports on training efforts, accessible to all relevant stakeholders, to monitor adoption and impact. · Project Coordination and Stakeholder Collaboration: o Oversee coordination and execution of dashboard development, metrics refinement, and training, ensuring integration with interrelated projects (data governance, EDSS modernization, core data model synthesis). o Lead regular project meetings with HDOH staff, SMEs, and HDIO's PowerBI and IT teams to review progress, address issues, and ensure PHAB alignment. o Use project management tools to track ETL, dashboard, and training milestones, allocating resources to meet timelines and stakeholder expectations. o Produce updated weekly progress reports on project coordination, accessible to all relevant stakeholders, to maintain clear communication and accountability. · Risk Management and Communication: o Identify and mitigate risks in ETL processes, dashboard usability, and training adoption, collaborating with HDIO teams to ensure data accuracy and stakeholder satisfaction. o Develop stakeholder communication materials (reports, presentations) using data visualization tools to share progress, ETL performance, and dashboard insights with clarity. Qualifications · Education: o Bachelor's degree in Information Systems, Data Science, Public Health, Computer Science, or a related field. Master's degree in a similar field preferred but not required. · Experience: o 7-10 years of experience in data management, ETL development or maintenance, business intelligence, or public health informatics, ideally with a focus on healthcare or public health IT systems. o Demonstrated expertise in designing and implementing ETL processes and data management frameworks for data integration and quality assurance. o Significant experience developing and deploying PowerBI dashboards for performance tracking and visualization in complex organizational settings. o Experience working with public health agencies or healthcare systems, particularly in data systems, performance metrics, and stakeholder engagement, is highly desirable. · Technical Skills: o Demonstrated proficiency in developing and maintaining ETL processes, database design, and data quality assurance, with experience integrating disparate data sources into a common data warehouse and associated common data model. o Proficiency with data quality tools and data catalogs, for developing and maintaining consistent data resources or data products for an organization. o Proficiency in PowerBI for dashboard design, data modeling, and DAX (Data Analysis Expressions) for advanced analytics. o Familiarity with SQL for querying and managing relational databases. o Experience with project management tools for tracking milestones and resource allocation. · Public Health and Evaluation Knowledge: o Understanding of public health workflows, data collection methods, data quality methods, and evaluation methodologies for performance metrics. o Familiarity with PHAB standards, CQI initiatives, and national/international public health metrics frameworks is desired. o Experience implementing training programs for technical tools (e.g., PowerBI, ETL processes) in public health settings. 
· Communication and Collaboration: o Exceptional interpersonal and communication skills to facilitate collaboration with diverse stakeholders, including public health professionals, technical teams, and leadership, while demonstrating cultural sensitivity and respect for Hawaii's unique cultural context. o Ability to adapt communication styles and approaches to align with HDOH values, fostering trust and effective partnerships with HDOH staff and community stakeholders. o Proven ability to bridge technical and business requirements, ensuring alignment between data solutions and organizational goals, while being mindful of cultural nuances and organizational priorities. o Experience managing stakeholders and leading cross-functional teams in fast-paced environments, with a focus on building inclusive and culturally responsive collaborations. · Project Management: o Strong project management skills, including planning, creating work breakdown structures, and tracking milestones. o Ability to manage multiple priorities, meet deadlines, and solve complex problems with minimal supervision. o Experience with organizational change management, preferably using models like ADKAR. Job Highlights · Location: Remote, must be based in the United States. The individual must align their work hours with Hawaii Standard Time (HST) to ensure effective collaboration and communication with HDOH teams and stakeholders. Resources based in the Western US time zones preferred. · Salary Range: $92,700-$134,275 per year, plus benefits. Individual salary offers will be based on experience and qualifications. · Position Type: Grant-funded, limited-term opportunity. · Position End Date: June 30, 2026. Special Notes This role is involved in a dynamic public health program. As such, roles and responsibilities are subject to change as situations evolve. Roles and responsibilities listed above may be expanded upon or updated to match priorities and needs, once written approval is received by the CDC Foundation in order to best support the public health programming. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, and all other characteristics protected by law. We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). As a federal government contractor, we take affirmative action on behalf of protected veterans. The CDC Foundation is a smoke-free environment. Relocation expenses are not included. About the CDC Foundation The CDC Foundation helps CDC save and improve lives by unleashing the power of collaboration between CDC, philanthropies, corporations, organizations and individuals to protect the health, safety and security of America and the world. The CDC Foundation is the go-to nonprofit authorized by Congress to mobilize philanthropic partners and private-sector resources to support CDC's critical health protection mission. The CDC Foundation manages hundreds of programs each year impacting a variety of health threats from chronic disease conditions including cardiovascular disease and cancer, to infectious diseases like rotavirus and HIV, to emergency responses, including COVID-19 and Ebola. 
Visit ********************* for more information.
    $92.7k-134.3k yearly 60d+ ago
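
The Hawaii posting above is largely about ETL into an Azure Enterprise Data Warehouse feeding PowerBI. Below is a minimal, hedged sketch of that extract-transform-load shape; the CSV file, column names, and the SQLite stand-in for the Azure EDW are assumptions made only to keep the example self-contained:

```python
# Minimal ETL sketch: extract a program dataset, standardize it, and load it
# into a warehouse table for dashboarding. File, columns, and database are
# illustrative placeholders, not HDOH systems.
import pandas as pd
from sqlalchemy import create_engine

# Extract: a hypothetical program export.
raw = pd.read_csv("program_metrics.csv")

# Transform: normalize column names, parse dates, drop exact duplicates.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["report_date"] = pd.to_datetime(raw["report_date"], errors="coerce")
clean = raw.dropna(subset=["report_date"]).drop_duplicates()

# Load: production would target the Azure EDW; SQLite keeps the sketch runnable.
engine = create_engine("sqlite:///edw_stand_in.db")
clean.to_sql("program_metrics", engine, if_exists="replace", index=False)
print(f"loaded {len(clean)} rows into program_metrics")
```
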
  • Senior Data Engineer

    Care Access 4.3 company rating

    Boston, MA jobs

Job Description
Care Access is working to make the future of health better for all. With hundreds of research locations, mobile clinics, and clinicians across the globe, we bring world-class research and health services directly into communities that often face barriers to care. We are dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. With programs like Future of Medicine, which makes advanced health screenings and research opportunities accessible to communities worldwide, and Difference Makers, which supports local leaders to expand their community health and wellbeing efforts, we put people at the heart of medical progress. Through partnerships, technology, and perseverance, we are reimagining how clinical research and health services reach the world. Together, we are building a future of health that is better and more accessible for all. To learn more about Care Access, visit *******************

How This Role Makes a Difference
We are seeking an experienced and detail-oriented professional to join our team as a Sr. Data Engineer. In this pivotal role, you will be responsible for designing, developing, and maintaining robust data pipelines that ensure the reliable ingestion, transformation, and delivery of complex data (demographics, medical, financial, marketing, etc.) across systems. The ideal candidate will bring deep expertise in Databricks, SQL, and modern data engineering practices, along with strong collaboration skills to help drive excellence across our data infrastructure.

How You'll Make An Impact
Data Engineering Strategy and Architecture: Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs. Develop and maintain architecture standards, reusable frameworks, and best practices across data engineering workflows. Build automated systems for data ingestion, transformation, and orchestration leveraging cloud-native and open-source tools.
Data Infrastructure and Performance Optimization: Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks). Develop and monitor batch and streaming data processes to ensure data accuracy, consistency, and timeliness. Maintain documentation and lineage tracking across datasets and pipelines to support transparency and governance.
Collaboration and Stakeholder Engagement: Work cross-functionally with analysts, data scientists, software engineers, and business stakeholders to understand data requirements and deliver fit-for-purpose data solutions. Review and refine work completed by other team members, ensuring quality and performance standards are met. Provide technical mentorship to junior team members and collaborate with contractors and third-party vendors to extend engineering capacity.
Technology and Tools: Use Databricks, dbt, Azure Data Factory, and SQL to architect and deploy robust data engineering solutions. Integrate APIs, structured/unstructured data sources, and third-party systems into centralized data platforms. Evaluate and implement new technologies to enhance the scalability, observability, and automation of data operations.
Other Responsibilities: Continuous Improvement: Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.
The Expertise Required
Strong expertise in Databricks, SQL, dbt, Python, and cloud data ecosystems such as Azure. Experience working with structured and semi-structured data from diverse domains. Familiarity with CI/CD pipelines, orchestration tools (e.g., Airflow, Azure Data Factory), and modern software engineering practices. Strong analytical and problem-solving skills, with the ability to address complex data challenges and drive toward scalable solutions.

Certifications/Licenses, Education, and Experience:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. 5+ years of experience in data engineering with a proven track record of building cloud-based, production-grade data pipelines.

How We Work Together
This role requires 100% of work to be performed in a remote office environment and requires the ability to use keyboards and other computer equipment. This is a remote position with less than 10% travel requirements. Occasional planned travel may be required as part of the role. The expected salary range for this role is $120,000-$160,000 USD per year. In addition to base pay, employees may be eligible for 401k, stock options, health and wellness benefits, and paid time off.

Diversity & Inclusion
We work with and serve people from diverse cultures and communities around the world. We are stronger and better when we build a team representing the communities we support. We maintain an inclusive culture where people from a broad range of backgrounds feel valued and respected as they contribute to our mission. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to, and will not be discriminated against on the basis of, race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law. Care Access is unable to sponsor work visas at this time. If you need an accommodation to apply for a role with Care Access, please reach out to: ********************************
    $120k-160k yearly 6d ago
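
The Care Access role above emphasizes ingestion, de-duplication, quality checks, and curated outputs on Databricks and Spark. A minimal PySpark sketch of that pattern, with hypothetical paths and columns rather than anything from Care Access, could look like this:

```python
# Minimal sketch of an ingest -> de-duplicate -> quality-gate -> publish step.
# Paths and column names are invented; on Databricks this logic would usually
# live in a notebook or job, often writing Delta rather than Parquet.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("curate_encounters").getOrCreate()

raw = spark.read.json("/tmp/raw/encounters/")  # hypothetical landing path

# De-duplicate: keep the most recent record per encounter_id.
latest = Window.partitionBy("encounter_id").orderBy(F.col("updated_at").desc())
deduped = (
    raw.withColumn("rn", F.row_number().over(latest))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

# Simple quality gate: refuse to publish if required fields are missing.
bad_rows = deduped.filter(F.col("patient_id").isNull()).count()
if bad_rows:
    raise ValueError(f"{bad_rows} rows missing patient_id; aborting publish")

# Write curated output for downstream analytics consumers.
deduped.write.mode("overwrite").parquet("/tmp/curated/encounters/")
```

The window-plus-row_number pattern is one common way to resolve duplicates deterministically when multiple source systems emit the same business key.
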
  • Data Engineer

    CDC Foundation 4.6 company rating

    Oregon jobs

The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining modern data infrastructure for the Northwest Portland Area Indian Health Board (NPAIHB) Data Hub project. Working closely with the Data Hub Team, the Data Engineer will support the architecture needed for data storage, processing, analysis, and secure transfer to Tribal Leaders and public health professionals. The Data Engineer will collaborate with epidemiologists, data content experts, IT staff, the Data Hub Project Director, and others to develop and implement scalable solutions that align with the objectives of the NPAIHB's Data Hub project.

NPAIHB's Data Hub Team is currently developing a system, “The NW Tribal Data Hub,” to provide comprehensive, user-friendly public health data dashboards for its 43 member Tribes. The Data Engineer will ensure the successful design and implementation of a newly created public health database and the ingestion of additional data into the system, and will create tables, views, and other database structures to support epidemiological analysis, visualization, and reporting to Tribes. The data, sourced primarily from state and federal agencies, include vital statistics (births, deaths), cancer registries, emergency department data, clinical service data, and others. The Data Engineer's work will be pivotal in enhancing the capacity of Tribal public health departments to conduct data-driven activities, advancing Tribal data sovereignty, and empowering Tribes to improve health outcomes within their communities. The Data Engineer will be hired by the CDC Foundation and assigned to the Data Hub Team at NPAIHB. This position is eligible for a fully remote work arrangement for U.S.-based candidates.

NPAIHB is a tribally owned and operated non-profit organization serving the 43 federally recognized Tribes in the states of Idaho, Oregon, and Washington. Led by a Board of Directors, NPAIHB's mission is to "eliminate health disparities and improve the quality of life of American Indians and Alaska Natives by supporting Northwest Tribes in their delivery of culturally appropriate, high-quality health programs and services." NPAIHB is a mission-driven organization with a staff of over 120 professionals dedicated to advancing Tribal health for the 7th generation in the Pacific Northwest.

Responsibilities
Create new and enhance existing systems and pipelines that enable efficient, reliable, and secure flow of data, including ingestion, processing, and storage.
Load data into storage systems or data warehouses, transforming, cleaning, and organizing with dimensional modeling techniques to ensure accuracy, consistency, and efficient querying.
Transform and structure data to ensure it is optimized for use in data visualization software, enabling accurate and effective visual representations of epidemiological data.
Ensure thorough and clear documentation of database architecture and workflows to promote sustainability, consistency, and ease of maintenance.
Apply rigorous data quality checks and validation processes to guarantee the accuracy and reliability of the data released, emphasizing the importance of delivering correct and trustworthy data to support public health initiatives.
Optimize data pipelines, infrastructure, and workflows for performance and scalability.
Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them.
Implement security measures to protect sensitive information.
Collaborate with epidemiologists, analysts, and other partners to understand current and future data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives.
Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data.
Design and manage data storage systems, including a PostgreSQL relational database.
Apply knowledge about industry trends, best practices, and emerging technologies in data engineering, and incorporate the trends into the organization's data infrastructure.
Provide technical guidance to other staff on preparing and structuring data for visualization, leveraging knowledge of visualization tools to support the creation of meaningful and insightful visual outputs.
Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.

Qualifications
Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Minimum of five (5) years of related informatics experience, preferably with three (3) years of experience in a lead data engineer position.
Demonstrated expertise in building SQL relational databases and transitioning non-relational data into a structured relational format, ensuring seamless integration and optimized performance.
Proficiency in SQL programming and other languages commonly used in data engineering, such as Python, PySpark, Java, or Scala. The candidate should be able to implement data automations within existing frameworks as opposed to writing one-off scripts.
Experience transforming and preparing data into formats suitable for data visualization software, ensuring it is structured for optimal use in dashboards and other visual outputs.
Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra), with PostgreSQL preferred.
Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review, and serving as a subject matter expert on these topics.
Knowledge of data warehousing concepts and tools.
Experience with cloud computing platforms, with preference for experience in an AWS environment.
Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques.
Familiarity with agile development methodologies, software design patterns, and best practices.
Strong analytical thinking and problem-solving abilities.
Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
Flexibility to adapt to evolving project requirements and priorities.
Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners.
Experience working in a virtual environment with remote partners and teams.
Proficiency in Microsoft Office.
Ability to travel occasionally for in-person meetings (travel costs will be covered by NPAIHB).

Preferred Qualifications
Experience gathering requirements and designing and planning data models based on those requirements.
Experience creating complex fields and visuals in AWS QuickSight or similar data visualization tools (Tableau, Microsoft Power BI, etc.).
Experience building data pipelines within Amazon Web Services (AWS), such as Amazon Relational Database Service (RDS), Amazon Aurora Serverless, AWS Glue, and Lambda.
Experience working with complex public health, health care, or other non-business data requiring advanced processing and analysis techniques.
Experience transitioning SAS datasets and analyses into relational database structures.
Experience with dimensional modeling in scenarios where dimensions and fields change over time.
Experience with implementing data suppression techniques and familiarity with HIPAA, PHI, and other data confidentiality regulations.

Job Highlights
Location: Remote, must be based in the United States
Salary Range: $103,500-$143,500, plus benefits. Individual salary offers will be based on experience and qualifications unique to each candidate.
Position Type: Grant funded, limited-term opportunity
Position End Date: June 30, 2026

Special Notes
This role is involved in a dynamic public health program. As such, roles and responsibilities are subject to change as situations evolve. Roles and responsibilities listed above may be expanded upon or updated to match priorities and needs, once written approval is received by the CDC Foundation in order to best support the public health programming. The CDC Foundation is a smoke-free environment. Relocation expenses are not included.

All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, and all other characteristics protected by law. We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). As a federal government contractor, we take affirmative action on behalf of protected veterans.

About the CDC Foundation
The CDC Foundation helps CDC save and improve lives by unleashing the power of collaboration between CDC, philanthropies, corporations, organizations and individuals to protect the health, safety and security of America and the world. The CDC Foundation is the go-to nonprofit authorized by Congress to mobilize philanthropic partners and private-sector resources to support CDC's critical health protection mission. The CDC Foundation manages hundreds of programs each year impacting a variety of health threats from chronic disease conditions including cardiovascular disease and cancer, to infectious diseases like rotavirus and HIV, to emergency responses, including COVID-19 and Ebola. Visit ********************* for more information.
    $103.5k-143.5k yearly 18d ago
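
The NPAIHB posting above calls out dimensional modeling when loading data into the warehouse. Below is a small, hedged sketch of a dimension table plus fact table load; the table names and sample rows are invented, and SQLite stands in for the PostgreSQL database the listing prefers, purely to keep the example runnable:

```python
# Minimal dimensional-modeling sketch: one dimension, one fact, and a load
# from a flat extract. Names and rows are illustrative placeholders.
import sqlite3

conn = sqlite3.connect("tribal_data_hub_stand_in.db")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE IF NOT EXISTS dim_geography (
    geography_key INTEGER PRIMARY KEY,
    county        TEXT NOT NULL,
    state         TEXT NOT NULL,
    UNIQUE (county, state)
);
CREATE TABLE IF NOT EXISTS fact_ed_visits (
    geography_key INTEGER NOT NULL REFERENCES dim_geography (geography_key),
    visit_year    INTEGER NOT NULL,
    visit_count   INTEGER NOT NULL
);
""")

# A flat extract as it might arrive from a state partner (hypothetical rows).
extract = [("Multnomah", "OR", 2023, 1530), ("Yakima", "WA", 2023, 880)]

for county, state, year, count in extract:
    # Upsert the dimension row, then record the fact against its surrogate key.
    cur.execute("INSERT OR IGNORE INTO dim_geography (county, state) VALUES (?, ?)",
                (county, state))
    cur.execute("SELECT geography_key FROM dim_geography WHERE county = ? AND state = ?",
                (county, state))
    (geo_key,) = cur.fetchone()
    cur.execute("INSERT INTO fact_ed_visits (geography_key, visit_year, visit_count) VALUES (?, ?, ?)",
                (geo_key, year, count))

conn.commit()
```

Separating descriptive attributes into the dimension keeps the fact table narrow, which is what makes the downstream dashboard queries the posting describes efficient.
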
  • John Snow Labs US-Based Healthcare Data Scientist

    John Snow Labs 4.4 company rating

    Delaware City, DE jobs

John Snow Labs is an award-winning AI and NLP company, accelerating progress in data science by providing state-of-the-art software, data, and models. Founded in 2015, it helps healthcare and life science companies build, deploy, and operate AI products and services. John Snow Labs is the winner of the 2018 AI Solution Provider of the Year Award, the 2019 AI Platform of the Year Award, the 2019 International Data Science Foundation Technology Award, and the 2020 AI Excellence Award. John Snow Labs is the developer of Spark NLP - the world's most widely used NLP library in the enterprise - and is the world's leading provider of state-of-the-art clinical NLP software, powering some of the world's largest healthcare and pharma companies. John Snow Labs is a global team of specialists, of which 33% hold a Ph.D. or M.D. and 75% hold at least a Master's degree in disciplines covering data science, medicine, software engineering, pharmacy, DevOps, and SecOps.

Job Description
John Snow Labs is seeking a highly skilled and motivated Data Scientist to contribute to transformative initiatives within the healthcare industry. The ideal candidate will possess a strong background in developing and optimizing machine learning models, specifically within healthcare contexts. We are looking for a results-oriented individual proficient in training and fine-tuning models, building robust, production-ready model inference pipelines, and conducting comprehensive exploratory data analysis and data enrichment.

Qualifications
Key Responsibilities:
Train, fine-tune, and enhance LLM and NLP models using the open-source Python library ecosystem. Experience with LLMs, Generative AI, and deep learning is a significant advantage.
Build data science and data engineering pipelines specific to analyzing clinical data, such as extracting information from medical text or images, or integrating uncertain information from multiple medical data sources.
Collaborate with our team on customer-facing projects, utilizing your expertise to create advanced machine learning, deep learning, large language model, and time series forecasting pipelines tailored to address specific business needs.
Ensure models are validated for issues like bias, overfitting, and concept drift to ensure reliability and effectiveness.
Engage directly with customers, requiring strong oral and written communication skills to convey complex technical concepts clearly.

Mandatory Skills:
Proven experience in consistently delivering real-world projects covering the key responsibilities. Knowledge that is limited to an academic setting, or to using existing APIs to build applications, is not sufficient for this role.
Hands-on experience with OMOP, FHIR, clinical terminologies, and understanding of the patient journey.
A strong background in healthcare-related fields such as medicine, pharma, bioinformatics, or biostatistics is highly beneficial.
A PhD in a relevant field is preferred but not required if exceptional experience is demonstrated.
Experience with John Snow Labs' technology stack, such as Spark NLP or the medical language models, is a plus.

What We Offer:
A chance to work on cutting-edge problems in healthcare and life sciences, contributing to meaningful projects that impact patient outcomes.
Long-term freelancing contracts with a commitment of at least 30 hours per week. We are seeking individuals, not agencies or teams.
The opportunity to grow your skills and knowledge, working with a team of big data and data science experts in a supportive, collaborative environment.

To apply, please include the words 'John Snow Labs' in your cover letter and detail why you believe you are the best fit for this role. This is more than just a contract - it's a chance to make a real difference.

Additional Information
Our Commitment to You: At John Snow Labs, we believe that diversity is the catalyst of innovation. We're committed to empowering talented people from every background and perspective to thrive. We are an award-winning global collaborative team focused on helping our customers put artificial intelligence to good use faster. Our website includes The Story of John Snow, and our Social Impact page details how purpose and giving back is part of our DNA. More at JohnSnowLabs.com

We are a fully virtual company, collaborating across 28 countries. This is a contract opportunity, not a full-time employment role. This role requires the availability of at least 30-40 hours per week.
    $77k-102k yearly est. 60d+ ago
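
The John Snow Labs listing is built around Spark NLP. As a rough, hedged sketch of the open-source Spark NLP pipeline pattern only (the licensed clinical models are not shown, and the sample sentence is invented):

```python
# Minimal sketch of the open-source Spark NLP pipeline pattern:
# document assembly -> sentence detection -> tokenization.
# Healthcare/clinical annotators are a separate licensed product and
# are not represented here; the input text is invented.
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import SentenceDetector, Tokenizer
from pyspark.ml import Pipeline

spark = sparknlp.start()

data = spark.createDataFrame(
    [["The patient was started on 10 mg lisinopril for hypertension."]]
).toDF("text")

document = DocumentAssembler().setInputCol("text").setOutputCol("document")
sentences = SentenceDetector().setInputCols(["document"]).setOutputCol("sentence")
tokens = Tokenizer().setInputCols(["sentence"]).setOutputCol("token")

pipeline = Pipeline(stages=[document, sentences, tokens])
result = pipeline.fit(data).transform(data)
result.selectExpr("token.result").show(truncate=False)
```
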
  • Senior Data Engineer

    Search 3.5 company rating

    Remote

At MCG, we lead the healthcare community to deliver patient-focused care. We have a mission-driven team of talented physicians and technical experts developing our evidence-based content and innovating our products to accelerate improvements in healthcare. If you are driven to enhance the US healthcare system, MCG is eager to have you join our team. We cultivate a work environment that nurtures personal and professional growth, and this is a thrilling time to become a part of our organization. With dynamic roles that offer meaningful impact, you'll be able to fully realize your potential. Plus, you'll enjoy world-class benefits and the security, stability, and resources of our parent company, Hearst, with over 100 years of experience.

As a Senior Data Engineer you will be responsible for enabling efficient and effective data ingestion and delivery systems. Our team collaborates with data producers (application teams) and data consumers/stakeholders (Data Science, Product, Analytics & Reporting teams) to ensure the availability, quality, and accessibility of data through robust pipelines and storage platforms.

You will:
Explore, analyze, and onboard data sets from data producers to ensure they are ready for processing and consumption.
Develop and maintain scalable and efficient data pipelines for data collection, processing (quality checks, de-duplication, etc.), and integration into data lake and data warehouse systems.
Optimize and monitor data pipeline performance to ensure minimal downtime.
Implement data quality control mechanisms to maintain data set integrity.
Collaborate with stakeholders for seamless data flow and address issues or needs for improvement.
Manage the deployment and automation of pipelines and infrastructure using Terraform, Flyte, and Kubernetes.
Support strategic data analysis and operational tasks as needed.
Lead end-to-end data pipeline development, from initial data discovery and ingestion to transformation, modeling, and delivery into production-grade data platforms.
Integrate and manage data from 3+ distinct sources, designing efficient, reusable frameworks for multi-source data processing and harmonization.

What We're Looking For:
Demonstrated ability to navigate ambiguous data challenges, ask the right questions, and design effective, scalable solutions.
Proficiency in designing, building, and maintaining large-scale, reliable data pipeline systems.
Advanced SQL skills for querying and processing data.
Proficiency in Python, with experience in Spark for data processing.
3+ years of experience in data engineering, including data modeling and ETL pipelines.
Familiarity with cloud-based tools and infrastructure management using Terraform and Kubernetes is a plus.
Bonus:
Experience working with healthcare and clinical data sets
Experience with orchestration tools like Flyte

Pay Range: $136,000 - $190,400
Other compensation: Bonus Eligible

Perks & Benefits:
💻 Hybrid work
✈️ Travel expected 2-3 times per year for company-sponsored events
🩺 Medical, dental, vision, life, and disability insurance
📈 401K retirement plan; flexible spending and health savings account
🏝️ 15 days of paid time off + additional front-loaded personal days
🏖️ 14 company-recognized holidays + paid volunteer days
👶 Up to 8 weeks of paid parental leave + 10 weeks of paid bonding leave
🌈 LGBTQ+ Health Services
🐶 Pet insurance
📣 Check out more of our benefits here: *******************************************

MCG Health is a Seattle, Washington-based company and is considering remote/hybrid candidates with some travel for company-sponsored events. The ideal candidate should be comfortable balancing the independence of remote/hybrid work with the collaborative opportunities offered by periodic in-person engagements. We embrace diversity and equal opportunity and are committed to building a team that represents a variety of backgrounds, perspectives, and skills. Only with diverse thoughts and ideas will we be able to create the change we want in healthcare. The more inclusive we are, the better our work will be for it. All roles at MCG are expected to engage in occasional travel to participate in team or company-sponsored events for the purposes of connection and collaboration. It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. MCG is a leading healthcare organization dedicated to patient-focused care. We value our employees' unique differences and are an Equal Employment Opportunity (EEO) employer. Our diverse workforce helps us achieve our goal of providing the right care to everyone. We welcome all qualified applicants without regard to race, religion, nationality, gender, sexual orientation, gender identity, age, marital status, veteran status, disability, pregnancy, parental status, genetic information, or political affiliation. We are committed to improving equity in healthcare and believe that a diverse workplace fosters curiosity, innovation, and business success. We are happy to provide accommodations for individuals. Please let us know if you require any support.
    $136k-190.4k yearly 12d ago
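
The MCG role above names Flyte alongside Terraform and Kubernetes for pipeline orchestration. A minimal, hedged flytekit sketch of two typed tasks chained in a workflow, with invented task logic standing in for real ingestion and de-duplication steps, might look like this:

```python
# Minimal Flyte orchestration sketch: two typed tasks chained in a workflow.
# Task names and logic are hypothetical; real pipelines would read from and
# write to the data lake/warehouse instead of passing small lists around.
from typing import List

from flytekit import task, workflow


@task
def extract() -> List[int]:
    # Stand-in for pulling a batch of records from a producer system.
    return [1, 2, 2, 3]


@task
def deduplicate(records: List[int]) -> List[int]:
    # Stand-in for a quality/de-duplication step.
    return sorted(set(records))


@workflow
def ingest_pipeline() -> List[int]:
    # Flyte builds the execution graph from these calls; tasks take keyword args.
    return deduplicate(records=extract())


if __name__ == "__main__":
    # Workflows can be executed locally during development before registration.
    print(ingest_pipeline())
```
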
  • Data Engineer

    CDC Foundation 4.6 company rating

    South Dakota jobs

The Data Engineer will play a crucial role in designing, building, and maintaining data infrastructure. This role is aligned to the Workforce Acceleration Initiative (WAI). WAI is a federally funded CDC Foundation program with the goal of helping the nation's public health agencies by providing them with the technology and data experts they need to accelerate their information system improvements. Working within the South Dakota Department of Health's (SD-DOH) Epidemiology, Surveillance and Informatics Center (ESIC), the Data Engineer will play a pivotal role in documenting and maintaining the case-based disease surveillance system architecture required for data generation, storage, processing, and analysis. This position is eligible for a fully remote work arrangement for U.S.-based candidates. Responsibilities · Document the existing architecture of the case-based disease surveillance system, including system interactions and data generation, storage, processing, and analysis. · Maintain current and develop the future-state case-based disease surveillance system architectural diagrams to ensure the system continues to be robust, scalable, and aligned with SD-DOH's public health goals. · Facilitate and collaborate with end-users and technical staff to understand requirements and deliver effective surveillance system solutions. · Evaluate, design, and implement both the existing and proposed future-state disease surveillance system solutions. Ensure that the solutions meet the needs of all stakeholders within SD-DOH, including being adaptable to evolving challenges. This includes conducting an environmental scan of system requirements completed by other jurisdictions. · Document all aspects of the system architecture, including design decisions, data flow processes, and system requirements. Maintain clear and comprehensive records to support ongoing system maintenance and future upgrades. · A work product for this position includes creating a final requirements document with all necessary technical specifications for a future-state case-based surveillance system listed clearly and concisely. · Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, and storage. · Collect data from various sources, transforming and cleaning it to ensure accuracy and consistency. Load data into storage systems or data warehouses. · Optimize data pipelines, infrastructure, and workflows for performance and scalability. · Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them. · Implement security measures to protect sensitive information. · Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives. · Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs. · Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data. · Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses. · Stay knowledgeable about industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure. · Provide technical guidance to other staff.
· Work closely with a multidisciplinary team, including data content experts, analysts, data scientists, epidemiologists, IT staff, and other organizational personnel. · Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings. Qualifications · Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. · 5 years of experience with project oversight, including communications to end-users and technical staff · Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. Candidate should be able to implement data automations within existing frameworks as opposed to writing one off scripts. · Experience with Microsoft Azure cloud technologies and frameworks or similar (AWS). · Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra). · Experience regarding engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review. · Knowledge of data warehousing concepts and tools. · Experience with cloud computing platforms. · Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques. · Solid understanding of FHIR and API-based architectures. · Familiarity with agile development methodologies, software design patterns, and best practices. · Strong analytical thinking and problem-solving abilities. · Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively. · Experience working with project management and tracking software (e.g., JIRA and DevOps) · Flexibility to adapt to evolving project requirements and priorities. · Outstanding interpersonal and teamwork skills; and the ability to develop productive working relationships with colleagues and partners. · Experience working in a virtual environment with remote partners and teams · Proficiency in Microsoft Office. Job Highlights · Location: Remote, must be based in the United States · Work Schedule: 8 am - 5 pm Central Time. Flexible work schedule of 1 hour before or after. · Salary Range: $103,500-$143,500 per year, plus benefits. Individual salary offers will be based on experience and qualifications unique to each candidate. · Position Type: Grant funded, limited-term opportunity · Position End Date: June 30, 2026 Special Notes This role is involved in a dynamic public health program. As such, roles and responsibilities are subject to change as situations evolve. Roles and responsibilities listed above may be expanded upon or updated to match priorities and needs, once written approval is received by the CDC Foundation in order to best support the public health programming. The CDC Foundation is a smoke-free environment. Relocation expenses are not included. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, and all other characteristics protected by law. We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). 
As a federal government contractor, we take affirmative action on behalf of protected veterans. About the CDC Foundation The CDC Foundation helps CDC save and improve lives by unleashing the power of collaboration between CDC, philanthropies, corporations, organizations and individuals to protect the health, safety and security of America and the world. The CDC Foundation is the go-to nonprofit authorized by Congress to mobilize philanthropic partners and private-sector resources to support CDC's critical health protection mission. The CDC Foundation manages hundreds of programs each year impacting a variety of health threats from chronic disease conditions including cardiovascular disease and cancer, to infectious diseases like rotavirus and HIV, to emergency responses, including COVID-19 and Ebola. Visit ********************* for more information.
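The qualifications above call for Python-based data automation and a solid understanding of FHIR and API-based architectures. As a hedged illustration only, here is a minimal sketch of one ingestion step; the FHIR endpoint, condition code, and staging table are hypothetical placeholders, not part of SD-DOH's actual surveillance system.

```python
"""Minimal sketch: pull FHIR Condition resources from a (hypothetical) API
and land them in a staging table. Endpoint, fields, and table are illustrative."""
import sqlite3
import requests

FHIR_BASE = "https://example-fhir-server.org/fhir"  # hypothetical endpoint


def fetch_conditions(code: str) -> list[dict]:
    """Return raw Condition resources for one reportable-condition code."""
    resp = requests.get(
        f"{FHIR_BASE}/Condition",
        params={"code": code, "_count": 100},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]


def flatten(resource: dict) -> tuple:
    """Keep only the fields a case-based surveillance staging table needs."""
    coding = resource.get("code", {}).get("coding") or [{}]
    return (
        resource.get("id"),
        resource.get("subject", {}).get("reference"),
        coding[0].get("code"),
        resource.get("recordedDate"),
    )


def load(rows: list[tuple], db_path: str = "surveillance.db") -> None:
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS stg_condition ("
            "condition_id TEXT PRIMARY KEY, patient_ref TEXT, "
            "condition_code TEXT, recorded_date TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO stg_condition VALUES (?, ?, ?, ?)", rows
        )


if __name__ == "__main__":
    # Illustrative SNOMED code only; the real code list would come from SD-DOH.
    load([flatten(r) for r in fetch_conditions("840539006")])
```

In a production pipeline this logic would sit inside an orchestrated framework with scheduling, retries, and logging rather than running as a one-off script, in line with the qualification above.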
    $103.5k-143.5k yearly Auto-Apply 60d+ ago
  • Senior Data Architect

    Blue Cross Blue Shield of Minnesota 4.8 company rating

    Eagan, MN jobs

    About Blue Cross and Blue Shield of Minnesota At Blue Cross and Blue Shield of Minnesota, we are committed to paving the way for everyone to achieve their healthiest life. We are looking for dedicated and motivated individuals who share our vision of transforming healthcare. As a Blue Cross associate, you are joining a culture that is built on values of succeeding together, finding a better way, and doing the right thing. If you are ready to make a difference, join us. The Impact You Will Have Assist and support the design and implementation of Enterprise Data Architecture for IT, which will be used in the development and deployment of technology driven business solutions for BCBSM and its business partners. Works with IT management, technology vendors, and customers to establish a strategic data technology direction with emphasis on improving the efficiency, effectiveness, integration, and quality of business solutions provided to our clients. Your Responsibilities Develops and maintains an Enterprise Data Architecture that articulates the principles, blueprints, and standards, across the data domains of Transaction, Interaction, and Analytic, which are used in the deployment of technology solutions for BCBSM customers. Develops and maintains an Enterprise Data Model (EDM) to serve as both the strategic and tactical planning vehicles to manage enterprise data. This effort involves working closely with the business data stewards by business divisions. Work with business driven project teams to ensure quality and compliance to the Enterprise Data Architecture by participating in data analysis/design activities and conducting appropriate technical data design reviews at various stages during the development life cycle. This includes providing data modeling expertise with both relational (i.e., entity relationship diagrams) and dimensional (i.e., star join schema) modeling techniques. Work with applications project teams to adopt a strong data reconciliation process concerning the replication of data that includes defining reconciliation architecture and conducting technical design reviews. Identifies and recommends specific infrastructure initiatives to further enhance the Enterprise Data Architecture Plan, aligning with the overall BCBSM business direction and IS strategies and coordinating with the annual IS planning/budget cycles. Guide and mentor IS staff in the analysis and selection of technology acquisitions, ensuring that new products conform and support the overall Enterprise Data Architecture strategy. Guide, educate, and mentor the Data Architecture Strategy directives, principles, and standards to individuals who play data-related roles (e.g., data analysts, data modelers, and business analysts). Required Skills and Experience 5+ years of related professional experience. All relevant experience including work, education, transferable skills, and military experience will be considered. Demonstrated strong skills in applying the data modeling techniques of relational (i.e., entity relationship diagrams) modeling and dimensional (i.e., star join schema) modeling; a toy star schema sketch follows this listing. Demonstrated human relations skills to effectively interact with peers, subordinates, internal and external customers and vendors. Demonstrated ability to influence and motivate individuals and teams. Advanced presentation skills and oral and written communication skills. Advanced technical knowledge of mainframe and client/server environments.
Advanced analytical skills related to cost/benefit analysis of large dollar hardware and software implementations. High school diploma (or equivalency) and legal authorization to work in the U.S. Preferred Skills and Experience Bachelor's degree Experience with Power Designer and AWS Redshift is a plus Role Designation: Hybrid. Anchored in Connection: Our hybrid approach is designed to balance flexibility with meaningful in-person connection and collaboration. We come together in the office two days each week - most teams designate at least one anchor day to ensure team interaction. These in-person moments foster relationships, creativity, and alignment. The rest of the week you are empowered to work remote. Compensation and Benefits: $100,000.00 - $135,000.00 - $170,000.00 Annual. Pay is based on several factors which vary based on position, including skills, ability, and knowledge the selected individual is bringing to the specific job. We offer a comprehensive benefits package which may include: Medical, dental, and vision insurance Life insurance 401k Paid Time Off (PTO) Volunteer Paid Time Off (VPTO) And more To discover more about what we have to offer, please review our benefits page. Equal Employment Opportunity Statement At Blue Cross and Blue Shield of Minnesota, we are committed to paving the way for everyone to achieve their healthiest life. Blue Cross of Minnesota is an Equal Opportunity Employer and maintains an Affirmative Action plan, as required by Minnesota law applicable to state contractors. All qualified applicants will receive consideration for employment without regard to, and will not be discriminated against based on, any legally protected characteristic. Individuals with a disability who need a reasonable accommodation in order to apply, please contact us at: **********************************. Blue Cross and Blue Shield of Minnesota and Blue Plus are nonprofit independent licensees of the Blue Cross and Blue Shield Association.
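Since the required skills distinguish relational (entity-relationship) modeling from dimensional (star join schema) modeling, the toy sketch below shows the shape of a star schema: one fine-grained fact table joined to descriptive dimension tables. The table and column names are invented for illustration and are not BCBSM's actual model.

```python
"""Toy star schema: one fact table keyed to two dimensions.
Table and column names are illustrative only."""
import sqlite3

DDL = """
CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY, plan_type TEXT);
CREATE TABLE dim_date   (date_key   INTEGER PRIMARY KEY, calendar_month TEXT);
CREATE TABLE fact_claim (
    claim_id    INTEGER PRIMARY KEY,
    member_key  INTEGER REFERENCES dim_member(member_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    paid_amount REAL
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(DDL)
    conn.executemany("INSERT INTO dim_member VALUES (?, ?)",
                     [(1, "PPO"), (2, "HMO")])
    conn.executemany("INSERT INTO dim_date VALUES (?, ?)",
                     [(20250101, "2025-01"), (20250201, "2025-02")])
    conn.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)",
                     [(10, 1, 20250101, 125.00), (11, 2, 20250201, 80.00)])
    # The "star join": measures stay in the fact table, descriptive slices
    # come from the dimensions.
    for row in conn.execute("""
        SELECT d.calendar_month, m.plan_type, SUM(f.paid_amount)
        FROM fact_claim f
        JOIN dim_member m USING (member_key)
        JOIN dim_date   d USING (date_key)
        GROUP BY d.calendar_month, m.plan_type
    """):
        print(row)
```

Keeping measures in the fact table and attributes in the dimensions is what makes the aggregation at the end cheap and predictable; an entity-relationship model of the same data would normalize further and optimize for transactional updates instead.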
    $100k-135k yearly Auto-Apply 60d+ ago
  • Data Platform Engineer

    Chicago Public Media 4.3 company rating

    Chicago, IL jobs

    The Opportunity We're looking for a versatile Data Platform Engineer to join our Enterprise Systems team. This role is ideal for someone who thrives on solving complex data problems, enjoys working across systems, and can bridge the gap between infrastructure and analytics. You'll help design, build, and maintain the pipelines and tools that ensure our organizational data flows reliably and is ready for use in critical platforms like our CRM, ESP, donation systems, and analytics environments. You'll work alongside the Senior Engineering Manager who will help define the architecture and systems, while you execute, troubleshoot, and improve those systems with increasing ownership over time. General Responsibilities * Build and maintain ETL pipelines using Python and SQL to move data between internal systems and external platforms (e.g., CRMs, ESPs). * Ensure data reliability and integrity across systems and develop automated validation and alerting. * Troubleshoot issues across platforms and contribute to root cause analysis and long-term solutions. * Collaborate with team members and stakeholders to understand business requirements and translate them into scalable technical solutions. * Participate in monitoring, performance tuning, and refactoring of existing data flows. * Help maintain and extend internal tools that support data operations and stakeholder reporting needs. * Document systems, processes, and logic to ensure knowledge sharing and continuity. Qualifications * 3-5 years of experience in a data engineering, systems integration, or similar role. * Proficient in Python (or equivalent) and SQL, with experience building maintainable ETL pipelines. * Experience working with relational databases (e.g., SQL Server, Postgres). * Comfortable navigating and integrating with REST APIs and working with cloud-based services. * Experience working with CRM or ESP platforms (e.g., Salesforce, RevCRM, Marketing Cloud, EveryAction, etc.) is a plus. * Strong problem-solving skills with the ability to debug across systems. * Excellent communication and documentation habits. * Comfortable navigating and troubleshooting legacy pipelines and code, with a focus on understanding historical context before making changes. Compensation The expected pay range for this position is $101,810.00 - $120,000.00 per annum. Chicago Public Media provides pay ranges representing its good faith estimate of what the organization reasonably expects to pay for a position. The pay offered to a selected candidate will be determined based on factors such as (but not limited to) the scope and responsibilities of the position, the qualifications of the selected candidate, departmental budget availability, internal equity, geographical location, and external market pay for comparable jobs.
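The responsibilities above center on ETL pipelines that move data between internal systems and external platforms, with automated validation and alerting. The sketch below is a minimal, hedged illustration of that pattern; the CRM endpoint, field names, and staging table are hypothetical, and sqlite3 stands in for whatever warehouse the team actually uses.

```python
"""Minimal ETL sketch: extract donor records from a (hypothetical) CRM API,
validate them, and load a staging table. sqlite3 stands in for the warehouse."""
import logging
import sqlite3
import requests

CRM_URL = "https://api.example-crm.org/v1/donors"  # hypothetical endpoint

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("donor_pipeline")


def extract() -> list[dict]:
    resp = requests.get(CRM_URL, params={"updated_since": "2025-01-01"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]


def validate(records: list[dict]) -> list[dict]:
    """Drop rows missing an id or email and alert if the loss is unusually high."""
    clean = [r for r in records if r.get("id") and r.get("email")]
    dropped = len(records) - len(clean)
    if records and dropped / len(records) > 0.05:
        log.warning("Dropped %s of %s rows; check the upstream export",
                    dropped, len(records))
    return clean


def load(records: list[dict], db: str = "staging.db") -> None:
    with sqlite3.connect(db) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS stg_donor "
                     "(donor_id TEXT PRIMARY KEY, email TEXT, total_gifts REAL)")
        conn.executemany(
            "INSERT OR REPLACE INTO stg_donor VALUES (?, ?, ?)",
            [(r["id"], r["email"], r.get("total_gifts", 0.0)) for r in records],
        )


if __name__ == "__main__":
    load(validate(extract()))
```

In practice each stage would be scheduled and monitored by an orchestrator, and the alerting threshold would be tuned to the source system's normal behavior.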
    $101.8k-120k yearly Auto-Apply 60d+ ago
  • Data Platform Engineer

    Chicago Public Media 4.3 company rating

    Chicago, IL jobs

    Job Description Home to WBEZ and the Chicago Sun-Times, Chicago Public Media is the largest local non-profit news organization in the country. WBEZ and the Chicago Sun-Times serve more than 2 million people weekly across broadcast, print, and digital platforms. As a mission-driven organization, we aspire to become the essential and most trusted news source that Chicago turns to each day for understanding the people, events, and ideas that shape our community. WBEZ is home to local and national news programming as well as a growing portfolio of popular podcasts. WBEZ serves the community with fact-based, objective news and information, and its award-winning journalists ask tough questions, dig deep for answers and expose truths that spark change and foster understanding. WBEZ is supported by 77,000 members, hundreds of corporate sponsors and major donors. In 2024, Chicago Public Media won 33 national and local awards including from the National Headliner Awards, National Association for Black Journalists, Public Media Journalists Association, Radio Television Digital News Association, Chicago Headline Club, Chicago Journalists Association, Media for a Just Society, and the Richard H. Driehaus Foundation. Chicago Sun-Times is Chicago's oldest continuously published daily newspaper serving Chicago and is known for its hard-hitting investigative reporting, in-depth political coverage, timely behind-the-scenes sports analysis, and insightful entertainment and cultural coverage. Chicago Sun-Times is the winner of eight Pulitzer Prizes and countless other awards. In recent years, the Sun-Times has focused on a digital transformation to deliver its news and content to a growing digital audience. Most recently, the Sun-Times dropped the paywall on suntimes.com to expand access to its journalism, and shifted to a community-funded digital membership program supported by voluntary member donations. Chicago Public Media believes independent journalism is essential to a well-functioning democracy and access to fact-based, objective news and information is a right of every citizen. We serve the public interest by creating diverse, compelling content that informs, inspires, and enriches. We connect diverse audiences and help them make a difference in the community, the region and the world. And, we employ 300+ staff who want to belong to an organization that inspires, supports, and challenges them to do their best work. For more information, please see the Chicago Public Media Annual Report. The Opportunity We're looking for a versatile Data Platform Engineer to join our Enterprise Systems team. This role is ideal for someone who thrives on solving complex data problems, enjoys working across systems, and can bridge the gap between infrastructure and analytics. You'll help design, build, and maintain the pipelines and tools that ensure our organizational data flows reliably and is ready for use in critical platforms like our CRM, ESP, donation systems, and analytics environments. You'll work alongside the Senior Engineering Manager who will help define the architecture and systems, while you execute, troubleshoot, and improve those systems with increasing ownership over time. General Responsibilities Build and maintain ETL pipelines using Python and SQL to move data between internal systems and external platforms (e.g., CRMs, ESPs). Ensure data reliability and integrity across systems and develop automated validation and alerting. 
Troubleshoot issues across platforms and contribute to root cause analysis and long-term solutions. Collaborate with team members and stakeholders to understand business requirements and translate them into scalable technical solutions. Participate in monitoring, performance tuning, and refactoring of existing data flows. Help maintain and extend internal tools that support data operations and stakeholder reporting needs. Document systems, processes, and logic to ensure knowledge sharing and continuity. Qualifications 3-5 years of experience in a data engineering, systems integration, or similar role. Proficient in Python (or equivalent) and SQL, with experience building maintainable ETL pipelines. Experience working with relational databases (e.g., SQL Server, Postgres). Comfortable navigating and integrating with REST APIs and working with cloud-based services. Experience working with CRM or ESP platforms (e.g., Salesforce, RevCRM, Marketing Cloud, EveryAction, etc.) is a plus. Strong problem-solving skills with the ability to debug across systems. Excellent communication and documentation habits. Comfortable navigating and troubleshooting legacy pipelines and code, with a focus on understanding historical context before making changes. Compensation The expected pay range for this position is $101,810.00 - $120,000.00 per annum. Chicago Public Media provides pay ranges representing its good faith estimate of what the organization reasonably expects to pay for a position. The pay offered to a selected candidate will be determined based on factors such as (but not limited to) the scope and responsibilities of the position, the qualifications of the selected candidate, departmental budget availability, internal equity, geographical location, and external market pay for comparable jobs. Working at Chicago Public Media At Chicago Public Media, we care deeply about our employees as we know attracting, developing, and growing talent is key to our success and enhancing our impact. Our culture is one where collaboration, diversity of ideas, and innovation are encouraged. We value colleagues who will enhance our culture by bringing new ideas, divergent experiences, and talents to our dynamic workplace. At Chicago Public Media we believe dedication to a great workplace includes supporting our employees and their families. As a result, we provide a broad and generous benefits package for employees at hire and in the years to come. Our benefits include a competitive salary and benefits package which includes medical, dental, vision, vacation, holidays, life insurance, disability coverage, retirement savings, and a commuter benefits plan. Chicago Public Media is an Equal Opportunity Employer and we actively seek and welcome people from all backgrounds, orientations, and life experiences to join our team. The essential functions described above are not all-inclusive and are not intended to create any contractual or other legal commitment. Chicago Public Media may change the content or format of this job at any time in its sole and exclusive discretion without notice.
    $101.8k-120k yearly 20d ago
  • Data Scientist I

    Battelle Memorial Institute 4.7 company rating

    Georgia jobs

    Battelle delivers when others can't. We conduct research and development, manage national laboratories, design and manufacture products and deliver critical services for our clients-whether they are a multi-national corporation, a small start-up or a government agency. We recognize and appreciate the value and contributions of individuals from a wide range of backgrounds and experiences and welcome all qualified individuals to apply. **Job Summary** The Health Research and Analytics (HRA) business line is seeking a highly motivated, full-time **Data Scientist** to join our team in support of our government customer, U.S. Special Operations Command. This position will play a critical role in advancing the Preservation of the Forces and Family (POTFF) program, which is dedicated to optimizing and sustaining the mission readiness, longevity, and performance of Special Operations Forces (SOF). Through integrated and holistic human performance initiatives, POTFF strengthens both the Forces and their families, ensuring comprehensive support for those who serve. As a Data Scientist, you will contribute to impactful research and analytics that drive evidence-based decision-making and enhance the effectiveness of these vital programs. This is an exciting opportunity to make a meaningful difference in the lives of SOF personnel and their families while working in a dynamic, mission-driven environment. This position is responsible for performing entry level analysis of program-related data, including building and maintaining databases and spreadsheets. The Data Scientist will collaborate with program staff and other stakeholders to identify effective data collection methods and provide guidance on data management. Additionally, the role involves preparing reports and presentations to communicate key findings and utilizing relevant systems to support data analysis and decision-making. **Responsibilities** + Enter, clean, and perform basic manipulation and analysis of POTFF programmatic data (a brief generic sketch follows this listing) + Build, maintain, and disseminate databases and spreadsheets to record POTFF-related information + Collaborate with POTFF program staff and the Government's biostatistician to ensure accurate data management and analysis + Provide consultation and assistance to supported units and POTFF staff to identify opportunities and methods for effective data capture + Prepare reports and presentations that clearly communicate data trends and analysis results + Access and utilize Government systems to enter, manage, and analyze program data + Complete assigned project work within schedule and budget constraints **Key Qualifications** + Bachelor's degree in quantitative science, social science, or a related discipline + Proficiency with Microsoft Office programs, including Word, Excel, and Access + Basic proficiency with commonly used statistical software applications (e.g., SPSS, SAS, R) + Excellent written and verbal communication skills + Strong attention to detail and organizational skills + Proficiency may be demonstrated through relevant work history, publications, or a combination of both + Ability to communicate effectively, both orally and in writing + Ability to obtain and maintain a U.S. government security clearance **Benefits: Live an Extraordinary Life** We care about your well-being, not just on the job. Battelle offers comprehensive and competitive benefits to help you live your best life.
+ **Balance life through a compressed work schedule**: Most of our team follows a flexible, compressed work schedule that allows for every other Friday off, giving you a dedicated day to accomplish things in your personal life without using vacation time. + **Enjoy enhanced work flexibility, including a hybrid arrangement:** You have options for where and when you work. Our Together with Flexibility model allows you to work 60% in-office and 40% remote, with Monday and Tuesday as common in-office days, dependent on team and position needs. + **Take time to recharge**: You get paid time off to support work-life balance and keep motivated. + **Prioritize wellness**: Stay healthy with medical, dental, and vision coverage with wellness incentives and benefits plus a variety of optional supplemental benefits. + **Better together**: Coverage for partners, gender-affirming care and health support, and family formation support. + **Build your financial future**: Build financial stability with an industry-leading 401(k) retirement savings plan. For most employees, we put in 5 percent whether you contribute or not, and match your contributions on top of that. + **Advance your education**: Tuition assistance is available to pursue higher education. **A Work Environment Where You Succeed** For brilliant minds in science, technology, engineering and business operations, Battelle is the place to do the greatest good by solving humanity's most pressing challenges and creating a safer, healthier and more secure world. You will have the opportunity to thrive in a culture that inspires you to: + Apply your talent to challenging and meaningful projects + Receive select funding to pursue ideas in scientific and technological discovery + Partner with world-class experts in a collaborative environment + Nurture and develop the next generation of scientific leaders + Give back to and improve our communities **Vaccinations & Safety Protocols** _Battelle may require employees, based on job duties, work location, and/or its clients' requirements to follow certain safety protocols and to be vaccinated against a variety of viruses, bacteria, and diseases as a condition of employment and continued employment and to provide documentation that they are fully vaccinated. If applicable, Battelle will provide reasonable accommodations based on a qualified disability or medical condition through the Americans with Disabilities Act or the Rehabilitation Act or for a sincerely held religious belief under Title VII of the Civil Rights Act of 1964 (and related state laws)._ _Battelle is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Battelle._ The above statements are intended to describe the nature and level of work being performed by people assigned to this job.
They are not intended to be an exhaustive list of all responsibilities, activities and skills required of staff members. **No statement herein is intended to imply any authorities to commit Battelle unless special written permission is granted by Battelle's Legal Department.** For more information about our other openings, please visit ************************
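The core of the role above is entering, cleaning, and summarizing programmatic data. As a generic, hedged illustration only, here is what a basic cleaning-and-summary step might look like in Python with pandas; the file name and column names are invented placeholders, not actual POTFF data.

```python
"""Minimal sketch of basic data cleaning and summary with pandas.
The file name and columns are hypothetical placeholders."""
import pandas as pd


def summarize(path: str = "program_visits.csv") -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["visit_date"])
    # Basic cleaning: drop exact duplicates and rows missing the unit identifier.
    df = df.drop_duplicates().dropna(subset=["unit"])
    # Standardize a free-text category column before grouping.
    df["service_type"] = df["service_type"].str.strip().str.title()
    # Simple programmatic summary: visit counts per unit and service type by month.
    return (
        df.assign(month=df["visit_date"].dt.to_period("M"))
          .groupby(["unit", "service_type", "month"])
          .size()
          .reset_index(name="visit_count")
    )


if __name__ == "__main__":
    print(summarize().head())
```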
    $80k-105k yearly est. 47d ago
  • Senior Data Engineer

    City Year 4.2 company rating

    Remote

    Application Instructions Click Apply to submit your online application. Please attach a resume and thoughtful cover letter on the "My Experience" page in the "Resume/CV" field. Active City Year Staff members must login to Workday to apply internally. Number of Positions: 1. Work Location: 100% Remote. Position Overview The Senior Data Engineer works closely with local City Year data experts, school district IT professionals and a multitude of partners to manage ingestion of data from many sources. Our ideal candidate is professionally experienced in all things Azure and familiar with DevOps. They will use their Azure experience, especially with Azure Data Factories and Databricks, to lead the end-to-end development and implementation of modern ETL/ELT data pipelines for our team. This candidate will be excited to promote the effective use of timely and accurate K-12 education data and empower front-line practitioners with the information needed to have greater impact on the students we serve. The Senior Data Engineer reports to the Director of Data Management and Reporting. Job Description As a Senior Data Engineer at City Year, you will: Be our resident Azure expert and trusted advisor for Azure Own the design, development, and implementation of modern data pipelines, data factories and data streams. This is a hands-on role. Lead the planning and then implement the data platform services including sizing, configuration, and needs assessment Own the management and development of various third-party data integrations Lead development of frameworks and data architectures for large-scale data processing that ensure timely and accurate processing of data into and among City Year's systems, and help implement them. Influence and make recommendations on the technical direction for the team by leveraging your prior experiences and your knowledge of emerging technologies and approaches Lead our team in identifying and promoting data management best practices Conduct full technical discovery, identify pain points, gather business and technical requirements, and explain the “as is” and “to be” scenarios to fully understand user stories from stakeholders and users Lead and participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications on the Azure platform Serve as a leader in bridging the gap between technical and non-technical staff in understanding our data and its processing at City Year. Own the implementation and use of ETL and data management best practices, DevOps, CI/CD, and deployment strategies to ensure product quality, agility, robustness, and recoverability Partner with districts and internal customers to understand their requirements and implement data solutions. Create integrations between internal systems and support them operationally. Teach others in the DMAR team about Azure, Databricks, and best practices to ensure all services and technology implemented can be supported by others on the team.
You must have: At least 3 years of professional experience (not simply coursework or capstone projects) in the following: working in an Azure environment on ETL/ELT projects; Azure DevOps; Azure Databricks; Azure Data Factories; SQL; Python; relational databases; working with heterogeneous datasets/formats from many vendors, providers, and platforms; and data architecture. Have experience and demonstrated ability to successfully integrate and manage data across disparate information systems Excellent written and verbal communication skills, especially the ability to express complex technical information in understandable and rigorously accurate terms as well as to express technical items in a manner that non-technical users can understand Attention to detail while working on multiple projects at once Demonstrated success and effectiveness working in and promoting a rapidly changing, collaborative, and time-critical work environment Nice to have: Some experience with Databricks Lakehouse; with Microsoft Fabric; with archiving and backup solutions in Microsoft Azure; in a role at a school district, Charter Management Organization (CMO), or Ed-tech company with an emphasis on supporting end-user data needs; and in an Agile environment Commitment to continuous improvement and City Year's mission: our desire to focus talents on helping improve outcomes for kids in school and our AmeriCorps members who support them You love learning new things. You're curious and ask good questions. You solicit feedback from others, accept it with grace, and act on it. What we offer: Your technical skills will be used to have a significant, positive impact on children's education outcomes and future opportunities A role on a small, high-powered team that is integral to delivering on the mission of City Year Opportunity for control over the design and implementation of solutions Space and opportunity to develop new technical skills Focus on creating a work environment that is diverse, inclusive, equitable, and encourages belonging Opportunity to work with some amazing people! Benefits Full-time employees will be eligible for all benefits including vacation, sick days and organization holidays. You may participate in all benefit programs that City Year establishes and makes available to eligible employees, under (and subject to all provisions of) the plan documents that govern those programs. Currently, City Year offers medical, dental, vision, life, accidental death and dismemberment and disability coverage, Flexible Spending Accounts (FSA), and other benefits including 401(k) plan(s) pursuant to the terms and conditions of company policy and the 401(k) plan document. For more information, click here. Employment at City Year is at-will. City Year does not sponsor work authorization visas.
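Since the must-have list above centers on Azure Databricks, Azure Data Factory, SQL, and Python for ETL/ELT, here is a minimal, hypothetical PySpark sketch of one such pipeline step: landing a district attendance extract in a Delta lakehouse table. The storage path, columns, and table name are assumptions for illustration, not City Year's actual pipeline.

```python
"""Minimal PySpark sketch: ingest a district CSV extract and append it to a
Delta lakehouse table. Paths, columns, and table names are illustrative only."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("attendance_ingest").getOrCreate()

# Hypothetical landing zone path for one district's nightly extract.
raw = (
    spark.read.option("header", True)
    .csv("abfss://landing@example.dfs.core.windows.net/district_42/attendance/")
)

clean = (
    raw.withColumn("attendance_date", F.to_date("attendance_date", "yyyy-MM-dd"))
       .withColumn("present", F.col("present").cast("boolean"))
       .dropDuplicates(["student_id", "attendance_date"])
)

# Write to a managed Delta table; downstream reporting reads from this table.
(
    clean.write.format("delta")
    .mode("append")
    .saveAsTable("bronze.district_attendance")
)
```

In a real deployment this read would typically be triggered and parameterized by an Azure Data Factory or Databricks Workflows job rather than run ad hoc.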
    $74k-95k yearly est. Auto-Apply 60d+ ago
  • M-4/1 - 4939 - Full Stack Engineer/Data Architect - Remote & Phoenix, AZ (LOCAL Candidates Only)

    FHR 3.6 company rating

    Phoenix, AZ jobs

    ** Position is hybrid. Mainly remote but will need to come into the office in Phoenix, AZ periodically for meetings. Local to AZ Candidates only - no relocation allowed. Candidate MUST be able to attend an in-person interview in Phoenix, AZ. ** Our direct client has an opening for a Full Stack Engineer/Data Architect # 4939. This position is for 6-12+ months, with option of extension, and will be worked hybrid - mainly remote but will need to come into the office in Phoenix, AZ periodically for meetings. If you are interested, please submit the following: YOUR CURRENT RESUME YOUR HOURLY RATE Below is the job description - Resumes due ASAP - Description: The client is developing a centralized portal to serve Arizonans as a user-friendly entry point for available health and human services. This single point of entry will uniquely identify each individual, making it easier for Arizona residents to access prioritized services provided by state health and human service agencies. In support of this broad initiative, DHS is seeking assistance with the assessment, prioritization, and future state technical design of their most critical citizen-centric services. Description: The client is seeking a Full Stack Engineer/Data Architect with knowledge in the design, governance and implementation of the organization's data and integrating master data across all business units and systems. This role will require experience in MDM solutions, data modeling, integration and governance practices. The scope of work entails an understanding of the current MDM architecture and an assessment of planned MDM functionality, integration needs and desired architecture. Technical Experience: 5+ years of experience in data architecture, MDM and data governance (e.g., Informatica, TAMR, SQL) 5+ years of experience in application & integration services (e.g., Salesforce, .NET Framework, ASP.NET, WebAPI, REST APIs, SOAP) 5+ years of experience in programming and scripting languages (e.g., Python) 5+ years of experience with data management frameworks, data integration and ETL processes (e.g., Matillion, AWS Glue) 3+ years of experience in data warehouses, data lakes and analytics platforms (e.g., Snowflake, Databricks) Experience in cloud platforms (e.g., Azure, AWS, Google Cloud) By replying to this job advertisement, I agree I want to receive additional job advertisements from FHR, including email, phone and mail to the contact information I am submitting. I consent to Focused HR Solutions, its affiliates, third parties and partners processing my personal data for these purposes and as described in the Privacy Policy. I understand that I can withdraw my consent at any time. --
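Because the engagement is about integrating master data across systems, the toy sketch below shows the basic shape of deterministic record matching: normalize a few identifying fields into a match key and group source records that share it. The source systems, fields, and match rule are invented for illustration; production MDM platforms such as Informatica or TAMR add survivorship rules, fuzzy matching, and stewardship workflows on top of this idea.

```python
"""Toy master-data match: link person records from two source systems on a
normalized name + birth date key. Fields and rule are illustrative only."""
from dataclasses import dataclass


@dataclass(frozen=True)
class SourceRecord:
    source: str
    source_id: str
    full_name: str
    birth_date: str  # ISO yyyy-mm-dd


def match_key(rec: SourceRecord) -> tuple[str, str]:
    """Deterministic match rule: case/whitespace-normalized name plus birth date."""
    return (" ".join(rec.full_name.lower().split()), rec.birth_date)


def build_golden_index(
    records: list[SourceRecord],
) -> dict[tuple[str, str], list[SourceRecord]]:
    """Group source records that share a match key into one candidate master record."""
    index: dict[tuple[str, str], list[SourceRecord]] = {}
    for rec in records:
        index.setdefault(match_key(rec), []).append(rec)
    return index


if __name__ == "__main__":
    records = [
        SourceRecord("portal_intake", "A-100", "Maria  Lopez", "1990-04-02"),
        SourceRecord("benefits_crm", "77-21", "maria lopez", "1990-04-02"),
    ]
    for key, members in build_golden_index(records).items():
        print(key, "->", [(m.source, m.source_id) for m in members])
```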
    $89k-127k yearly est. 25d ago
  • Sr Data Warehouse Lakehouse Developer

    Lumen 3.4 company rating

    Tallahassee, FL jobs

    Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress. We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future. **The Role** We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing. **Location** This position is work-from-home, available from any US-based location. You must be a US Citizen or Permanent Resident (Green Card holder) for consideration. **The Main Responsibilities** **Design & Development** + Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments. + Create and optimize complex SQL queries, stored procedures, and data transformations. + Build and enhance source-to-target mapping documents. + Assist with UAT build and data loading for User Acceptance Testing. + Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks. **Technical Leadership** + Provide technical leadership and mentorship to team members. + Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models. + Participate in and consult on integrated application and regression testing. + Conduct training sessions for system operators, programmers, and end users. **Analytical Expertise** + Analyze programming requests to ensure seamless integration with current applications. + Perform data analysis and mapping to ensure accuracy and consistency. + Generate test plans and test cases for quality assurance. + Research and evaluate problems, recommend solutions, and implement decisions. **Continuous Improvement** + Monitor and optimize data pipelines for performance and reliability. + Stay current with emerging technologies and recommend improvements to architecture and processes. + Adapt to changing priorities and aggressive project timelines while managing multiple complex projects. **What We Look For in a Candidate** **Technical Skills** + Proficiency in SQL and at least one programming language (Python, Java, Scala). + Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark). + Familiarity with databases (Databricks, Oracle, SQL Server). + Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver). + Strong understanding of data warehousing concepts and Lakehouse architecture. **Analytical & Problem-Solving** + Ability to translate business requirements into technical solutions. + Strong troubleshooting and performance tuning skills. + Demonstrated organizational, oral, and written communication skills. **Experience** + 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree. + Proven ability to lead technical teams and manage projects. + Experience in applications development and systems analysis.
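The design and development duties above revolve around source-to-target mappings and SQL transformations. As a hedged illustration only, the sketch below treats a tiny mapping document as data and generates the staging-to-warehouse SQL from it; the tables, columns, and transforms are invented, and sqlite3 stands in for the actual warehouse platform.

```python
"""Minimal sketch: drive a staging-to-warehouse load from a source-to-target
mapping. Tables, columns, and transforms are illustrative placeholders."""
import sqlite3

# One entry per target column: (target_column, source_expression)
SOURCE_TO_TARGET = [
    ("customer_id",   "CAST(cust_no AS INTEGER)"),
    ("customer_name", "TRIM(cust_nm)"),
    ("signup_date",   "DATE(signup_ts)"),
]


def build_insert(target: str, source: str) -> str:
    """Generate the INSERT ... SELECT statement implied by the mapping."""
    cols = ", ".join(col for col, _ in SOURCE_TO_TARGET)
    exprs = ", ".join(expr for _, expr in SOURCE_TO_TARGET)
    return f"INSERT INTO {target} ({cols}) SELECT {exprs} FROM {source}"


with sqlite3.connect(":memory:") as conn:
    conn.executescript("""
        CREATE TABLE stg_customer (cust_no TEXT, cust_nm TEXT, signup_ts TEXT);
        CREATE TABLE dw_customer (customer_id INTEGER, customer_name TEXT,
                                  signup_date TEXT);
        INSERT INTO stg_customer VALUES ('042', '  Acme Media ',
                                         '2025-03-01 10:15:00');
    """)
    conn.execute(build_insert("dw_customer", "stg_customer"))
    print(conn.execute("SELECT * FROM dw_customer").fetchall())
```

Keeping the mapping in one structure like this also makes the UAT and reconciliation steps above easier to audit, since the generated SQL always matches the documented mapping.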
**Preferred Qualifications** + Project management experience. + Familiarity with CI/CD pipelines and version control (Git). + Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP). **Compensation** This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors. Location Based Pay Ranges $82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY $87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI $91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process. Learn more about Lumen's: Benefits (**************************************************** Bonus Structure \#LI-Remote \#LI-PS Requisition #: 340407 **Background Screening** If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page (************************************* . Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis. Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. **Equal Employment Opportunities** We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training. **Disclaimer** The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions. In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information. Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. 
If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
    $91.3k-121.7k yearly 16d ago
  • Sr Data Warehouse Lakehouse Developer

    Lumen 3.4 company rating

    Des Moines, IA jobs

    Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress. We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future. **The Role** We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing. **Location** This position is work-from-home, available from any US-based location. You must be a US Citizen or Permanent Resident (Green Card holder) for consideration. **The Main Responsibilities** **Design & Development** + Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments. + Create and optimize complex SQL queries, stored procedures, and data transformations. + Build and enhance source-to-target mapping documents. + Assist with UAT build and data loading for User Acceptance Testing. + Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks. **Technical Leadership** + Provide technical leadership and mentorship to team members. + Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models. + Participate in and consult on integrated application and regression testing. + Conduct training sessions for system operators, programmers, and end users. **Analytical Expertise** + Analyze programming requests to ensure seamless integration with current applications. + Perform data analysis and mapping to ensure accuracy and consistency. + Generate test plans and test cases for quality assurance. + Research and evaluate problems, recommend solutions, and implement decisions. **Continuous Improvement** + Monitor and optimize data pipelines for performance and reliability. + Stay current with emerging technologies and recommend improvements to architecture and processes. + Adapt to changing priorities and aggressive project timelines while managing multiple complex projects. **What We Look For in a Candidate** **Technical Skills** + Proficiency in SQL and at least one programming language (Python, Java, Scala). + Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark). + Familiarity with databases (Databricks, Oracle, SQL Server). + Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver). + Strong understanding of data warehousing concepts and Lakehouse architecture. **Analytical & Problem-Solving** + Ability to translate business requirements into technical solutions. + Strong troubleshooting and performance tuning skills. + Demonstrated organizational, oral, and written communication skills. **Experience** + 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree. + Proven ability to lead technical teams and manage projects. + Experience in applications development and systems analysis.
**Preferred Qualifications** + Project management experience. + Familiarity with CI/CD pipelines and version control (Git). + Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP). **Compensation** This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors. Location Based Pay Ranges $82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY $87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI $91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process. Learn more about Lumen's: Benefits (**************************************************** Bonus Structure \#LI-Remote \#LI-PS Requisition #: 340407 **Background Screening** If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page (************************************* . Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis. Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. **Equal Employment Opportunities** We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training. **Disclaimer** The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions. In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information. Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. 
If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
    $91.3k-121.7k yearly 16d ago
  • Sr Data Warehouse Lakehouse Developer

    Lumen 3.4 company rating

    Springfield, IL jobs

    Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress. We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future. **The Role** We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing. **Location** This position is work-from-home, available from any US-based location. You must be a US Citizen or Permanent Resident (Green Card holder) for consideration. **The Main Responsibilities** **Design & Development** + Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments. + Create and optimize complex SQL queries, stored procedures, and data transformations. + Build and enhance source-to-target mapping documents. + Assist with UAT build and data loading for User Acceptance Testing. + Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks. **Technical Leadership** + Provide technical leadership and mentorship to team members. + Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models. + Participate in and consult on integrated application and regression testing. + Conduct training sessions for system operators, programmers, and end users. **Analytical Expertise** + Analyze programming requests to ensure seamless integration with current applications. + Perform data analysis and mapping to ensure accuracy and consistency. + Generate test plans and test cases for quality assurance. + Research and evaluate problems, recommend solutions, and implement decisions. **Continuous Improvement** + Monitor and optimize data pipelines for performance and reliability. + Stay current with emerging technologies and recommend improvements to architecture and processes. + Adapt to changing priorities and aggressive project timelines while managing multiple complex projects. **What We Look For in a Candidate** **Technical Skills** + Proficiency in SQL and at least one programming language (Python, Java, Scala). + Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark). + Familiarity with databases (Databricks, Oracle, SQL Server). + Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver). + Strong understanding of data warehousing concepts and Lakehouse architecture. **Analytical & Problem-Solving** + Ability to translate business requirements into technical solutions. + Strong troubleshooting and performance tuning skills. + Demonstrated organizational, oral, and written communication skills. **Experience** + 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree. + Proven ability to lead technical teams and manage projects. + Experience in applications development and systems analysis.
**Preferred Qualifications** + Project management experience. + Familiarity with CI/CD pipelines and version control (Git). + Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP). **Compensation** This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors. Location Based Pay Ranges $82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY $87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI $91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process. Learn more about Lumen's: Benefits (**************************************************** Bonus Structure \#LI-Remote \#LI-PS Requisition #: 340407 **Background Screening** If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page (************************************* . Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis. Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. **Equal Employment Opportunities** We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training. **Disclaimer** The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions. In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information. Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. 
If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
    $91.3k-121.7k yearly 16d ago
  • Data Architect

    Feed The Children 4.1 company rating

    Atlanta, GA jobs

    At Feed the Children, we recognize the value of outstanding people, and we are looking for compassionate changemakers to join our team. We pride ourselves on cultivating a collaborative workplace where employees experience productive and rewarding employment and feel engaged in our mission to end childhood hunger. Our passionate team shares a deep sense of purpose, and we dream big to solve complex problems and create positive impact in communities around the world. Feed the Children is recognized by Candid with its Platinum Seal of Transparency and is accredited by the BBB Wise Giving Alliance. The organization has received a 4-star rating from Charity Navigator and is consistently recognized on the Forbes Top 100 Charities list. We are currently in search of a Data Architect to join our Information Technology team! The Data Architect is responsible for designing and managing Feed the Children's modern cloud data infrastructure, with ownership of Microsoft Fabric and Azure, and enablement of Purview, Power Platform, Copilot, and Dynamics 365 integration. This role will lead the stand-up and evolution of Feed the Children's Fabric platform and serve as its primary Product Owner, ensuring it is scalable, governed, secure, and AI-ready. The Data Architect will also collaborate closely with ERP and CRM teams, and coordinate with Data Governance leadership to align data architecture with Microsoft Purview and governance policies. The ideal candidate combines deep technical expertise with strategic thinking, strong collaboration, and leadership skills. This position will report directly to the Vice President of Business Intelligence. Salary range: $115K-$120K (commensurate with experience) Note: Although our corporate office is located in Oklahoma City, OK, qualified candidates are being considered nationwide for this remote opportunity. Job Requirements: Education Bachelor's or Master's degree in Information Technology, Computer Science, Data Science, Information Systems, or a related technical field, preferred. Experience 7+ years of related professional experience; 3+ years of experience in data architecture or enterprise data engineering; and 2+ years of experience working with Azure data services (e.g., Data Factory, Synapse, Data Lake). Deep understanding of: Microsoft Fabric, Dynamics 365 (especially SCM and CS), Power Platform, Power BI, Dataverse, Purview, SQL, Python, Spark, DAX, ETL/ELT processes, data modeling, Copilot, and Git and CI/CD practices. Familiarity with data retention, auditability, and regulatory compliance (e.g., GDPR, HIPAA, CCPA). Experience with Agile/Scrum methodologies, and Product Ownership. Any combination of education, training and experience which provides the required knowledge, skills and abilities to perform the essential functions of this job may be considered. Bonus Qualifications: * Experience working in a mission-driven enterprise, especially in global health and development, with complex supply chain, community impact, donation, and volunteering programs. Licenses and Certifications Credentials in relevant Microsoft technologies are a plus. Essential Functions: Data Strategy & Architecture: * Define and maintain the enterprise data architecture roadmap, aligning with business goals and future scalability. * Design and implement robust architecture in Microsoft Fabric, including lakehouses, warehouses, Notebooks, pipelines, and semantic models.
* Lead the development of data models, ETL/ELT processes and data lake/warehouse structures to support analytics and AI. * Manage all Azure resources, and oversee digital transformation necessary between Azure and Fabric, and from server to cloud architecture. * Lead the establishment and documentation of technology direction and standards for data platforms, involving all aspects of information access and retrieval, integration, middleware translators, utilities, tools, and languages for use by information technology groups. * Establish and enforce data architecture standards, including modeling conventions, naming standards, and documentation practices. * Continuously assess and optimize data architecture to improve quality, performance, scalability, cost management, and efficiency. Data Integration & Interoperability: * Ensure seamless integration of data and data flows from various sources, including ERP systems, CRM systems, DHIS2, external datasets and APIs, and other business applications, with the support of data management and governance leads. Data Governance & Management: * Following the direction of enterprise data management and governance councils, implement processes and tools to ensure high data quality and consistency across the organization. * Align with governance policies established in Microsoft Purview to govern, protect, and manage Feed the Children's data estate, ensuring compliance and risk management. * Following the direction of enterprise data management councils, implement Master Data Management policies across data architecture to create a common view of master data and provide a centralized mechanism for its aggregation, cleansing, transformation, augmentation, validation, syndication, and access. Managed File Transfer: * Support Managed File Transfer processes, in accordance with security and data governance principles. AI & Copilot Enablement: * In collaboration with AI developers and leaders, enable data architecture to support AI agent development and Copilot experiences. * Support the integration of AI agents, prompt engineering, and Azure OpenAI services into data workflows. Collaboration: * Work across all departments and business leaders to understand data needs and provide tailored, scalable solutions. * Collaborate with developers, analysts, and contractors to ensure alignment with architectural standards and business goals. * Participate in business intelligence and analytics initiatives, ensuring data solutions meet stakeholder needs. Security: * Implement best practices for data protection and privacy, ensuring compliance with regulatory requirements (e.g., GDPR, HIPAA). * Collaborate with security and compliance teams to align data practices with enterprise risk policies. * Implement role-based access control (RBAC) and encryption. Project Management: * Lead and manage architecture-related projects, including timelines, budgets, and resource allocation. * Provide architectural oversight throughout project cycles to ensure development of efficient data systems utilizing established standards, procedures and methodologies. * Manage projects implemented in collaboration with/by vendors and partners, including managing contractual and project management agreements and compliance. Reporting & Communication: * Generate and present reports on data usage, performance, and compliance. * Communicate architectural decisions, roadmaps, and standards to technical and non-technical audiences. 
    Establish an environment of high performance and continuous improvement that values learning and a commitment to quality, welcomes and encourages collaboration, and fosters both intra- and inter-departmental dialogue and respect. Model the type and level of behavior, professionalism, and leadership that is in accordance with the values of the organization. Perform other related duties as required.

    About Feed the Children:
    As a leading anti-hunger organization, Feed the Children is committed to ending childhood hunger. We provide children and families in the U.S. and around the world with the food and essentials kids need to grow and thrive. Through our programs and partnerships, we feed children today while helping their families and communities build resilient futures. In addition to food, we distribute household and personal care items across the United States to help parents and caregivers maintain stable, food-secure households. Internationally, we expand access to nutritious meals, safe water, improved hygiene, and training in sustainable living. Responsible stewards of our resources, we are driven to pursue innovative, holistic, and child-focused solutions to the complex challenges of hunger, food insecurity, and poverty. For children everywhere, we believe that having enough to eat is a fundamental right.

    Our Values:
    * We are driven by a shared sense of PURPOSE. At Feed the Children, our commitment to the mission is at the heart of what we do and fuels our collective impact in the communities where we serve.
    * We cannot achieve our bold vision without our talented PEOPLE. We are passionate about fostering a best-in-class workforce that is engaged, respected, and empowered to deliver results.
    * We believe in CURIOSITY and continued learning. Success requires a culture of discovery, curiosity, and continued learning to expand our knowledge, seek new perspectives, and challenge the status quo.
    * We know COLLABORATION is the only way to end childhood hunger. We cannot succeed alone. It will take all of us - our employees, donors, partners, volunteers - working together to accomplish our ambitious goals.
    * We DREAM big. When we work together, we collectively reimagine what is possible. We dream big to solve complex problems and create deep impact in communities around the world.
    * We VALUE every donor. We respect our donors' intentions and promote responsible stewardship of the resources they entrust to us.

    Join Feed the Children and help create a world where no child goes to bed hungry.

    In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Feed the Children is an equal opportunity employer. All qualified candidates will receive consideration for positions without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, political affiliation, pregnancy, military and/or veterans' status, genetic characteristics, marital status, or any other considerations made unlawful by applicable state, federal, or local law. Feed the Children welcomes and encourages applications from persons with physical and mental disabilities and will make every effort to reasonably accommodate the needs of those persons. Additionally, Feed the Children strives to provide an environment free from sexual exploitation, abuse, and harassment in all places where relief and development programs are implemented.
    Feed the Children expects its employees to maintain high ethical standards, protect organizational integrity and reputation, and ensure that Feed the Children's work is carried out in an honest and fair manner, in alignment with Feed the Children's safeguarding and associated policies.
    $115k-120k yearly 4d ago
  • Data Architect

    Feed The Children 4.1company rating

    Chicago, IL jobs

    At Feed the Children, we recognize the value of outstanding people, and we are looking for compassionate changemakers to join our team. We pride ourselves on cultivating a collaborative workplace where employees experience productive and rewarding employment and feel engaged in our mission to end childhood hunger. Our passionate team shares a deep sense of purpose, and we dream big to solve complex problems and create positive impact in communities around the world. Feed the Children is recognized by Candid with its Platinum Seal of Transparency and is accredited by the BBB Wise Giving Alliance. The organization has received a 4-star rating from Charity Navigator and is consistently recognized on the Forbes Top 100 Charities list.

    We are currently in search of a Data Architect to join our Information Technology team! The Data Architect is responsible for designing and managing Feed the Children's modern cloud data infrastructure, with ownership of Microsoft Fabric and Azure, and enablement of Purview, Power Platform, Copilot, and Dynamics 365 integration. This role will lead the stand-up and evolution of Feed the Children's Fabric platform and serve as its primary Product Owner, ensuring it is scalable, governed, secure, and AI-ready. The Data Architect will also collaborate closely with ERP and CRM teams, and coordinate with Data Governance leadership to align data architecture with Microsoft Purview and governance policies. The ideal candidate combines deep technical expertise with strategic thinking, strong collaboration, and leadership skills. This position will report directly to the Vice President of Business Intelligence.

    Salary range: $115K-$120K (commensurate with experience)

    Note: Although our corporate office is located in Oklahoma City, OK, qualified candidates are being considered nationwide for this remote opportunity.

    Job Requirements:

    Education
    * Bachelor's or Master's degree in Information Technology, Computer Science, Data Science, Information Systems, or a related technical field, preferred.

    Experience
    * 7+ years of related professional experience; 3+ years of experience in data architecture or enterprise data engineering; and 2+ years of experience working with Azure data services (e.g., Data Factory, Synapse, Data Lake).
    * Deep understanding of: Microsoft Fabric, Dynamics 365 (especially SCM and CS), Power Platform, Power BI, Dataverse, Purview, SQL, Python, Spark, DAX, ETL/ELT processes, data modeling, Copilot, and Git and CI/CD practices.
    * Familiarity with data retention, auditability, and regulatory compliance (e.g., GDPR, HIPAA, CCPA).
    * Experience with Agile/Scrum methodologies and Product Ownership.
    * Any combination of education, training, and experience which provides the required knowledge, skills, and abilities to perform the essential functions of this job may be considered.

    Bonus Qualifications:
    * Experience working in a mission-driven enterprise, especially in global health and development, with complex supply chain, community impact, donation, and volunteering programs.

    Licenses and Certifications
    * Credentials in relevant Microsoft technologies are a plus.

    Essential Functions:

    Data Strategy & Architecture:
    * Define and maintain the enterprise data architecture roadmap, aligning with business goals and future scalability.
    * Design and implement robust architecture in Microsoft Fabric, including lakehouses, warehouses, Notebooks, pipelines, and semantic models.
    * Lead the development of data models, ETL/ELT processes, and data lake/warehouse structures to support analytics and AI.
    * Manage all Azure resources, and oversee the digital transformation necessary between Azure and Fabric and from server to cloud architecture.
    * Lead the establishment and documentation of technology direction and standards for data platforms, involving all aspects of information access and retrieval, integration, middleware translators, utilities, tools, and languages for use by information technology groups.
    * Establish and enforce data architecture standards, including modeling conventions, naming standards, and documentation practices.
    * Continuously assess and optimize data architecture to improve quality, performance, scalability, cost management, and efficiency.

    Data Integration & Interoperability:
    * Ensure seamless integration of data and data flows from various sources, including ERP systems, CRM systems, DHIS2, external datasets and APIs, and other business applications, with the support of data management and governance leads.

    Data Governance & Management:
    * Following the direction of enterprise data management and governance councils, implement processes and tools to ensure high data quality and consistency across the organization.
    * Align with governance policies established in Microsoft Purview to govern, protect, and manage Feed the Children's data estate, ensuring compliance and risk management.
    * Following the direction of enterprise data management councils, implement Master Data Management policies across data architecture to create a common view of master data and provide a centralized mechanism for its aggregation, cleansing, transformation, augmentation, validation, syndication, and access.

    Managed File Transfer:
    * Support Managed File Transfer processes, in accordance with security and data governance principles.

    AI & Copilot Enablement:
    * In collaboration with AI developers and leaders, enable data architecture to support AI agent development and Copilot experiences.
    * Support the integration of AI agents, prompt engineering, and Azure OpenAI services into data workflows.

    Collaboration:
    * Work across all departments and business leaders to understand data needs and provide tailored, scalable solutions.
    * Collaborate with developers, analysts, and contractors to ensure alignment with architectural standards and business goals.
    * Participate in business intelligence and analytics initiatives, ensuring data solutions meet stakeholder needs.

    Security:
    * Implement best practices for data protection and privacy, ensuring compliance with regulatory requirements (e.g., GDPR, HIPAA).
    * Collaborate with security and compliance teams to align data practices with enterprise risk policies.
    * Implement role-based access control (RBAC) and encryption.

    Project Management:
    * Lead and manage architecture-related projects, including timelines, budgets, and resource allocation.
    * Provide architectural oversight throughout project cycles to ensure development of efficient data systems utilizing established standards, procedures, and methodologies.
    * Manage projects implemented in collaboration with/by vendors and partners, including managing contractual and project management agreements and compliance.

    Reporting & Communication:
    * Generate and present reports on data usage, performance, and compliance.
    * Communicate architectural decisions, roadmaps, and standards to technical and non-technical audiences.
    Establish an environment of high performance and continuous improvement that values learning and a commitment to quality, welcomes and encourages collaboration, and fosters both intra- and inter-departmental dialogue and respect. Model the type and level of behavior, professionalism, and leadership that is in accordance with the values of the organization. Perform other related duties as required.

    About Feed the Children:
    As a leading anti-hunger organization, Feed the Children is committed to ending childhood hunger. We provide children and families in the U.S. and around the world with the food and essentials kids need to grow and thrive. Through our programs and partnerships, we feed children today while helping their families and communities build resilient futures. In addition to food, we distribute household and personal care items across the United States to help parents and caregivers maintain stable, food-secure households. Internationally, we expand access to nutritious meals, safe water, improved hygiene, and training in sustainable living. Responsible stewards of our resources, we are driven to pursue innovative, holistic, and child-focused solutions to the complex challenges of hunger, food insecurity, and poverty. For children everywhere, we believe that having enough to eat is a fundamental right.

    Our Values:
    * We are driven by a shared sense of PURPOSE. At Feed the Children, our commitment to the mission is at the heart of what we do and fuels our collective impact in the communities where we serve.
    * We cannot achieve our bold vision without our talented PEOPLE. We are passionate about fostering a best-in-class workforce that is engaged, respected, and empowered to deliver results.
    * We believe in CURIOSITY and continued learning. Success requires a culture of discovery, curiosity, and continued learning to expand our knowledge, seek new perspectives, and challenge the status quo.
    * We know COLLABORATION is the only way to end childhood hunger. We cannot succeed alone. It will take all of us - our employees, donors, partners, volunteers - working together to accomplish our ambitious goals.
    * We DREAM big. When we work together, we collectively reimagine what is possible. We dream big to solve complex problems and create deep impact in communities around the world.
    * We VALUE every donor. We respect our donors' intentions and promote responsible stewardship of the resources they entrust to us.

    Join Feed the Children and help create a world where no child goes to bed hungry.

    In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Feed the Children is an equal opportunity employer. All qualified candidates will receive consideration for positions without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, political affiliation, pregnancy, military and/or veterans' status, genetic characteristics, marital status, or any other considerations made unlawful by applicable state, federal, or local law. Feed the Children welcomes and encourages applications from persons with physical and mental disabilities and will make every effort to reasonably accommodate the needs of those persons. Additionally, Feed the Children strives to provide an environment free from sexual exploitation, abuse, and harassment in all places where relief and development programs are implemented.
    Feed the Children expects its employees to maintain high ethical standards, protect organizational integrity and reputation, and ensure that Feed the Children's work is carried out in an honest and fair manner, in alignment with Feed the Children's safeguarding and associated policies.
    $115k-120k yearly 4d ago
  • Business Intelligence Engineer

    System One 4.6company rating

    Melbourne, FL jobs

    System One is looking for a Business Intelligence Engineer. As a Business Intelligence Engineer, you will lead the design and delivery of robust data pipelines and scalable data warehouse solutions that power actionable insights. You will apply expertise in data architecture, modeling, mining, and integrity practices to transform complex data into clear, reliable analytics. As a senior member of the BI team, you will mentor colleagues on emerging technologies, champion best practices, and build systems for data collection, cleansing, and analysis - empowering smarter decisions across the organization.

    Principal Duties and Responsibilities:
    + Professional proficiency in ETL/ELT, SQL Server, SSMS, SSRS, SSIS, and Oracle IDE environments.
    + Professional proficiency in the Microsoft Azure enterprise environment, including Azure Data Factory, Azure Databricks, and Azure Synapse.
    + Professional proficiency in MS Visual Studio.
    + Specify, design, build, and support data warehousing and other BI reporting and data storage solutions.
    + Monitor and tune BI tools to ensure optimal efficiency and performance.
    + Support upgrades, configuration, and troubleshooting for various Business Intelligence tools.
    + Develop and maintain multi-dimensional (OLAP) reporting databases.
    + Take responsibility for program design, coding, testing, debugging, and documentation of all BI systems.

    Minimum Qualifications:
    To perform this job successfully, an individual must be able to perform each essential task or duty satisfactorily. The requirements listed are representative of the minimum level of knowledge, skills, and abilities required to be successful in this role. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of this role.

    Education and Training:
    A four-year degree in Computer Science, Information Technology, Finance, Mathematics, or another related field is preferred; however, an equivalent combination of education and relevant knowledge or experience may be considered.

    Prior Experience:
    5-7 years of experience required in any combination of the following: the MS Azure suite of data reporting and integration tools, database development, reporting or analytics in an enterprise environment, the Microsoft SQL coding language, Oracle databases, or the development and maintenance of analytics reporting in an enterprise environment. Equivalent verified coursework may be considered, but experience is preferred.

    System One, and its subsidiaries including Joulé, ALTA IT Services, and Mountain Ltd., are leaders in delivering outsourced services and workforce solutions across North America. We help clients get work done more efficiently and economically, without compromising quality. System One not only serves as a valued partner for our clients, but we offer eligible employees health and welfare benefits coverage options including medical, dental, vision, spending accounts, life insurance, voluntary plans, as well as participation in a 401(k) plan.

    System One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, age, national origin, disability, family care or medical leave status, genetic information, veteran status, marital status, or any other characteristic protected by applicable federal, state, or local law.
    $61k-85k yearly est. 3d ago

Learn more about Mayo Clinic jobs
