ETL Architect
Wisconsin jobs
Come Find Your Spark at Quartz! The ETL Architect will be responsible for the architecture, design, and implementation of data integration solutions and pipelines for the organization. This position will partner with multiple areas in the Enterprise Data Management team and the business to translate business requirements into efficient and effective ETL implementations. This role will perform functional analysis, determine the appropriate data acquisition and ingestion methods, and design processes to populate the various data platform layers. The ETL Architect will work with implementation stakeholders throughout the business to evaluate the state of data and construct solutions that reliably deliver data to enable analytics and reporting capabilities.
Skills this position will utilize on a regular basis:
* Informatica PowerCenter
* Expert knowledge of SQL development
* Python
Benefits:
* Opportunity to work with leading technology in the ever-changing, fast-paced healthcare industry.
* Opportunity to work across the organization interacting with business stakeholders.
* Starting salary range based upon skills and experience: $107,500 - $134,400 - plus robust benefits package.
Responsibilities
* Architects, designs, enhances, and supports delivery of ETL solutions.
* Architects and designs data acquisition, ingestion, transformation, and load solutions.
* Identifies, develops, and documents ETL solution requirements to meet business needs.
* Facilitates group discussions and participates in solution design sessions with technical subject matter experts.
* Develops, implements, and maintains standards and ETL design procedures.
* Contributes to the design of the data models, data flows, transformation specifications, and processing schedules.
* Coordinates ETL solution delivery and supports data analysis and information delivery staff in the design, development, and maintenance of data implementations.
* Consults and provides direction on ETL architecture and the implementation of ETL solutions.
* Queries, analyzes, and interprets complex data stored in the systems of record, enterprise data warehouse, and data marts.
* Ensures work includes necessary audit, HIPAA compliance, and security controls.
Data Management
* Collaborates with infrastructure and platform administrators to establish and maintain scalable and reliable data processing environment for the organization.
* Identifies and triages data quality and performance issues from the ETL perspective and sees them through to resolution.
* Tests and validates components of the ETL solutions to ensure successful end-to-end delivery.
* Participates in support rotation.
Qualifications
* Bachelor's degree with 8+ years of experience translating business requirements into business intelligence solutions, data visualization, and analytics solution design and development experience in data warehouse and OLTP (Online Transaction Processing) environments, semantic layer modeling experience, and SQL programming experience.
* OR associate degree with 11+ years of experience translating business requirements into business intelligence solutions, data visualization, and analytics solution design and development experience in data warehouse and OLTP environments, semantic layer modeling experience, and SQL programming experience.
* OR high school equivalence with 14+ years of experience translating business requirements into business intelligence solutions, data visualization, and analytics solution design and development experience in data warehouse and OLTP environments, semantic layer modeling experience, and SQL programming experience.
* Expert understanding of ETL concepts and enterprise data integration tooling (e.g., Informatica PowerCenter, Python)
* Expert knowledge of SQL development
* Expert knowledge of data warehousing concepts, design principles, associated data management and delivery requirements, and best practices
* Expert problem solving and analytical skills
* Ability to understand and communicate data management and integration concepts within IT and to the business, and to interact effectively with all internal and external parties, including vendors and contractors
* Ability to manage multiple projects simultaneously
* Ability to work independently, under pressure, and be adaptable to change
* Inquisitive; seeks answers to questions without being asked
Hardware and equipment will be provided by the company, but candidates must have access to high-speed, non-satellite Internet to successfully work from home.
We offer an excellent benefit and compensation package, opportunity for career advancement and a professional culture built on the foundations of Respect, Responsibility, Resourcefulness and Relationships. To support a safe work environment, all employment offers are contingent upon successful completion of a pre-employment criminal background check.
Quartz values and embraces diversity and is proud to be an Equal Employment Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, gender identity or expression, sexual orientation, age, status as a protected veteran, among other things, or status as a qualified person with disability.
BI Data Architect
West Sacramento, CA jobs
DCI Donor Services (DCIDS) is looking for a dynamic and enthusiastic team member to join us to save lives! Our mission at DCI Donor Services is to save lives through organ donation, and we want professionals on our team who will embrace this important work! We are currently seeking a BI Data Architect. The BI Data Architect will design, deploy, and maintain a modern Azure Lakehouse architecture leveraging Databricks and the Medallion model. This role is central to integrating data from iTransplant and other 3rd-party applications, improving scalability, performance, and data quality. The architect will lead technical implementations, set architectural standards, and partner with Business Intelligence to enable data-driven decision-making across the organization. This is a remote position; however, candidates must be based in California.
COMPANY OVERVIEW AND MISSION
For over four decades, DCI Donor Services has been a leader in working to end the transplant waiting list. Our unique approach to service allows for nationwide donation, transplantation, and distribution of organs and tissues while maintaining close ties to our local communities.
DCI Donor Services operates three organ procurement/tissue recovery organizations: New Mexico Donor Services, Sierra Donor Services, and Tennessee Donor Services. We also maximize the gift of life through the DCI Donor Services Tissue Bank and Sierra Donor Services Eye Bank.
Our performance is measured by the way we serve donor families and recipients. To be successful in this endeavor is our ultimate mission. By mobilizing the power of people and the potential of technology, we are honored to extend the reach of each donor's gift and share the importance of the gift of life.
With the help of our employee-led strategy team, we will ensure that all communities feel welcome and safe with us because we are a model for fairness, belonging, and forward thinking.
Key responsibilities this position will perform include:
Design, build and maintain scalable end-to-end data pipelines using modern ETL/ELT and stream processing tools.
Architect and manage the Lakehouse environment (Databricks, Azure Data Lake), including Bronze/Silver/Gold Medallion layers.
Optimize data models for analytical and operational use, enabling self-service analytics through intuitive structures.
Establish and maintain architectural standards, data governance and security practices in a regulated environment.
Implement automated testing, CI/CD and monitoring frameworks to ensure data quality, reliability and system performance.
Collaborate with BI and technical teams to integrate new data sources, prepare technical specifications and improve visibility of data across the organization.
Document and maintain architecture, processes and data standards.
Perform other related duties as assigned.
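The Bronze/Silver/Gold Medallion layering named in the responsibilities above can be illustrated with a minimal sketch. This is plain Python rather than Databricks/PySpark, and the record shapes and field names are invented for illustration; it only shows the idea of progressively refining raw (Bronze) data into cleaned (Silver) and analytics-ready (Gold) forms:

```python
# Minimal plain-Python sketch of the Medallion layering idea:
# Bronze = raw ingested records, Silver = cleaned/conformed, Gold = aggregated
# for analytics. Record shapes and field names are hypothetical.

def to_silver(bronze_rows):
    """Clean and conform raw rows: drop records missing a key, normalize values."""
    silver = []
    for row in bronze_rows:
        if not row.get("donor_id"):
            continue  # quarantine records without a business key
        silver.append({
            "donor_id": str(row["donor_id"]).strip(),
            "organ": row.get("organ", "").lower() or "unknown",
            "referral_date": row.get("referral_date"),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate conformed rows into an analytics-ready summary."""
    counts = {}
    for row in silver_rows:
        counts[row["organ"]] = counts.get(row["organ"], 0) + 1
    return counts

bronze = [
    {"donor_id": " D-1 ", "organ": "Kidney", "referral_date": "2024-01-05"},
    {"donor_id": "D-2", "organ": "kidney", "referral_date": "2024-01-06"},
    {"donor_id": None, "organ": "liver"},                # dropped in Silver
    {"donor_id": "D-3", "referral_date": "2024-01-07"},  # organ -> "unknown"
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'kidney': 2, 'unknown': 1}
```

In a real Lakehouse each layer would be a Delta table and the transforms would run as Spark jobs, but the layer responsibilities are the same.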
The ideal candidate will have:
TECHNICAL SKILLS:
Strong expertise in SQL and Python
Experience with Azure Databricks and Lakehouse architecture (Medallion model) with knowledge of warehouse integration patterns
Proficiency in designing and implementing scalable data pipelines
Proficiency in dimensional modeling and data design for both warehouse and Lakehouse environments
Understanding of data governance principles and practices and data security
Familiarity with Power BI Service to support integration and enablement of self-service reporting
PHYSICAL TRAITS: Reads, writes, listens and observes. Communicates using both verbal and technological avenues. Walks, stands, lifts, and carries light loads.
QUALIFICATIONS:
Education Required:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field. Master's degree is preferred but not required. Equivalent combination of education and experience may be considered.
Experience:
Minimum of 7 years of professional experience in data engineering, with at least 3 years in a senior or lead role
Proven experience designing and implementing large-scale data pipelines and ELT processes
3+ years hands-on with Azure Databricks (or similar Spark-based platforms), with proven experience implementing Medallion architecture
Experience in applying data governance and security practices in regulated environments
Demonstrated ability to document and maintain data architecture, processes and standards
LICENSES/CERTIFICATION: Certifications in the following areas are preferred but not required
Azure Data Engineer Associate
Databricks Certified Professional Data Engineer
Databricks Certified Associate Developer for Apache Spark
Security or data governance certifications (CISA, CIPT, CISSP)
We offer a competitive compensation package including:
Up to 176 hours of PTO your first year
Up to 72 hours of Sick Time your first year
Two Medical Plans (your choice of a PPO or HDHP), Dental, and Vision Coverage
403(b) plan with matching contribution
Company provided term life, AD&D, and long-term disability insurance
Wellness Program
Supplemental insurance benefits such as accident coverage and short-term disability
Discounts on home/auto/renter/pet insurance
Cell phone discounts through Verizon
Monthly phone stipend
**New employees must have their first dose of the COVID-19 vaccine by their potential start date or be able to supply proof of vaccination.**
***This position does not offer visa sponsorship or OPT Training Plans.***
You will receive a confirmation e-mail upon successful submission of your application. The next step of the selection process will be to complete a video screening. Instructions to complete the video screening will be contained in the confirmation e-mail. Please note - you must complete the video screening within 5 days from submission of your application to be considered for the position.
DCIDS is an EOE/AA employer - M/F/Vet/Disability.
Auto-ApplyClinical Data Management
Remote
At Veracyte, we offer exciting career opportunities for those interested in joining a pioneering team that is committed to transforming cancer care for patients across the globe. Working at Veracyte enables our employees to not only make a meaningful impact on the lives of patients, but to also learn and grow within a purpose-driven environment. This is what we call the Veracyte way - it's about how we work together, guided by our values, to give clinicians the insights they need to help patients make life-changing decisions.
Our Values:
We Seek A Better Way: We innovate boldly, learn from our setbacks, and are resilient in our pursuit to transform cancer care
We Make It Happen: We act with urgency, commit to quality, and bring fun to our hard work
We Are Stronger Together: We collaborate openly, seek to understand, and celebrate our wins
We Care Deeply: We embrace our differences, do the right thing, and encourage each other
Position Overview:
The Clinical Data Manager is responsible for participating in all aspects of Clinical Data Management Operations at Veracyte, ensuring data integrity and quality for clinical studies. This is a hands-on role that requires technical expertise across the complete data management lifecycle.
This is a remote role, with a strong preference for candidates in San Diego, CA or San Francisco, CA.
Key Responsibilities:
• Support end-to-end clinical data management operations, from protocol design to database closure
• Support the implementation of comprehensive data management plans, validation specifications, and quality control procedures
• Participate in the design and validation of eCRF systems and edit checks aligned with protocol requirements
• Support database development, validation programming, and query management
• Collaborate with Clinical Affairs, Data Analysis, and IT teams to establish data collection methods and quality standards
• Generate key metrics reports and data analytics for clinical studies.
Who You Are:
Bachelor's degree in Computer Science, Life Sciences, or related field
4+ years hands-on experience in IVD, Medical Device, or Pharmaceutical clinical data management
Knowledge of GCP and GCDMP and proficiency in the following:
Programming languages (R, SAS)
Database management (SQL, PL/SQL)
EDC systems and clinical data management platforms (e.g. Medidata, Medrio)
CTMS and eTMF platforms with strong preference for experience with Veeva Vault
Sample management platforms (e.g. LabVantage)
Microsoft Office Suite
Technical Expertise:
CDISC/CDASH/SDTM/ADaM standards
FDA guidelines and regulations
Database validation and quality control processes
Clinical trials
Experience with Medidata
Experience with Veeva Vault
Experience with LabVantage
Clinical trial data workflows
Teamwork and Collaboration Competencies:
Excellence in intra- and cross-functional team collaboration
Clear communication of technical concepts to non-technical stakeholders
Proactive issue identification and resolution
Ability to work independently while maintaining team alignment
Impact: This role directly contributes to improving patient outcomes by ensuring the highest quality clinical data management standards in diagnostic testing development and validation.
#LI-Remote
The final salary offered to a successful candidate will be dependent on several factors that may include but are not limited to years of experience, skillset, geographic location, industry, education, etc. Base pay is one part of the Total Package that is provided to compensate and recognize employees for their work, and this role may be eligible for additional discretionary bonuses/incentives, and restricted stock units.
Pay range: $112,000 - $127,000 USD
What We Can Offer You
Veracyte is a growing company that offers significant career opportunities if you are curious, driven, patient-oriented and aspire to help us build a great company. We offer competitive compensation and benefits, and are committed to fostering an inclusive workforce, where diverse backgrounds are represented, engaged, and empowered to drive innovative ideas and decisions. We are thrilled to be recognized as a 2024 Certified™ Great Place to Work in both the US and Israel - a testament to our dynamic, inclusive, and inspiring workplace where passion meets purpose.
About Veracyte
Veracyte (Nasdaq: VCYT) is a global diagnostics company whose vision is to transform cancer care for patients all over the world. We empower clinicians with the high-value insights they need to guide and assure patients at pivotal moments in the race to diagnose and treat cancer. Our Veracyte Diagnostics Platform delivers high-performing cancer tests that are fueled by broad genomic and clinical data, deep bioinformatic and AI capabilities, and a powerful evidence-generation engine, which ultimately drives durable reimbursement and guideline inclusion for our tests, along with new insights to support continued innovation and pipeline development. For more information, please visit **************** or follow us on LinkedIn or X (Twitter).
Veracyte, Inc. is an Equal Opportunity Employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status or disability status. Veracyte participates in E-Verify in the United States. View our CCPA Disclosure Notice.
If you receive any suspicious alerts or communications through LinkedIn or other online job sites for any position at Veracyte, please exercise caution and promptly report any concerns to ********************
About this role
Transcarent is seeking a Data Analyst III to help drive engagement with our products and services among our member base via actionable reporting and insights. As part of our analytics team, you'll partner closely with marketing and product management to deliver strategic analyses focused on growth, with an end goal of driving engagement and connecting our members to the right healthcare offerings.
This role requires exceptional quantitative and critical thinking skills as well as a strong understanding of marketing and product analytics. This is a high-visibility role and the ability to communicate complex findings and actionable insights clearly to an executive-level audience is a must. In addition, this individual must be able to collaborate effectively across multiple functions and thrive in a fast-paced environment.
What you'll do
Create reporting and analytics for growth and member marketing, including email and mail campaign reporting and product analytics
Deliver actionable insights around growth opportunities, inefficiencies and user pain points, leading to campaign and funnel optimizations
Help design and measure performance of marketing tests to improve our targeting, creative, and calls to action
Design compelling data visualizations in Tableau that empower business leaders to make strategic decisions
Build end-to-end product and funnel analytics within Mixpanel that provide insight into the user journey on our web and app-based products, from activation to utilization of healthcare services
Provide requirements on tracking events needed for KPI measurement and feature readouts, working with product and engineering
Partner with data engineering on marketing database schemas and data quality
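The funnel and product analytics called out above can be sketched in a few lines of Python. The event names and member journeys here are invented for illustration, and in practice this computation would happen inside Mixpanel or the warehouse:

```python
# Toy funnel computation: users surviving each ordered funnel step, plus
# step-to-step conversion rates. Event names and journeys are hypothetical.

FUNNEL = ["email_open", "app_activation", "care_search", "service_booked"]

events = {  # user_id -> set of events they fired
    "u1": {"email_open", "app_activation", "care_search", "service_booked"},
    "u2": {"email_open", "app_activation"},
    "u3": {"email_open", "app_activation", "care_search"},
    "u4": {"email_open"},
}

def funnel_counts(events, funnel):
    """Count users surviving each step; a user counts for step N only if
    they completed every earlier step as well."""
    counts = []
    survivors = set(events)
    for step in funnel:
        survivors = {u for u in survivors if step in events[u]}
        counts.append(len(survivors))
    return counts

counts = funnel_counts(events, FUNNEL)
print(counts)  # [4, 3, 2, 1]

# Step-to-step conversion rates highlight where members drop off.
rates = [round(b / a, 2) for a, b in zip(counts, counts[1:])]
print(rates)  # [0.75, 0.67, 0.5]
```

The drop-off between steps is exactly the kind of signal that feeds the campaign and funnel optimizations described above.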
What we're looking for
4+ years of relevant professional experience including 2+ years in marketing or product analytics
Proficiency in SQL (Redshift, MySQL)
Expertise in dashboard development using BI tools such as Tableau
Experience with analytics tools such as Mixpanel, Amplitude, or Google Analytics
Familiarity with Python or R for complex analyses
Bachelor's degree preferably in a quantitative discipline; advanced degree a plus
As a remote position, the salary range for this role is: $79,800 - $110,000 USD
Who we are
Transcarent and Accolade have come together to create the One Place for Health and Care, the leading personalized health and care experience that delivers unmatched choice, quality, and outcomes. Transcarent's AI-powered WayFinding, comprehensive Care Experiences - Cancer Care, Surgery Care, Weight - and Pharmacy Benefits offerings combined with Accolade's health advocacy, expert medical opinion, and primary care, allows us to meet people wherever they are on their health and care journey. Together, more than 20 million people have access to the combined company's offerings. Employers, health plans, and leading point solutions rely on us to provide trusted information, increase access, and deliver care.
We are looking for teammates to join us in building our company, culture, and Member experience who:
Put people first, and make decisions with the Member's best interests in mind
Are active learners, constantly looking to improve and grow
Are driven by our mission to measurably improve health and care each day
Bring the energy needed to transform health and care, and move and adapt rapidly
Are laser focused on delivering results for Members, and proactively problem solving to get there
Total Rewards
Individual compensation packages are based on a few different factors unique to each candidate, including primary work location and an evaluation of a candidate's skills, experience, market demands, and internal equity.
Salary is just one component of Transcarent's total package. All regular employees are also eligible for the corporate bonus program or a sales incentive (target included in OTE) as well as stock options.
Our benefits and perks programs include, but are not limited to:
Competitive medical, dental, and vision coverage
Competitive 401(k) Plan with a generous company match
Flexible Time Off/Paid Time Off, 12 paid holidays
Protection Plans including Life Insurance, Disability Insurance, and Supplemental Insurance
Mental Health and Wellness benefits
Transcarent is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. If you are a person with a disability and require assistance during the application process, please don't hesitate to reach out!
Research shows that candidates from underrepresented backgrounds often don't apply unless they meet 100% of the job criteria. While we have worked to consolidate the minimum qualifications for each role, we aren't looking for someone who checks each box on a page; we're looking for active learners and people who care about disrupting the current state of health and care with their unique experiences.
Data Architect - REMOTE
Santa Monica, CA jobs
Catasys is making a positive impact on people's lives every day. We use predictive analytics to identify health plan members with unaddressed behavioral health conditions that worsen chronic disease, then engage, support and guide these members to better health with a personalized, human-centered approach. This has led us to where we are today: growing fast and saving lives as we do.
To support our explosive growth, we're looking for compassionate, hard-working people-lovers to join our team. If innovating in the field of patient care is something you're passionate about, we encourage you to join our mission to improve the health and save the lives of as many people as possible.
Impact lives in so many ways
You'll be an integral part in supporting people coping with their unique life challenges. Every member of the Catasys team contributes to accomplishing our goals and upholding our people-centric values.
The new face of mental health
Our model is research-based, and we are invested in staying on the leading edge of treatment. You'll help us break down barriers and stigmas associated with mental health.
Career options
With our ongoing strong growth and evolution, we are looking for people who want to do their best work. Join our team and take your career to the next level with Catasys. We are committed to promoting from within.
Excellent compensation
Job Description
In this key role, you will be a data architect defining and growing the data infrastructure and data supported by applications used by Catasys colleagues who work daily to improve and save the lives of those suffering from the medical consequences of untreated behavioral health conditions.
You will work in a highly autonomous and low supervision environment with a geographically distributed team creating and delivering successful applications from whiteboard to market scale. You will live in a highly collaborative, delivery-focused team environment and will be equally at home designing an MDM strategy, troubleshooting DB performance, building out a data strategy or helping a developer implement a new document collection.
You will display the ability to be a critical thinker and tackle problems by first evaluating the problem and then thinking of several potential solutions. You will be responsible for all phases of new data solution implementation, displaying the ability to lead in the delivery of the solutions.
To you, balancing security and accessibility, system design and architecture, reliability engineering and fault diagnosis are not esoteric terms, but rather fuel for the obsession that drives your daily war against mediocrity. Basically, your qualification for this job is proven experience operating in a high-performing and fault-intolerant environment. Your objective is to anticipate the needs of the organization and work to ensure that the data architecture provides value for the entire organization.
Qualifications
Bachelor's degree in Computer Science or "STEM" majors (science, technology, engineering, or math)
5 or more years of data architecture experience
Excellent communication both written and verbal
Experience with relational and NoSQL data structures
Experience with data lakes and related technologies
Experience in deploying a Master Data Management (MDM) solution
Familiarity with the Python scripting language
Experience with data warehouse implementations
Data visualization experience
Additional Information
All your information will be kept confidential according to EEO guidelines.
REMOTE Data Analyst
Los Angeles, CA jobs
Catasys is making a positive impact on people's lives every day. We use predictive analytics to identify health plan members with unaddressed behavioral health conditions that worsen chronic disease, then engage, support and guide these members to better health with a personalized, human-centered approach. This has led us to where we are today: growing fast and saving lives as we do.
To support our explosive growth, we're looking for compassionate, hard-working people-lovers to join our team. If innovating in the field of patient care is something you're passionate about, we encourage you to join our mission to improve the health and save the lives of as many people as possible.
Impact lives in so many ways
You'll be an integral part in supporting people coping with their unique life challenges. Every member of the Catasys team contributes to accomplishing our goals and upholding our people-centric values.
The new face of mental health
Our model is research-based, and we are invested in staying on the leading edge of treatment. You'll help us break down barriers and stigmas associated with mental health.
Career options
With our ongoing strong growth and evolution, we are looking for people who want to do their best work. Join our team and take your career to the next level with Catasys. We are committed to promoting from within.
Excellent compensation
Job Description
As a Data Analyst, you will drive innovation and growth and contribute to the company's ability to scale. Your work continually broadens access to reliable, accurate, and timely data to improve decision making. You will transform data into insights, leading to faster and more extensive exploratory analysis and quicker action based on evidence. Your insights will help grow the business by accelerating sales and customer expansion cycles, reinforcing Catasys' position as an industry leader in data and analytics. You excel at synthesizing and communicating complex concepts and analyses in easy-to-understand ways.
Responsibilities
Dive into data to predict and quantify user behavior: our members, Care Team, and network providers.
Find actionable strategic insights through funnels, cohort analyses, user segmentation, retention analyses and regression models to help us grow our products.
Data storytelling: quantify user journeys to help identify opportunities to improve member outcomes and team productivity.
Become a Catasys subject matter expert to understand and anticipate the data needs of customers, Product, User Experience, and internal stakeholders.
Translate high-priority business problems into concise measures.
Lead the Analytics Center of Excellence: a cross-functional team of Data Champions within the organization.
Drive a culture of analytical rigor, transparency, and shared understanding of measures.
Work both collaboratively and autonomously.
Define KPIs and build automated dashboards, reports, and models to help teams make faster, better decisions.
Work with engineering and product to implement, quality-assure, and monitor our logging and metrics.
Qualifications
Bachelor's Degree in Computer Science, math, economics, statistics, or other quantitative fields
2+ years' experience with Power BI and DAX programming
Expertise performing quantitative analysis
Excellent communication and presentation skills: you understand your audience and how to effectively present information to diverse stakeholders
Strong understanding of statistical methods and applications (A/B testing, probability, regression)
Additional Information
This position is REMOTE.
Data Architect - Jersey City
Remote
Key Responsibilities:
Develop and optimize Spark ETL pipelines using Scala/Python and Oracle SQL.
Lead a team of engineers, ensuring best practices and high performance.
Collaborate with stakeholders to define and implement scalable data solutions.
Ensure data integrity, security, and performance optimization.
Stay updated on industry trends and emerging tech to drive continuous improvements.
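The data-integrity responsibility above - gating bad records before they are loaded - can be illustrated with a small sketch. Column names and rules here are hypothetical, and a production pipeline would express these checks in Spark rather than plain Python:

```python
# Illustrative data-integrity gate of the kind an ETL pipeline applies
# before loading: type checks on required columns plus a duplicate-key
# check. Column names and rules are hypothetical.

REQUIRED = {"trade_id": str, "amount": float}

def validate(rows):
    """Return (good_rows, errors); reject bad types and duplicate keys."""
    good, errors, seen = [], [], set()
    for i, row in enumerate(rows):
        problems = [
            f"{col} should be {typ.__name__}"
            for col, typ in REQUIRED.items()
            if not isinstance(row.get(col), typ)
        ]
        if row.get("trade_id") in seen:
            problems.append("duplicate trade_id")
        if problems:
            errors.append((i, problems))
        else:
            seen.add(row["trade_id"])
            good.append(row)
    return good, errors

rows = [
    {"trade_id": "T1", "amount": 100.0},
    {"trade_id": "T1", "amount": 50.0},   # duplicate key
    {"trade_id": "T2", "amount": "bad"},  # wrong type
]
good, errors = validate(rows)
print(len(good), len(errors))  # 1 2
```

Routing the rejected rows to an error table rather than silently dropping them is what keeps the pipeline auditable.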
Required Skills and Qualifications:
15+ years of experience in data engineering or related fields.
Strong expertise in Oracle SQL, including query optimization and performance tuning.
Hands-on experience with Apache Spark for ETL processing.
Proficiency in Scala and/or Python for data engineering tasks.
Experience leading teams, managing projects, and mentoring engineers.
Solid understanding of data warehousing, distributed computing, and cloud platforms.
Strong problem-solving skills and the ability to work in a fast-paced environment.
Preferred Qualifications:
Familiarity with CI/CD pipelines and DevOps practices.
Knowledge of real-time data processing frameworks like Kafka.
Compensation, Benefits and Duration
Minimum Compensation: USD 64,000
Maximum Compensation: USD 225,000
Compensation is based on actual experience and qualifications of the candidate. The above is a reasonable and a good faith estimate for the role.
Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full time employees.
This position is not available for independent contractors.
No applications will be considered if received more than 120 days after the date of this post.
Data Architect - New York
Remote
We are seeking a Data and AI Architect with a proven track record in designing and implementing data platforms and AI solutions using cloud technologies (AWS, Azure, GCP), Databricks, and Snowflake. The ideal candidate will combine technical expertise, hands-on experience, and leadership capabilities to deliver impactful solutions for our clients, including the development and deployment of GenAI solutions for next-generation business applications.
Key Responsibilities:
Solution Design and Implementation
Architect and implement modern data platforms using cloud technologies such as AWS, Azure, and GCP.
Build scalable and efficient data pipelines leveraging Databricks and Snowflake to support diverse client needs.
Design and deploy AI/GenAI solutions, including machine learning models, natural language processing tools, and predictive analytics, to address complex business challenges.
Data Governance and Quality
Develop data governance frameworks to ensure data quality, security, and compliance with industry regulations.
Implement best practices for data lineage, monitoring, and cataloging using tools like Databricks and Snowflake.
Work with clients to establish policies for managing data assets and ensuring compliance with GDPR, CCPA, and other regulatory requirements.
Client Engagement and Delivery
Act as a trusted technical advisor to clients, translating business challenges into actionable data and AI strategies.
Lead workshops, provide guidance on architecture decisions, and ensure alignment with client objectives.
Manage solution delivery, including resource planning, timeline tracking, and adherence to budget constraints.
Technology Expertise and Innovation
Stay current on advancements in cloud platforms, Databricks, Snowflake, and AI technologies to recommend innovative solutions.
Explore and implement GenAI technologies to create transformative solutions such as personalized customer experiences, generative content creation, and intelligent automation.
Contribute to the development of reusable frameworks and accelerators to enhance delivery efficiency.
Team Leadership and Development
Provide technical leadership to cross-functional teams, ensuring high-quality delivery of data and AI projects.
Mentor team members on cloud technologies, data platforms, and AI frameworks.
Foster a collaborative environment to encourage innovation and continuous learning.
Qualifications:
Experience: 6-10 years of experience in data architecture, engineering, or analytics, including client-facing or consulting roles.
Cloud Expertise: Hands-on experience with AWS, Azure, and GCP, including data storage, compute, and analytics services.
Data Platforms: Strong knowledge of Databricks and Snowflake, including experience with ETL/ELT workflows, performance tuning, and scaling.
AI/GenAI Knowledge: Experience designing and deploying AI/GenAI solutions using frameworks like TensorFlow, PyTorch, Hugging Face, and advanced large language models (LLMs) such as OpenAI's GPT and Google's Gemini.
Technical Skills: Proficiency in SQL, Python, Spark, and other data engineering and analytics tools.
Leadership: Ability to lead technical teams and manage client engagements effectively.
Education: Bachelor's degree in Computer Science, Data Science, or a related field; advanced degree preferred.
Preferred Skills:
Certification in AWS, Azure, or GCP architecture.
Familiarity with tools such as dbt, Apache Airflow, and Kafka.
Knowledge of customer data platforms and data-driven marketing strategies.
Compensation, Benefits and Duration
Minimum Compensation: USD 72,000
Maximum Compensation: USD 252,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available to full-time employees.
This position is available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.
Data Architect - Data Science - NV
Remote
Responsibilities:
Lead the design and implementation of robust, scalable, and secure data architectures, including data lakes, warehouses, and pipelines.
Stay ahead of emerging trends and technologies in data engineering and architecture. Provide thought leadership in data modeling, storage, and processing frameworks.
Guide and mentor data engineers and analysts, promoting best practices and high standards in data design and governance.
Drive continuous improvement in data performance, including query optimization, latency reduction, and efficient resource utilization.
Define and enforce data security measures to protect sensitive information, ensure compliance with privacy regulations (e.g., GDPR, HIPAA), and prevent unauthorized access.
Ensure high-quality, maintainable data solutions using industry standards. Create and review data models and ETL code to uphold consistency and best practices.
Analyze complex technical challenges in data environments and develop innovative, scalable solutions.
Requirements:
10+ years of experience in data architecture, engineering, or analytics, from design to implementation.
Extensive experience with data platforms such as Snowflake, Azure Synapse, Google BigQuery, or AWS Redshift.
Deep understanding of data modeling techniques (e.g., star schema, snowflake schema, normalized/denormalized models) and data governance frameworks.
Proficient in building and managing ETL/ELT pipelines using tools like Apache Airflow, Azure Data Factory, Talend, or Informatica.
Experience integrating data systems with cloud platforms (AWS, Azure, GCP) and APIs using tools like Mulesoft, Azure Functions, or API Management.
Familiarity with CI/CD tools and automation workflows for data, such as Azure DevOps, GitHub Actions, and infrastructure management using Terraform.
Experience with data monitoring and observability tools like Datadog, Splunk, or Azure Monitor.
Strong understanding of data storage strategies, including relational databases (SQL Server, PostgreSQL), NoSQL (MongoDB, Cassandra), and file-based systems (Parquet, Avro).
Excellent verbal and written communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
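The data modeling techniques listed above (star schema, normalized/denormalized models) can be illustrated with a minimal, self-contained sketch. All table and column names here are hypothetical, and SQLite is used only for portability; any of the warehouse platforms named above would follow the same shape:

```python
import sqlite3

# Hypothetical star schema: one fact table plus two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
-- The fact table holds measures plus foreign keys into the dimensions.
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units INTEGER,
    revenue REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20250101, 2025, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20250101, 1, 10, 99.9)")

# A typical analytical query: join the fact table to a dimension and aggregate.
rows = cur.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 99.9)]
```

The denormalized fact/dimension split trades storage for simple, fast analytical joins, which is why it recurs across Snowflake, Synapse, BigQuery, and Redshift designs.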
Customer Communication:
Actively listen to client needs and translate them into scalable data solutions.
Communicate technical ideas clearly and build lasting relationships based on trust and delivery.
Keep stakeholders informed on progress, issues, and resolution plans.
Stakeholder Management:
Identify key stakeholders and manage expectations effectively.
Resolve conflicts diplomatically and align stakeholders with product and technical goals.
Promote change management and ensure transparent communication throughout.
Cross-Functional Team Coordination:
Collaborate with analytics, engineering, QA, and DevOps teams to ensure smooth execution.
Align multi-disciplinary teams towards shared milestones.
Ensure clarity of data requirements across the product development lifecycle.
Preferred Qualifications:
Experience implementing Agile methodologies (Scrum, Kanban) using tools like JIRA and Confluence.
Experience with data cataloging and lineage tools like Collibra, Alation, or Microsoft Purview.
Familiarity with data accessibility standards, data quality frameworks, and test automation for data pipelines.
Compensation, Benefits and Duration
Minimum Compensation: USD 66,000
Maximum Compensation: USD 233,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available to full-time employees.
This position is available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.
Data Architect | US
Remote
Key Responsibilities:
Data Architecture Design:
Design scalable, secure, and efficient data architectures on Google Cloud Platform (GCP) to support current and future business needs.
Create comprehensive data models, workflows, and frameworks to enable high-quality data storage, access, and processing on GCP.
Collaborate with technical teams to align the architecture with GCP best practices, ensuring the efficient use of GCP services such as BigQuery, Cloud Storage, Dataproc, and Pub/Sub.
Discovery & Assessment:
Conduct a thorough assessment of the existing on-premise or cloud data infrastructure to identify gaps, inefficiencies, and opportunities for migration to GCP.
Lead data discovery sessions with stakeholders to gather business requirements and map them to appropriate GCP solutions.
Identify data quality, data lineage, and metadata management needs during the discovery phase and incorporate them into the proposed architecture.
Data Migration Strategy:
Develop a data migration strategy, outlining the migration of databases, data warehouses, and data lakes from legacy systems to GCP.
Provide expertise in designing ETL/ELT processes using tools such as Cloud Dataflow, Dataproc, or other GCP data integration services.
Ensure data integrity, security, and minimal downtime during the migration process, leveraging GCP's native tools for migration and replication.
Data Governance & Security:
Establish a data governance framework that ensures the security, privacy, and compliance of data across the GCP environment.
Implement GCP data security best practices, including encryption, Identity and Access Management (IAM), and data masking to protect sensitive information.
Ensure that the architecture complies with industry-specific regulations such as GDPR or CCPA, where applicable.
Collaboration & Stakeholder Engagement:
Collaborate with cross-functional teams, including cloud architects, data engineers, and business stakeholders, to ensure alignment between business needs and data architecture.
Act as the primary point of contact for data-related discussions, providing guidance on data governance, performance optimization, and scalability on GCP.
Participate in workshops and meetings to present architecture designs, data migration plans, and GCP recommendations.
Optimization & Performance:
Optimize the architecture for high availability, scalability, and performance, leveraging GCP services such as BigQuery, Cloud Spanner, and Bigtable.
Design data models that are optimized for both transactional (OLTP) and analytical (OLAP) workloads.
Implement monitoring and alerting mechanisms to track data pipeline health and performance, using tools such as GCP's Cloud Monitoring (formerly Stackdriver).
Documentation & Reporting:
Create detailed documentation of the data architecture, including data flow diagrams, ER diagrams, metadata definitions, and security models.
Prepare executive-level reports and presentations on the progress of data discovery, architecture design, and migration status.
Maintain detailed records of all data-related decisions made during the GCP discovery engagement for future reference.
Required Qualifications & Skills:
Experience:
7+ years of experience in data architecture or related roles, with a focus on cloud environments.
Proven experience in designing and implementing data architectures on Google Cloud Platform (GCP) or other cloud platforms (AWS, Azure).
Hands-on experience in data migration projects, data modeling, and database design in cloud environments.
Experience working with large-scale data warehouses, data lakes, and complex ETL/ELT processes.
Technical Expertise:
Deep understanding of GCP services related to data processing and storage, such as BigQuery, Cloud Spanner, Bigtable, Cloud SQL, Cloud Storage, Dataproc, and Dataflow.
Proficiency in SQL, NoSQL databases, and data modeling best practices.
Experience with data integration tools and frameworks (e.g., Apache Beam, Dataflow, Dataprep, or Informatica).
Data Governance & Security:
Strong knowledge of data governance principles, data lineage, metadata management, and data cataloging on cloud platforms.
Expertise in implementing data security and compliance measures, including encryption, IAM, and data privacy regulations.
Familiarity with relevant regulatory frameworks (GDPR, CCPA, or PCI-DSS) and their implications for cloud data architecture.
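Data masking of the kind referenced above can be as simple as replacing direct identifiers with keyed hashes so that records remain joinable without exposing the raw value. A minimal illustrative sketch (the field names and salt handling are hypothetical; a real deployment would pull the key from a secrets manager):

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me"  # hypothetical; keep in a secrets manager in practice

def mask(value: str) -> str:
    """Deterministically pseudonymize a value so downstream joins still work."""
    return hmac.new(SECRET_SALT, value.encode(), hashlib.sha256).hexdigest()[:12]

record = {"email": "jane@example.com", "country": "US"}
# Replace the direct identifier, keep non-identifying attributes as-is.
masked = {**record, "email": mask(record["email"])}
print(masked["email"] != record["email"])  # True; same input -> same token
```

Deterministic masking preserves referential integrity across tables, while format-preserving encryption or tokenization services would be the step up for stricter regimes.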
Soft Skills:
Strong problem-solving skills with the ability to translate business requirements into technical data solutions.
Excellent communication and presentation skills, with the ability to interact with both technical and non-technical stakeholders.
Ability to lead and mentor data engineering teams on best practices for GCP data architecture and governance.
Certifications (Preferred):
Google Professional Data Engineer certification.
Google Professional Cloud Architect or other GCP certifications are advantageous.
Additional certifications in data management, such as Certified Data Management Professional (CDMP) or equivalent, are a plus.
Education:
Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
Master's degree (preferred) or equivalent experience in data architecture and cloud environments.
Data and AI Architect | Onsite | New York
Remote
Title: Data and AI Architect (Lead/Manager Level)
We are seeking a Data and AI Architect with a proven track record in designing and implementing data platforms and AI solutions using cloud technologies (AWS, Azure, GCP), Databricks, and Snowflake. The ideal candidate will combine technical expertise, hands-on experience, and leadership capabilities to deliver impactful solutions for our clients, including the development and deployment of GenAI solutions for next-generation business applications.
Key Responsibilities:
Solution Design and Implementation
Architect and implement modern data platforms using cloud technologies such as AWS, Azure, and GCP.
Build scalable and efficient data pipelines leveraging Databricks and Snowflake to support diverse client needs.
Design and deploy AI/GenAI solutions, including machine learning models, natural language processing tools, and predictive analytics, to address complex business challenges.
Data Governance and Quality
Develop data governance frameworks to ensure data quality, security, and compliance with industry regulations.
Implement best practices for data lineage, monitoring, and cataloging using tools like Databricks and Snowflake.
Work with clients to establish policies for managing data assets and ensuring compliance with GDPR, CCPA, and other regulatory requirements.
Client Engagement and Delivery
Act as a trusted technical advisor to clients, translating business challenges into actionable data and AI strategies.
Lead workshops, provide guidance on architecture decisions, and ensure alignment with client objectives.
Manage solution delivery, including resource planning, timeline tracking, and adherence to budget constraints.
Technology Expertise and Innovation
Stay current on advancements in cloud platforms, Databricks, Snowflake, and AI technologies to recommend innovative solutions.
Explore and implement GenAI technologies to create transformative solutions such as personalized customer experiences, generative content creation, and intelligent automation.
Contribute to the development of reusable frameworks and accelerators to enhance delivery efficiency.
Team Leadership and Development
Provide technical leadership to cross-functional teams, ensuring high-quality delivery of data and AI projects.
Mentor team members on cloud technologies, data platforms, and AI frameworks.
Foster a collaborative environment to encourage innovation and continuous learning.
Qualifications:
Experience: 6-10 years of experience in data architecture, engineering, or analytics, including client-facing or consulting roles.
Cloud Expertise: Hands-on experience with AWS, Azure, and GCP, including data storage, compute, and analytics services.
Data Platforms: Strong knowledge of Databricks and Snowflake, including experience with ETL/ELT workflows, performance tuning, and scaling.
AI/GenAI Knowledge: Experience designing and deploying AI/GenAI solutions using frameworks like TensorFlow, PyTorch, Hugging Face, and advanced large language models (LLMs) such as OpenAI's GPT and Google's Gemini.
Technical Skills: Proficiency in SQL, Python, Spark, and other data engineering and analytics tools.
Leadership: Ability to lead technical teams and manage client engagements effectively.
Education: Bachelor's degree in Computer Science, Data Science, or a related field; advanced degree preferred.
Preferred Skills:
Certification in AWS, Azure, or GCP architecture.
Familiarity with tools such as dbt, Apache Airflow, and Kafka.
Knowledge of customer data platforms and data-driven marketing strategies.
Compensation, Benefits and Duration
Minimum Compensation: USD 100,000
Maximum Compensation: USD 300,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available to full-time employees.
This position is not available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.
Data Architect | Onsite | Dallas/Charlotte
Remote
Job Description: Data Architect
We are seeking an experienced Data Architect to design and implement robust data architectures, ensuring seamless data flow, storage, and management across the organization.
Key Responsibilities: Data Architecture Design
Define and implement end-to-end data architecture solutions, including data ingestion, storage, processing, and analytics.
Design data models (conceptual, logical, and physical) to meet business and analytical requirements.
Evaluate and recommend database solutions (e.g., RDBMS, NoSQL, Data Lakes, Data Warehouses) based on business needs.
Data Integration and Management
Architect scalable ETL/ELT pipelines for data extraction, transformation, and loading using tools like Informatica, Talend, or Azure Data Factory.
Ensure seamless data integration across on-premises and cloud environments, leveraging APIs, messaging systems (e.g., Kafka, RabbitMQ), and streaming platforms.
Oversee data lifecycle management, including storage optimization, archiving, and retention policies.
Cloud Data Solutions
Design and implement cloud-based data platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery) for scalability and cost-effectiveness.
Optimize data solutions for performance, reliability, and availability in cloud environments.
Data Governance and Security
Establish and enforce data governance policies, including data quality, metadata management, and data lineage.
Ensure data security and compliance with industry standards and regulations (e.g., GDPR, HIPAA, CCPA).
Implement role-based access control (RBAC), encryption, and other security measures to safeguard data assets.
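Role-based access control (RBAC) as described above boils down to mapping roles to permission sets and checking actions against them. A purely illustrative sketch (roles and permissions are hypothetical; production systems would delegate this to the platform's IAM or a directory service, not an in-code dict):

```python
# Hypothetical role-to-permission mapping for a data platform.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "write"))   # False
print(is_allowed("engineer", "write"))  # True
```

Unknown roles deliberately fall through to an empty permission set, so access is denied by default rather than granted by omission.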
Data Analytics Enablement
Collaborate with business intelligence and analytics teams to ensure data architecture supports reporting and advanced analytics needs.
Enable real-time analytics through integration of data streaming and event-driven architectures.
Support AI/ML model development by providing high-quality, accessible datasets.
Collaboration and Documentation
Partner with stakeholders to align data architecture with business objectives and technical requirements.
Document data architecture designs, data flows, and technical specifications for stakeholders and development teams.
Provide guidance and mentorship to data engineers and developers on best practices and architectural standards.
Optimization and Innovation
Continuously monitor and optimize data architectures for performance, scalability, and cost-efficiency.
Evaluate emerging data technologies and trends to drive innovation in data architecture strategies.
Key Qualifications: Technical Expertise
Proven experience in data modeling, data warehousing, and database design for large-scale systems.
Strong expertise with database technologies (e.g., SQL Server, Oracle, PostgreSQL, MongoDB, Cassandra).
Proficiency in cloud platforms (e.g., Azure, AWS, GCP) and their data services.
Hands-on experience with data integration tools, streaming platforms (e.g., Apache Kafka, Flink, Spark), and ETL processes.
Knowledge of big data ecosystems, including Hadoop, Hive, and Spark.
Preferred Skills
Experience with modern data architecture patterns, such as Data Lakes, Data Mesh, or Data Fabric.
Familiarity with BI tools (e.g., Power BI, Tableau, Looker) and advanced analytics platforms.
Expertise in data pipeline orchestration tools like Apache Airflow or Prefect.
Knowledge of machine learning and AI data preparation pipelines is a plus.
Soft Skills
Strong problem-solving and analytical thinking abilities.
Excellent communication and stakeholder management skills.
Ability to work collaboratively in cross-functional teams and adapt to evolving business priorities.
Education and Experience:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
8+ years of experience in data architecture, data engineering, or related roles.
Relevant certifications such as Azure Data Engineer Associate, AWS Certified Data Analytics, or Google Professional Data Engineer are preferred.
Compensation, Benefits and Duration
Minimum Compensation: USD 56,000
Maximum Compensation: USD 224,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available to full-time employees.
This position is available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.
Onsite/Offshore Data Architect - Snowflake
Remote
We're looking for a Data Solution Architect to design and deliver enterprise-scale data and analytics platforms. The ideal candidate has deep hands-on experience in Snowflake architecture, large analytical/reporting store design, and migration from Databricks to Snowflake.
You'll lead architecture across data mesh, lakehouse, and medallion patterns, designing secure and scalable single-tenant and multi-tenant solutions. The role includes defining data encryption, masking, and governance frameworks, integrating Snowflake with Azure services such as Data Lake Storage, Event Grid, Service Bus, Purview, Power BI, and Fabric.
We're seeking someone who understands Control Plane vs. Operations (Data) Plane, modern data architecture principles, and cloud-native compute (Functions, AKS, Container Apps). Strong communication, stakeholder alignment, and documentation skills (C4/ArchiMate) are key.
Requirements:
• 10+ years in data architecture/solution design (3+ years Snowflake).
• Experience with Databricks migration, Azure ecosystem, and data security (encryption/masking).
• Familiar with CI/CD, IaC (Terraform/Bicep), microservices, API-first, and event-driven design.
• Certifications like Azure Architect, SnowPro, TOGAF, or CDMP preferred.
Work Mode: 100% Remote (US or time zones).
Join us to shape next-generation data platforms that are secure, scalable, and future-ready.
Climate Data Consultant, Data and Analytics Section, DAPM, NYHQ, remote. req#585089
Remote
If you are a committed, creative professional and are passionate about making a lasting difference for children, the world's leading children's rights organization would like to hear from you. For 70 years, UNICEF has been working on the ground in 190 countries and territories to promote children's survival, protection and development. The world's largest provider of vaccines for developing countries, UNICEF supports child health and nutrition, good water and sanitation, quality basic education for all boys and girls, and the protection of children from violence, exploitation, and AIDS. UNICEF is funded entirely by the voluntary contributions of individuals, businesses, foundations and governments. UNICEF has over 12,000 staff in more than 145 countries.
Consultancy: Climate Data Consultancy
Duty Station: Data and Analytics Section, DAPM, NYHQ
Duration: 01 Nov 2025 - 31 Oct 2026
Home/ Office Based: Remote
BACKGROUND
Purpose of Activity/ Assignment:
UNICEF has established a new Global Child Hazard Database to estimate the exposure of children and critical infrastructure to single and multiple climate-related hazards. The purpose of this assignment is to enhance and scale the core data processing pipeline for the Database by integrating new data streams, improving processing efficiency, and preparing the system for broader deployment and increased data volume. The work includes integrating different data sources and platforms in a cloud-based server environment.
Scope of Work:
Under the supervision and guidance of the Climate & Environment Data Science Specialist, the consultant will have the following duties and responsibilities:
1. Expand and Optimize Data Pipelines:
• Scale the current data pipeline to create sub-national outputs with expanded pre-determined attributes, leveraging cloud resources through Google Earth Engine.
• Enhance/expand the existing codebase for optimization and scaling, ensuring it can handle increased data volume and complexity.
2. Enhance Data Visualization:
• Support the enhancement of the existing Google Earth Engine (GEE) data visualization application by developing and integrating server-side functions.
• Support integrating different data platforms for report generation.
3. Data Analysis and Documentation:
• Support downstream analysis and fulfill additional data processing requests.
• Collaborate closely with the Statistics and Monitoring Manager and the Data Science Specialist to document and manage knowledge of all new and existing processes.
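The pipeline-scaling duties above follow a common engineering pattern: process a large dataset in bounded chunks rather than loading it whole, so memory use stays flat as data volume grows. A minimal, platform-agnostic sketch of that pattern (the `process` transform is a hypothetical stand-in for a real per-chunk step such as a Google Earth Engine export job):

```python
from itertools import islice
from typing import Iterable, Iterator, List

def chunked(records: Iterable[int], size: int) -> Iterator[List[int]]:
    """Yield fixed-size chunks so memory use stays bounded as volume grows."""
    it = iter(records)
    while chunk := list(islice(it, size)):
        yield chunk

def process(chunk: List[int]) -> List[int]:
    # Stand-in for the real per-chunk transformation.
    return [x * 2 for x in chunk]

results = []
for chunk in chunked(range(10), size=4):  # 10 records in chunks of 4
    results.extend(process(chunk))
print(results)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Because each chunk is independent, the same structure parallelizes naturally once per-chunk work is handed to separate workers or cloud tasks.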
Terms of Reference / Key Deliverables:
Work Assignment Overview/Deliverables and Outputs/Delivery deadline
1. Scaling the Core Pipeline for the Global Child Hazard Database (expanding the pipeline to include sub-national outputs)
- Pipeline Analysis and Architecture Blueprint (V0.9). Deliverable includes: (1) Comprehensive audit report of current pipeline performance, (2) Finalized list of 5+ sub-national data requirements, and (3) Formal architecture blueprint for GEE integration
30 Nov 2025
- GEE Processing Module Prototype (V1.0). Deliverable is a fully working Google Earth Engine script successfully ingesting and transforming one full-scale sub-national dataset, validating the core technology concept.
31 Dec 2025
- Production Module, Integration, and Documentation Package. Deliverable includes: (1) V2.0 module validated against 3 datasets (achieving >95% data quality), (2) Confirmed integration and API documentation for 3 internal platforms, including UNICEF's data warehouse, and (3) Final New Data Flow Manual.
31 Jan 2026
2. Support migrating to Google cloud
- Enhanced GEE Application Release (V3.0). Deliverable includes: (1) tested server-side functions and (2) new interactive data visualization features integrated into the user interface
31 Mar 2026
3. Additional data to the Global Child Hazard Database
- Database Enhancement and Ingestion Report. Deliverable is a report confirming the identification, validation, and verified successful ingestion of new, high-priority data sources into the Global Child Hazard Database
31 May 2026
4. Additional functions to visualization app
- Automated Data Pipeline Implementation. Deliverable is the deployment and full documentation of at least two new, fully automated data pipelines for the sources identified in the Database Enhancement Report
31 Jul 2026
5. Optimization & Enhancement (enhancing the existing codebase for efficiency and preparing for increased scale)
- Code Optimization and Scaling Report. Deliverable must demonstrate a 25% reduction in execution time for the top 5 most resource-intensive scripts, and document the implementation of at least two specific features designed for increased data volume.
31 Aug 2026
6. On-going country support, update of technical documentation & project finalization (providing continuous support for analysis, ad-hoc requests, and knowledge transfer)
- Project Finalization and Knowledge Transfer Package. Deliverable includes: (1) Twelve Monthly Support Logs, (2) Final Project Report, and (3) A minimum 1-hour recorded Knowledge Transfer Session with supporting materials.
31 Oct 2026
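A reduction target like the 25% execution-time figure in deliverable 5 is only meaningful against a measured baseline. One way to capture before/after timings with the standard library; the two workloads here are hypothetical stand-ins, not the actual scripts:

```python
import timeit

def baseline() -> int:
    # Stand-in for an unoptimized script: repeated string concatenation.
    s = ""
    for i in range(2000):
        s += str(i)
    return len(s)

def optimized() -> int:
    # Same result via join, the usual optimization for this pattern.
    return len("".join(str(i) for i in range(2000)))

# Time each variant over many repetitions to smooth out noise.
t_base = timeit.timeit(baseline, number=200)
t_opt = timeit.timeit(optimized, number=200)
reduction = 100 * (1 - t_opt / t_base)
print(f"execution time changed by {reduction:.0f}%")
```

Verifying that both variants return identical results before comparing timings is the essential step; an optimization that changes outputs is a regression, not a speedup.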
Qualifications
Education:
Bachelor's in computer science, data science, geospatial technology, or any other related discipline
Experience working with Google Earth Engine
Advanced proficiency in Python
Knowledge/Expertise/Skills required *:
A university degree in Computer Science, Data Science, Geospatial Technology, Remote Sensing, or any other closely related quantitative discipline is required.
• Demonstrated 2+ years of professional expertise in geospatial analysis, data engineering, and developing production-ready data pipelines is required.
• Advanced proficiency in Python and its geospatial packages, with a proven ability to write optimized, scalable code and independently debug complex programs is required.
• Expert-level experience in Python-based Google Earth Engine (GEE) programming, including building and deploying complex GEE-based data processing and visualization applications is required.
• Knowledge and experience deploying and managing data processing workloads on cloud-based computing platforms (e.g., Google Cloud Platform, Azure) is required.
• Knowledge and understanding of key issues and modeling challenges for climate and environmental data is an asset.
• Strong interpersonal skills with internal and external stakeholders.
• Excellent verbal and written communication skills.
• Excellent data management skills (beyond basic Excel, including proficiency with data version control and database concepts).
• Knowledge of climate, environment, and disaster risk reduction concepts and frameworks is an asset.
• Familiarity with analytical frameworks and approaches to children and young people's rights, gender, and inclusion is an asset.
• Fluent in spoken and written English. Additional official UN working languages are a plus.
Requirements:
Completed profile in UNICEF's e-Recruitment system, and:
- Upload a copy of academic credentials
- Financial proposal that will include/reflect:
the cost of each deliverable and the total lump sum for the whole assignment (in US$) to undertake the terms of reference;
travel costs and daily subsistence allowance, if internationally recruited or if travel is required as per the TOR;
any other estimated costs: visa, health insurance, and living costs, as applicable.
Indicate your availability
- Any emergent/unforeseen duty travel and related expenses will be covered by UNICEF.
- At the time the contract is awarded, the selected candidate must have in place current health insurance coverage.
- Payment of professional fees will be based on submission of agreed satisfactory deliverables. UNICEF reserves the right to withhold payment in case the deliverables submitted are not up to the required standard or in case of delays in submitting the deliverables on the part of the consultant.
U.S. Visa information:
With the exception of US citizens, G4 visa holders, and Green Card holders, if the selected candidate and his/her household members reside in the United States under a different visa, the consultant and his/her household members will be required to change their visa status to G4, and the consultant's household members (spouse) will require an Employment Authorization Document (EAD) to be able to work, even if he/she was authorized to work under the visa held prior to switching to G4.
Only shortlisted candidates will be contacted and advance to the next stage of the selection process.
For every Child, you demonstrate…
UNICEF's core values of Commitment, Diversity and Integrity and core competencies in Communication, Working with People and Drive for Results.
UNICEF offers reasonable accommodation for consultants/individual contractors with disabilities. This may include, for example, accessible software, travel assistance for missions or personal attendants. We encourage you to disclose your disability during your application in case you need reasonable accommodation during the selection process and afterwards in your assignment.
UNICEF has a zero-tolerance policy on conduct that is incompatible with the aims and objectives of the United Nations and UNICEF, including sexual exploitation and abuse, sexual harassment, abuse of authority and discrimination. UNICEF also adheres to strict child safeguarding principles. All selected candidates will be expected to adhere to these standards and principles and will therefore undergo rigorous reference and background checks. Background checks will include the verification of academic credential(s) and employment history. Selected candidates may be required to provide additional information to conduct a background check.
Remarks:
Individuals engaged under a consultancy will not be considered “staff members” under the Staff Regulations and Rules of the United Nations and UNICEF's policies and procedures and will not be entitled to benefits provided therein (such as leave entitlements and medical insurance coverage). Their conditions of service will be governed by their contract and the General Conditions of Contracts for the Services of Consultants. Consultants are responsible for determining their tax liabilities and for the payment of any taxes and/or duties, in accordance with local or other applicable laws.
The selected candidate is solely responsible to ensure that the visa (applicable) and health insurance required to perform the duties of the contract are valid for the entire period of the contract. Selected candidates are subject to confirmation of fully-vaccinated status against SARS-CoV-2 (Covid-19) with a World Health Organization (WHO)-endorsed vaccine, which must be met prior to taking up the assignment. It does not apply to consultants who will work remotely and are not expected to work on or visit UNICEF premises, programme delivery locations or directly interact with communities UNICEF works with, nor to travel to perform functions for UNICEF for the duration of their consultancy contracts.
MICS Data Harmonization Enhancement and Support for Tabulator Development Consultant, Data Collection Unit, Data and Analytics Section, DATA Team, DAPM, NYHQ, remote. Req#585091
Remote
If you are a committed, creative professional and are passionate about making a lasting difference for children, the world's leading children's rights organization would like to hear from you. For 70 years, UNICEF has been working on the ground in 190 countries and territories to promote children's survival, protection and development. The world's largest provider of vaccines for developing countries, UNICEF supports child health and nutrition, good water and sanitation, quality basic education for all boys and girls, and the protection of children from violence, exploitation, and AIDS. UNICEF is funded entirely by the voluntary contributions of individuals, businesses, foundations and governments. UNICEF has over 12,000 staff in more than 145 countries.
Consultancy: MICS Data Harmonization Enhancement and Support for Tabulator Development Consultant
Duty Station: Data Collection Unit, Data and Analytics Section, DATA Team; DAPM, NYHQ
Duration: 15 Dec 2025 - 10 Nov 2026
Home/ Office Based: Remote
BACKGROUND
Purpose of Activity/ Assignment:
UNICEF, as mandated by the United Nations General Assembly, is dedicated to advocating for the rights of every child, meeting their basic needs, and creating opportunities for their full development. A cornerstone of this mission is the Multiple Indicator Cluster Surveys (MICS) program - the largest source of internationally comparable data on children and women worldwide. MICS provides vital evidence for policymaking, program design, and progress monitoring toward global development commitments, including the Sustainable Development Goals (SDGs). Covering a wide range of thematic areas, MICS remains a key instrument for evidence-based decision-making at both national and international levels.
While MICS data is publicly accessible, effective use of it often requires advanced statistical tools and expertise, which can limit its reach among policymakers and practitioners. To address this, UNICEF is developing the MICS Tabulator - an online platform designed to make MICS data easier to access, analyze, and visualize. The Tabulator will enable users to generate customized tabulations, pivot views, indicators, and visualizations directly online, without needing to download microdata or use specialized statistical software. By enhancing accessibility, the platform will empower policymakers, researchers, and development partners to leverage MICS data more effectively for informed action.
The MICS Tabulator will play a pivotal role in broadening data use and dissemination. It will ensure that key insights on the well-being of children and women are accessible to a wider audience while maintaining international comparability and rigorous quality standards. In parallel, it will strengthen national capacity by enabling National Statistical Offices (NSOs) to conduct child-focused surveys with increasing autonomy, requiring only limited technical support. Ultimately, this initiative supports UNICEF's overarching goal of advancing evidence-based policymaking and improving outcomes for children and families worldwide.
MICS surveys generate extensive datasets covering health, education, nutrition, child protection, and gender equality. Over time, adjustments to survey instruments and country-specific adaptations have led to structural variations across datasets, making cross-country and trend analyses more complex. To address these challenges, UNICEF has partnered with IPUMS at the University of Minnesota to harmonize MICS datasets across multiple rounds. This collaboration ensures consistency and comparability across countries and time, forming the backbone of the MICS Tabulator's harmonized database.
Through this partnership, IPUMS has successfully harmonized 1,207 MICS datasets, utilizing translation tables, programming files, and SPSS codebooks to map variables into a unified framework. While highly effective, the current process relies heavily on external expertise. To ensure long-term sustainability and institutional capacity within UNICEF, a consultant will be engaged to review the existing harmonization workflows, documentation, and outputs. The consultant will help define a sustainable harmonization strategy for future MICS rounds - including standardized procedures, tools, and guidance for the MICS team - and support technical collaboration with the MICS Tabulator vendor by reviewing selected parts of the codebase and providing recommendations for improvement.
Scope of Work:
The consultant will build on the harmonization work already completed by IPUMS and support UNICEF in establishing a sustainable, in-house capacity to manage and extend data harmonization for future MICS survey rounds. IPUMS has developed a comprehensive library of translation tables and scripts used to convert raw MICS data into harmonized datasets. Maintaining and adapting these tools requires specialized knowledge. The consultant will review existing processes, document them clearly, and design a streamlined and future-proof approach for continued harmonization.
The assignment requires a solid understanding of household survey methodologies, MICS-specific data processing workflows, and statistical programming tools. Working in close collaboration with UNICEF's technical teams, the consultant will ensure that the MICS Tabulator is built upon well-structured, standardized, and high-quality data - enhancing accessibility and usability for all stakeholders.
Specific Roles and Responsibilities
1. Develop Templates and Guidance for Amendments
Create templates and workflows for incorporating new variables or updates in translation tables.
Produce detailed guidance for mapping new survey-specific variables to harmonized variable names and structures.
Document illustrative examples of common amendments (e.g., new household characteristics, revised education categories).
Provide training materials or recorded walkthroughs to facilitate internal capacity building.
2. Establish Processes and Tools for Future Harmonization
Review IPUMS-produced code, translation tables, and harmonized outputs.
Develop an efficient methodology and/or software scripts for processing new datasets and aligning them with the harmonized structure.
Integrate automated validation and quality control checks into the harmonization workflow.
Deliver comprehensive technical documentation and user manuals for UNICEF staff.
Conduct training sessions or provide recorded materials to ensure sustainable knowledge transfer.
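To make the second work stream concrete, below is a minimal Python sketch of the kind of automated validation check that could be integrated into the harmonization workflow. The translation-table layout (source_variable, harmonized_variable, value columns) and the variable names and codes (HL4, SEX, ED5, EDLEVEL) are purely illustrative assumptions, not the actual IPUMS format:

```python
import csv
import io

# Hypothetical translation-table excerpt: maps a survey-specific variable
# to its harmonized name and recodes each of its values.
TRANSLATION_TABLE = """source_variable,harmonized_variable,source_value,harmonized_value
HL4,SEX,1,1
HL4,SEX,2,2
ED5,EDLEVEL,0,10
ED5,EDLEVEL,1,20
"""

def load_value_map(table_text):
    """Build {source_variable: (harmonized_name, {source_value: harmonized_value})}."""
    mapping = {}
    for row in csv.DictReader(io.StringIO(table_text)):
        _name, vmap = mapping.setdefault(
            row["source_variable"], (row["harmonized_variable"], {})
        )
        vmap[row["source_value"]] = row["harmonized_value"]
    return mapping

def validate_record(record, mapping):
    """Return a list of QC issues: values with no entry in the translation table."""
    issues = []
    for var, value in record.items():
        if var in mapping:
            harmonized_name, vmap = mapping[var]
            if str(value) not in vmap:
                issues.append(f"{var}={value!r} has no mapping to {harmonized_name}")
    return issues

mapping = load_value_map(TRANSLATION_TABLE)
print(validate_record({"HL4": 2, "ED5": 7}, mapping))
# -> ["ED5=7 has no mapping to EDLEVEL"]
```

Checks of this shape (unmapped codes, out-of-range values, missing variables) are what would run automatically before a new dataset is merged into the harmonized structure.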
3. Support Vendor Code Review
Provide technical support to the MICS Tabulator development vendor (Nagarro) by reviewing code related to dataset integration and harmonization.
Identify and recommend improvements to enhance efficiency, maintainability, and alignment with UNICEF data standards.
Collaborate with the vendor's development team to ensure smooth integration of harmonized datasets, contributing to selected code modules where necessary.
4. MICS Standard CAPI Listing and Mapping Application
Develop a CSPro-based application for preparing MICS standard CAPI listings and mapping.
Ensure the application automates key steps such as extracting data from survey inputs, mapping variables to the standardized structure, and performing basic validation checks to reduce manual work and errors.
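The deliverable itself is a CSPro application, but the kind of basic listing check it would automate can be sketched in Python. The record layout here (cluster, household number, head name) is a simplification for illustration only:

```python
from collections import Counter

# Illustrative household-listing records: (cluster, household_number, head_name)
LISTING = [
    (101, 1, "A"),
    (101, 2, "B"),
    (101, 2, "C"),  # duplicate household number within cluster 101
    (102, 1, "D"),
]

def basic_listing_checks(rows):
    """Flag household numbers that appear more than once within a cluster."""
    issues = []
    counts = Counter((cluster, hh) for cluster, hh, _ in rows)
    for (cluster, hh), n in counts.items():
        if n > 1:
            issues.append(f"cluster {cluster}: household {hh} listed {n} times")
    return issues

print(basic_listing_checks(LISTING))
# -> ["cluster 101: household 2 listed 2 times"]
```

In the actual CSPro application, equivalent logic would run at data-entry time so that listing errors are caught before field teams move on.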
Terms of Reference / Key Deliverables:
Work Assignment Overview/Deliverables and Outputs/Delivery deadline
1. Amendment Templates, Guidance, and Training Materials
- Standardized template with a detailed, step-by-step workflow for incorporating new variables or updating existing entries in translation tables for harmonized datasets, including instructions, example entries, and a validation checklist.
- Documented template with illustrative examples of workflows for adding new variables or updating translation tables for harmonized datasets.
- Prepared training materials and/or recorded walkthroughs to facilitate internal capacity building of the MICS team
30 Apr 2026
2. Harmonization Process Package
- Review report of IPUMS-produced code, translation tables, and harmonized outputs to ensure alignment with UNICEF standards.
- Methodology documentation and 3-5 software scripts for processing new datasets and harmonizing them with the established structure.
- Automated validation and quality control checks integrated into the harmonization workflow.
- Documented technical guidance and user manuals for UNICEF staff.
- At least 3 training sessions and/or recorded materials to ensure sustainable knowledge transfer.
30 Sept 2026
3. MICS Tabulator Vendor Code Review
- Code review report for MICS Tabulator modules related to dataset integration and harmonization.
- Written document with recommendations to enhance efficiency, maintainability, and alignment with UNICEF data standards.
- Final report and supporting documentation of contributions and collaboration with the vendor's development team for the integration of harmonized datasets
10 Nov 2026
4. MICS Standard CAPI Listing and Mapping Application
- Standard MICS CAPI listings and mapping application developed in CSPro
30 May 2026
Travel: One trip may be undertaken to meet with the MICS Tabulator vendor for coordination or progress review, if required and approved in advance by UNICEF.
Qualifications
Education:
A degree in Information Technologies, Statistics, Demography, or another related technical field, with expertise in data management.
Language Proficiency:
Good communication skills in English
Knowledge/Expertise/Skills required:
At least a Master's Degree or equivalent in Information Technologies, Statistics, Demography, or any other related technical field with expertise in data management.
Minimum ten years' working experience in data processing in household surveys, preferably with prior MICS or DHS data processing experience.
Expertise in programming with CSPro.
Expertise in programming with SPSS and R.
Strong IT and software development skills, including experience in reviewing and understanding vendor code for project development and integration.
Excellent interpersonal skills
Requirements:
Completed profile in UNICEF's e-Recruitment system, and:
- Upload a copy of academic credentials.
- Financial proposal that will include/reflect:
- the costs per deliverable and the total lump sum for the whole assignment (in US$) to undertake the terms of reference;
- travel costs and daily subsistence allowance, if internationally recruited or if travel is required as per the TOR;
- any other estimated costs: visa, health insurance, and living costs, as applicable;
- your availability.
- Any emergent / unforeseen duty travel and related expenses will be covered by UNICEF.
- At the time the contract is awarded, the selected candidate must have in place current health insurance coverage.
- Payment of professional fees will be based on submission of agreed satisfactory deliverables. UNICEF reserves the right to withhold payment in case the deliverables submitted are not up to the required standard or in case of delays in submitting the deliverables on the part of the consultant.
U.S. Visa information:
With the exception of the US Citizens, G4 Visa and Green Card holders, should the selected candidate and his/her household members reside in the United States under a different visa, the consultant and his/her household members are required to change their visa status to G4, and the consultant's household members (spouse) will require an Employment Authorization Card (EAD) to be able to work, even if he/she was authorized to work under the visa held prior to switching to G4.
Only shortlisted candidates will be contacted and will advance to the next stage of the selection process.
For every Child, you demonstrate…
UNICEF's core values of Commitment, Diversity and Integrity and core competencies in Communication, Working with People and Drive for Results. View our competency framework at: Here
UNICEF offers reasonable accommodation for consultants/individual contractors with disabilities. This may include, for example, accessible software, travel assistance for missions or personal attendants. We encourage you to disclose your disability during your application in case you need reasonable accommodation during the selection process and afterwards in your assignment.
UNICEF has a zero-tolerance policy on conduct that is incompatible with the aims and objectives of the United Nations and UNICEF, including sexual exploitation and abuse, sexual harassment, abuse of authority and discrimination. UNICEF also adheres to strict child safeguarding principles. All selected candidates will be expected to adhere to these standards and principles and will therefore undergo rigorous reference and background checks. Background checks will include the verification of academic credential(s) and employment history. Selected candidates may be required to provide additional information to conduct a background check.
Remarks:
Individuals engaged under a consultancy will not be considered “staff members” under the Staff Regulations and Rules of the United Nations and UNICEF's policies and procedures and will not be entitled to benefits provided therein (such as leave entitlements and medical insurance coverage). Their conditions of service will be governed by their contract and the General Conditions of Contracts for the Services of Consultants. Consultants are responsible for determining their tax liabilities and for the payment of any taxes and/or duties, in accordance with local or other applicable laws.
The selected candidate is solely responsible for ensuring that the visa (if applicable) and health insurance required to perform the duties of the contract are valid for the entire period of the contract. Selected candidates are subject to confirmation of fully-vaccinated status against SARS-CoV-2 (Covid-19) with a World Health Organization (WHO)-endorsed vaccine, which must be met prior to taking up the assignment. This requirement does not apply to consultants who will work remotely and are not expected to work on or visit UNICEF premises, programme delivery locations or directly interact with communities UNICEF works with, nor to travel to perform functions for UNICEF for the duration of their consultancy contracts.
Data Science Architect, Member Experience
Remote
A bit about this role
As a Data Science Architect on Devoted's Member Experience team, you'll join a small, senior, high-trust group at the intersection of data science, operations, and AI infrastructure. Your work will directly shape how our members experience care, from the conversations they have with our guides to the technology that supports those interactions.
You'll design, build, and evolve the data and ML systems that help our service teams deliver empathetic, efficient, and intelligent support. Your work will span applied machine learning, LLM evaluation, forecasting, and data architecture, always with a focus on measurable business impact and ethical use of automation.
This high-impact role reports into Devoted's Data Science team and partners closely with Product Managers, Software Engineers, and Member Experience leaders to solve high-leverage problems for the company. You'll help define the technical and strategic direction for how Devoted uses data to understand and improve every member interaction.
In this role, you will:
Design first-class scorecards for our Member Service Guides - applying data and behavioral science to reward quality, empathy, and resolution, not just efficiency metrics.
Own and evolve our internal workforce management platform, helping us move from reactive scheduling to predictive staffing, improving cost, service quality, and employee satisfaction.
Drive Devoted's LLM platform infrastructure, developing intelligent agents, evaluation frameworks, and observability tooling to ensure our AI systems are explainable, measurable, and trustworthy.
Own and advance forecasting models that underpin key business decisions - from staffing and budgets to member growth projections - turning data into foresight, not just hindsight.
Collaborate deeply with engineering partners to ensure scalable, maintainable data pipelines and feature stores, while maintaining fast feedback loops with business stakeholders.
Mentor and learn from exceptional peers, fostering a culture that values curiosity, rigor, and speed in equal measure.
Required skills and experience
7+ years as a senior technical individual contributor in data science or ML architecture roles.
Strong background in Python and SQL, with fluency in modern data ecosystems (Snowflake, dbt, Looker).
Experience designing, building, and deploying machine learning models and LLM-based systems in production environments.
Demonstrated ability to translate ambiguous business problems into technical solutions with measurable impact.
Comfort building and optimizing scalable data pipelines, and performing deep-dive statistical and predictive analysis.
Excellent communication skills and the ability to partner effectively with non-technical stakeholders.
Desired skills and experience:
Member service operations, workforce management, or customer experience analytics.
Healthcare or regulated data environments, especially Medicare Advantage.
LLM evaluation frameworks, human-in-the-loop design, or AI ethics initiatives.
Modern data architecture patterns (event-driven design and feature stores)
Why this role is different
Short path to impact. You'll work with world-class data and application engineers, no months lost to plumbing pipelines.
Ethics aren't a footnote. You'll set the standard for fairness, explainability, and responsible automation across our AI stack.
Senior peers, no bureaucracy. You'll join a team that runs fast, thinks deeply, and values humility as much as brilliance.
Flexibility by design. Whether your superpower lies in modeling, architecture, or leadership, this role flexes to amplify it.
Salary range: $163,000-$243,000 base salary
#LI-TM1
Our ranges are purposefully broad to allow for growth within the role over time. Once the interview process begins, your talent partner will provide additional information on the compensation for the role, along with additional information on our total rewards package. The actual base salary offered may depend on a variety of factors, including the qualifications of the individual applicant for the position, years of relevant experience, specific and unique skills, level of education attained, certifications or other professional licenses held, and the location in which the applicant lives and/or from which they will be performing the job.
Our Total Rewards package includes:
Employer sponsored health, dental and vision plan with low or no premium
Generous paid time off
$100 monthly mobile or internet stipend
Stock options for all employees
Bonus eligibility for all roles excluding Director and above; Commission eligibility for Sales roles
Parental leave program
401K program
And more....
*Our total rewards package is for full time employees only. Intern and Contract positions are not eligible.
Healthcare equality is at the center of Devoted's mission to treat our members like family. We are committed to a diverse and vibrant workforce.
At Devoted Health, we're on a mission to dramatically improve the health and well-being of older Americans by caring for every person like family. That's why we're gathering smart, diverse, and big-hearted people to create a new kind of all-in-one healthcare company - one that combines compassion, health insurance, clinical care, service, and technology - to deliver a complete and integrated healthcare solution that delivers high quality care that everyone would want for someone they love. Founded in 2017, we've grown fast and now serve members across the United States. And we've just started. So join us on this mission!
Devoted is an equal opportunity employer. We are committed to a safe and supportive work environment in which all employees have the opportunity to participate and contribute to the success of the business. We value diversity and collaboration. Individuals are respected for their skills, experience, and unique perspectives. This commitment is embodied in Devoted's Code of Conduct, our company values and the way we do business.
As an Equal Opportunity Employer, the Company does not discriminate on the basis of race, color, religion, sex, pregnancy status, marital status, national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. Our management team is dedicated to this policy with respect to recruitment, hiring, placement, promotion, transfer, training, compensation, benefits, employee activities and general treatment during employment.
Data Architect
Portland, ME jobs
Description & Requirements
Maximus is looking for an experienced Data Architect to lead the design and implementation of modern, scalable DevOps solutions. In this role, you'll drive automation, containerization, and continuous delivery practices across enterprise systems.
If you're a technical expert with a passion for innovation, collaboration, and building high-performing environments, join us and help shape the future of digital transformation.
***This is a fully remote position. Requires 10% travel. 100% mileage reimbursed at federal rate***
Why Join Maximus?
- Competitive Compensation - Quarterly bonuses based on performance included!
- Comprehensive Insurance Coverage - Choose from various plans, including Medical, Dental, Vision, Prescription, and partially funded HSA. Additionally, enjoy Life insurance benefits and discounts on Auto, Home, Renter's, and Pet insurance.
- Future Planning - Prepare for retirement with our 401K Retirement Savings plan and Company Matching.
- Unlimited Time Off Package - Enjoy UTO, Holidays, and sick leave.
- Holistic Wellness Support - Access resources for physical, emotional, and financial wellness through our Employee Assistance Program (EAP).
- Recognition Platform - Acknowledge and appreciate outstanding employee contributions.
- Tuition Reimbursement - Invest in your ongoing education and development.
- Employee Perks and Discounts - Additional benefits and discounts exclusively for employees.
- Maximus Wellness Program and Resources - Access a range of wellness programs and resources tailored to your needs.
- Professional Development Opportunities - Participate in training programs, workshops, and conferences.
Essential Duties and Responsibilities:
- Define, develop, and implement the configuration management system which supports the enterprise software development life cycle (SDLC).
- Manage source code within the Version Control System (branching, sync, merge, etc.), compile, assemble, and package software from source code; mentor less senior team members in this discipline.
- Work with client to perform and validate installations, upgrades, deployments, and containers.
- Define and provide guidance on standards and best practices.
- Develop automation scripts for build, deployment, and versioning activities; mentor less senior team members in this discipline.
- Research and resolve technical problems associated with the version control and continuous integration systems.
- Typically responsible for providing guidance, coaching, and training to other employees within job area.
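As a toy illustration of the versioning automation these duties cover (a sketch under a simple semantic-versioning assumption, not Maximus's actual tooling), a build script might include a helper like:

```python
def bump_version(version, part="patch"):
    """Increment a semantic version string, e.g. '1.4.2' -> '1.4.3'."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

print(bump_version("1.4.2"))            # -> 1.4.3
print(bump_version("1.4.2", "minor"))   # -> 1.5.0
```

In practice such a helper would be wired into the CI pipeline so release tags are computed and applied automatically rather than by hand.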
Minimum Requirements
- Bachelor's degree in relevant field of study and 7+ years of relevant professional experience required, or equivalent combination of education and experience.
- Database management experience preferred
- M.M.I.S. experience preferred
- Data conversion experience preferred
- Technical leadership experience preferred
- Technical oversight experience preferred
#LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process - including accessing job postings, completing assessments, or participating in interviews - please contact People Operations at **************************.
Minimum Salary: $150,000.00
Maximum Salary: $175,000.00
Data Architect
Manchester, NH jobs
Description & Requirements
Maximus is looking for an experienced Data Architect to lead the design and implementation of modern, scalable DevOps solutions. In this role, you'll drive automation, containerization, and continuous delivery practices across enterprise systems.
If you're a technical expert with a passion for innovation, collaboration, and building high-performing environments, join us and help shape the future of digital transformation.
***This is a fully remote position. Requires 10% travel. 100% mileage reimbursed at federal rate***
Why Join Maximus?
- Competitive Compensation - Quarterly bonuses based on performance included!
- Comprehensive Insurance Coverage - Choose from various plans, including Medical, Dental, Vision, Prescription, and partially funded HSA. Additionally, enjoy Life insurance benefits and discounts on Auto, Home, Renter's, and Pet insurance.
- Future Planning - Prepare for retirement with our 401K Retirement Savings plan and Company Matching.
- Unlimited Time Off Package - Enjoy UTO, Holidays, and sick leave.
- Holistic Wellness Support - Access resources for physical, emotional, and financial wellness through our Employee Assistance Program (EAP).
- Recognition Platform - Acknowledge and appreciate outstanding employee contributions.
- Tuition Reimbursement - Invest in your ongoing education and development.
- Employee Perks and Discounts - Additional benefits and discounts exclusively for employees.
- Maximus Wellness Program and Resources - Access a range of wellness programs and resources tailored to your needs.
- Professional Development Opportunities - Participate in training programs, workshops, and conferences.
Essential Duties and Responsibilities:
- Define, develop, and implement the configuration management system which supports the enterprise software development life cycle (SDLC).
- Manage source code within the Version Control System (branching, sync, merge, etc.), compile, assemble, and package software from source code; mentor less senior team members in this discipline.
- Work with client to perform and validate installations, upgrades, deployments, and containers.
- Define and provide guidance on standards and best practices.
- Develop automation scripts for build, deployment, and versioning activities; mentor less senior team members in this discipline.
- Research and resolve technical problems associated with the version control and continuous integration systems.
- Typically responsible for providing guidance, coaching, and training to other employees within job area.
Minimum Requirements
- Bachelor's degree in relevant field of study and 7+ years of relevant professional experience required, or equivalent combination of education and experience.
- Database management experience preferred
- M.M.I.S. experience preferred
- Data conversion experience preferred
- Technical leadership experience preferred
- Technical oversight experience preferred
#LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process - including accessing job postings, completing assessments, or participating in interviews - please contact People Operations at **************************.
Minimum Salary: $150,000.00
Maximum Salary: $175,000.00
Associate Architect - Oracle PaaS Administrator
Concord, NH jobs
*****CANDIDATE MUST BE US Citizen (due to contractual/access requirements)*****
We are seeking a highly skilled and experienced Oracle Platform as a Service (PaaS) Administrator to join our dynamic team. This pivotal role involves leading the definition and design of complex Oracle Financial PaaS processes and functions, facilitating the development of sophisticated enterprise business solutions, and contributing to strategic initiatives. The successful candidate will play a critical role in ensuring the robust, secure, and efficient operation of our Oracle PaaS environment, balancing functional requirements with service quality and adherence to enterprise policies and security standards. This role also involves providing leadership and mentorship in areas of expertise and architecture to peers, developers, management, and business users.
**Key Responsibilities:**
+ Lead the design and definition of complex Oracle Financial PaaS processes and functions.
+ Facilitate the development of advanced enterprise business solutions utilizing Oracle PaaS.
+ Contribute to enterprise strategy development, including opportunity identification and business innovation.
+ Select and ensure the effective application of appropriate design standards, methods, and tools.
+ Review application designs to ensure optimal technology selection, efficient resource utilization, and seamless system integration.
+ Ensure system architecture adheres to functional, service quality, security, and enterprise policy standards.
+ Participate in customer walkthroughs, technical reviews, problem resolution, and decision-making processes.
+ Provide leadership and mentorship to peers, developers, management, and business users on Oracle PaaS architecture and best practices.
+ Manage Oracle Cloud Infrastructure (OCI) resources, including provisioning and maintaining compute, storage (Object Storage, Block Volumes), and networking components (VCNs, subnets, NSGs, security lists).
+ Administer and configure Oracle PaaS services such as Oracle Integration Cloud (OIC), Autonomous Database (ADW, ATP), Oracle Analytics Cloud, and Oracle FDI, ensuring secure and efficient operation.
+ Implement and manage Identity and Access Management (IAM) through IDCS or OCI IAM, including role setup, policies, single sign-on (SSO), and application/user provisioning.
+ Conduct proactive monitoring, performance tuning, and cost optimization of Oracle PaaS environments.
+ Implement and enforce security best practices, including encryption, patch management, vulnerability scanning, backup/recovery, access audits, Cloud Guard, and Data Safe, ensuring SOX compliance.
+ Provide frontline support for incident management, diagnosing and resolving platform issues, coordinating with IT teams and vendors, and documenting operational processes.
+ Develop and maintain automation scripts (Shell, Python) for streamlining tasks, ensuring peer review and version control.
+ Maintain comprehensive technical documentation, oversee licensing, manage change control, and develop recovery plans.
+ Collaborate effectively with developers, analysts, and security teams, and potentially mentor junior staff.
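The IAM bullet above covers role setup and policy management in IDCS or OCI IAM. As an illustrative sketch only, the helper below checks a statement against a simplified form of OCI's policy grammar; real policies also allow dynamic groups, services, where-clauses, and other subject and location forms, none of which are modeled here:

```python
import re

# Simplified grammar for one common OCI IAM policy form:
#   Allow group <name> to <verb> <resource-type> in tenancy
#   Allow group <name> to <verb> <resource-type> in compartment <name>
# This is deliberately incomplete; it only validates the shape above.
_POLICY_RE = re.compile(
    r"^Allow\s+group\s+[\w.\-]+\s+to\s+"
    r"(inspect|read|use|manage)\s+"          # the four OCI permission verbs
    r"[\w\-]+\s+in\s+(tenancy|compartment\s+[\w.\-]+)$",
    re.IGNORECASE,
)

def is_valid_policy(statement: str) -> bool:
    """Return True if the statement matches the simplified grammar."""
    return bool(_POLICY_RE.match(statement.strip()))
```

A check like this is useful as a pre-commit gate when policy statements are kept in version control, catching malformed statements before they are applied.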
**Core Skills and Experience:**
+ **Oracle Cloud Infrastructure (OCI) Expertise:** Compute, Storage (Object Storage, Block Volumes), Networking (VCNs, subnets, NSGs).
+ **Oracle PaaS Services:** Oracle Integration Cloud (OIC), Oracle Data Integrator (ODI), Identity Cloud Service (IDCS), Autonomous Database (ADW, ATP), Oracle Analytics Cloud (OAC), Visual Builder Cloud Service (VBCS), APEX, WebLogic.
+ **Database Administration:** Oracle Database administration and data transformation experience.
+ **Scripting & Automation:** Proficiency in Shell/Bash scripting and Python. Java experience is a plus.
+ **Security & Compliance:** IAM/Policy configuration, encryption, patching strategies, SOX compliance, and audit experience.
+ **APIs & Integrations:** Experience with REST APIs and FDI SOAP.
+ **Performance Optimization:** Proven ability in performance tuning, health checks, and cost tracking.
+ **Operational Excellence:** Strong skills in incident triaging, technical documentation, backup strategies, and disaster recovery.
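The scripting and cost-optimization responsibilities above often take the form of small audit scripts. The sketch below is a hypothetical example: it flags compute instances missing required cost-tracking freeform tags, operating on plain dicts shaped like the fields an OCI list-instances call returns so it runs without the SDK. The tag names are assumptions, not a real convention:

```python
# Assumed tagging convention; real conventions vary per tenancy.
REQUIRED_TAGS = {"CostCenter", "Environment"}

def untagged_instances(instances):
    """Return names of instances missing any required freeform tag.

    `instances` is a list of dicts with `name` and `freeform_tags`
    keys, mirroring fields a list-instances response would carry;
    plain dicts keep the sketch runnable without the OCI SDK.
    """
    flagged = []
    for inst in instances:
        tags = set(inst.get("freeform_tags", {}))
        if not REQUIRED_TAGS <= tags:
            flagged.append(inst["name"])
    return flagged
```

Under peer review and version control, a script like this can run on a schedule and feed its findings into the same incident or cost-reporting workflow the posting describes.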
**ESSENTIAL RESPONSIBILITIES**
+ Assists in providing strategic consultation to business customers in defining or designing less complex business processes, functions and organizational structures, as well as in researching, identifying and internally marketing enabling technologies based on customer capability requirements. Facilitates development of enterprise business solutions that combine knowledge of particular business processes and issues, general technological options, and process facilitation techniques. Participates in enterprise strategy development, including environmental analysis, opportunity identification, value cases and business innovation portfolio development.
+ Assists in specifying and designing less complex systems, solutions, networks, infrastructure elements, or processes. Selects appropriate design standards, methods and tools and ensures that they are applied effectively. Reviews others' system design to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technology. Establishes policy for selection of architecture components. Evaluates and undertakes impact analysis on major design options. Ensures that the system architecture balances functional, service quality and systems management requirements.
+ Assists in using appropriate tools, including models of components and interfaces, to contribute to the development of architectures. Produces detailed component requirements, specifications and translates these into detailed solutions/designs for implementation using selected products. Provides advice on technical aspects of system development, integration (including requests for changes, deviations from specifications, etc.) and processes. Ensures that relevant technical and business strategies, policies, standards and practices are applied correctly.
+ Assists in selecting and using tools and methods to establish, clarify, and communicate the functional and non-functional requirements of system users, their characteristics, and tasks. Identifies the technical, organizational, and physical environment in which less complex products or systems will operate. Identifies, proposes, initiates, and leads improvement programs, taking responsibility for the quality and appropriateness of the work performed and the realization of measurable business benefits. Modifies existing process improvement approaches and/or develops new approaches to achieving improvement.
+ Assists in ensuring the resolution of a variety of architecture and business problems and serves as a technical or business resource for less complex project initiatives.
+ Communicates effectively with all levels of the organization.
+ Manages expectations of customers, partners, and management.
+ Participates in customer walkthroughs and planning, design and technical walkthroughs, and problem resolution and decision-making.
+ Interacts with departments across the organization as necessary, including the development and interpretation of less complex requirements for peers and other staff.
+ Maintains an in-depth knowledge of specific technical aspects in area of expertise and provides advice regarding their application. The area of specific expertise may be any aspect of information or communication technology, technique, method, process, product, or application area.
+ Provides leadership in the areas of expertise and architecture to their peers, developers, management and business users including technical expertise, coaching, and ad-hoc training by:
+ Preparing presentations on less complex issues in the area of expertise
+ Presenting to their peers to ensure consistency with Highmark's strategic direction.
+ Other duties as assigned or requested.
**EDUCATION**
**Required**
+ Bachelor's Degree in Information Technology or related field
**Substitutions**
+ 6 years of related experience in lieu of a 4-year degree
**Preferred**
+ Master's Degree
**EXPERIENCE**
**Required**
+ None
**Preferred**
+ Health insurance industry business knowledge
**LICENSES or CERTIFICATIONS**
**Required**
+ None
**Preferred**
+ Industry certifications
**SKILLS**
An Architect is not required to have experience with all of these skills, but must have those needed to support the applications they are responsible for. Skill sets are reviewed every other year; new skills may be required to meet changing business needs.
+ Skills:
+ IMS, DB2, Oracle and Teradata Databases, Data Warehousing
+ COBOL, Visual Basic, C/C++, SAS
+ Java/JavaScript Framework
+ PEGA, CSS3, Mobile, JSON, Cognos, Hadoop, SQL, J2EE, HTML5/XML
+ Project Management Tools:
+ Waterfall
+ Agile
+ Certification in application areas such as:
+ Java Developer
+ DB2, Cognos, PEGA, Sun Certified Enterprise Architect (SCEA), Project Management
**PHYSICAL, MENTAL DEMANDS and WORKING CONDITIONS**
**Position Type**
Office-based
+ Teaches / trains others regularly: Occasionally
+ Travels regularly from the office to various work sites or from site-to-site: Rarely
+ Works primarily out of the office selling products/services (sales employees): Never
+ Physical work site required: Yes
+ Lifting up to 10 pounds: Constantly
+ Lifting 10 to 25 pounds: Occasionally
+ Lifting 25 to 50 pounds: Never
**_Disclaimer:_** _The job description has been designed to indicate the general nature and essential duties and responsibilities of work performed by employees within this job title. It may not contain a comprehensive inventory of all duties, responsibilities, and qualifications required of employees to do this job._
**_Compliance Requirement_** _: This job adheres to the ethical and legal standards and behavioral expectations as set forth in the code of business conduct and company policies._
_As a component of job responsibilities, employees may have access to covered information, cardholder data, or other confidential customer information that must be protected at all times. In connection with this, all employees must comply with both the Health Insurance Portability Accountability Act of 1996 (HIPAA) as described in the Notice of Privacy Practices and Privacy Policies and Procedures as well as all data security guidelines established within the Company's Handbook of Privacy Policies and Practices and Information Security Policy._
_Furthermore, it is every employee's responsibility to comply with the company's Code of Business Conduct. This includes but is not limited to adherence to applicable federal and state laws, rules, and regulations as well as company policies and training requirements._
**Pay Range Minimum:**
$57,700.00
**Pay Range Maximum:**
$107,800.00
_Base pay is determined by a variety of factors including a candidate's qualifications, experience, and expected contributions, as well as internal peer equity, market, and business considerations. The displayed salary range does not reflect any geographic differential Highmark may apply for certain locations based upon comparative markets._
Highmark Health and its affiliates prohibit discrimination against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibit discrimination against all individuals based on any category protected by applicable federal, state, or local law.
We endeavor to make this site accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact the email below.
For accommodation requests, please contact HR Services Online at *****************************
California Consumer Privacy Act Employees, Contractors, and Applicants Notice
Req ID: J273116
Associate Architect - Oracle PaaS Administrator
Augusta, ME jobs
**Candidates must be U.S. citizens (due to contractual/access requirements).** We are seeking a highly skilled and experienced Oracle Platform as a Service (PaaS) Administrator to join our dynamic team. This pivotal role involves leading the definition and design of complex Oracle Financial PaaS processes and functions, facilitating the development of sophisticated enterprise business solutions, and contributing to strategic initiatives. The successful candidate will play a critical role in ensuring the robust, secure, and efficient operation of our Oracle PaaS environment, balancing functional requirements with service quality and adherence to enterprise policies and security standards. This role also involves providing leadership and mentorship in areas of expertise and architecture to peers, developers, management, and business users.
**Key Responsibilities:**
+ Lead the design and definition of complex Oracle Financial PaaS processes and functions.
+ Facilitate the development of advanced enterprise business solutions utilizing Oracle PaaS.
+ Contribute to enterprise strategy development, including opportunity identification and business innovation.
+ Select and ensure the effective application of appropriate design standards, methods, and tools.
+ Review application designs to ensure optimal technology selection, efficient resource utilization, and seamless system integration.
+ Ensure system architecture adheres to functional, service quality, security, and enterprise policy standards.
+ Participate in customer walkthroughs, technical reviews, problem resolution, and decision-making processes.
+ Provide leadership and mentorship to peers, developers, management, and business users on Oracle PaaS architecture and best practices.
+ Manage Oracle Cloud Infrastructure (OCI) resources, including provisioning and maintaining compute, storage (Object Storage, Block Volumes), and networking components (VCNs, subnets, NSGs, security lists).
+ Administer and configure Oracle PaaS services such as Oracle Integration Cloud (OIC), Autonomous Database (ADW, ATP), Oracle Analytics Cloud, and Oracle FDI, ensuring secure and efficient operation.
+ Implement and manage Identity and Access Management (IAM) through IDCS or OCI IAM, including role setup, policies, single sign-on (SSO), and application/user provisioning.
+ Conduct proactive monitoring, performance tuning, and cost optimization of Oracle PaaS environments.
+ Implement and enforce security best practices, including encryption, patch management, vulnerability scanning, backup/recovery, access audits, Cloud Guard, and Data Safe, ensuring SOX compliance.
+ Provide frontline support for incident management, diagnosing and resolving platform issues, coordinating with IT teams and vendors, and documenting operational processes.
+ Develop and maintain automation scripts (Shell, Python) for streamlining tasks, ensuring peer review and version control.
+ Maintain comprehensive technical documentation, oversee licensing, manage change control, and develop recovery plans.
+ Collaborate effectively with developers, analysts, and security teams, and potentially mentor junior staff.
**Core Skills and Experience:**
+ **Oracle Cloud Infrastructure (OCI) Expertise:** Compute, Storage (Object Storage, Block Volumes), Networking (VCNs, subnets, NSGs).
+ **Oracle PaaS Services:** Oracle Integration Cloud (OIC), Oracle Data Integrator (ODI), Identity Cloud Service (IDCS), Autonomous Database (ADW, ATP), Oracle Analytics Cloud (OAC), Visual Builder Cloud Service (VBCS), APEX, WebLogic.
+ **Database Administration:** Oracle Database administration and data transformation experience.
+ **Scripting & Automation:** Proficiency in Shell/Bash scripting and Python. Java experience is a plus.
+ **Security & Compliance:** IAM/Policy configuration, encryption, patching strategies, SOX compliance, and audit experience.
+ **APIs & Integrations:** Experience with REST APIs and FDI SOAP.
+ **Performance Optimization:** Proven ability in performance tuning, health checks, and cost tracking.
+ **Operational Excellence:** Strong skills in incident triaging, technical documentation, backup strategies, and disaster recovery.
**ESSENTIAL RESPONSIBILITIES**
+ Assists in providing strategic consultation to business customers in defining or designing less complex business processes, functions and organizational structures, as well as in researching, identifying and internally marketing enabling technologies based on customer capability requirements. Facilitates development of enterprise business solutions that combine knowledge of particular business processes and issues, general technological options, and process facilitation techniques. Participates in enterprise strategy development, including environmental analysis, opportunity identification, value cases and business innovation portfolio development.
+ Assists in specifying and designing less complex systems, solutions, networks, infrastructure elements, or processes. Selects appropriate design standards, methods and tools and ensures that they are applied effectively. Reviews others' system design to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technology. Establishes policy for selection of architecture components. Evaluates and undertakes impact analysis on major design options. Ensures that the system architecture balances functional, service quality and systems management requirements.
+ Assists in using appropriate tools, including models of components and interfaces, to contribute to the development of architectures. Produces detailed component requirements, specifications and translates these into detailed solutions/designs for implementation using selected products. Provides advice on technical aspects of system development, integration (including requests for changes, deviations from specifications, etc.) and processes. Ensures that relevant technical and business strategies, policies, standards and practices are applied correctly.
+ Assists in selecting and using tools and methods to establish, clarify, and communicate the functional and non-functional requirements of system users, their characteristics, and tasks. Identifies the technical, organizational, and physical environment in which less complex products or systems will operate. Identifies, proposes, initiates, and leads improvement programs, taking responsibility for the quality and appropriateness of the work performed and the realization of measurable business benefits. Modifies existing process improvement approaches and/or develops new approaches to achieving improvement.
+ Assists in ensuring the resolution of a variety of architecture and business problems and serves as a technical or business resource for less complex project initiatives.
+ Communicates effectively with all levels of the organization.
+ Manages expectations of customers, partners, and management.
+ Participates in customer walkthroughs and planning, design and technical walkthroughs, and problem resolution and decision-making.
+ Interacts with departments across the organization as necessary, including the development and interpretation of less complex requirements for peers and other staff.
+ Maintains an in-depth knowledge of specific technical aspects in area of expertise and provides advice regarding their application. The area of specific expertise may be any aspect of information or communication technology, technique, method, process, product, or application area.
+ Provides leadership in the areas of expertise and architecture to their peers, developers, management and business users including technical expertise, coaching, and ad-hoc training by:
+ Preparing presentations on less complex issues in the area of expertise
+ Presenting to their peers to ensure consistency with Highmark's strategic direction.
+ Other duties as assigned or requested.
**EDUCATION**
**Required**
+ Bachelor's Degree in Information Technology or related field
**Substitutions**
+ 6 years of related experience in lieu of a 4-year degree
**Preferred**
+ Master's Degree
**EXPERIENCE**
**Required**
+ None
**Preferred**
+ Health insurance industry business knowledge
**LICENSES or CERTIFICATIONS**
**Required**
+ None
**Preferred**
+ Industry certifications
**SKILLS**
An Architect is not required to have experience with all of these skills, but must have those needed to support the applications they are responsible for. Skill sets are reviewed every other year; new skills may be required to meet changing business needs.
+ Skills:
+ IMS, DB2, Oracle and Teradata Databases, Data Warehousing
+ COBOL, Visual Basic, C/C++, SAS
+ Java/JavaScript Framework
+ PEGA, CSS3, Mobile, JSON, Cognos, Hadoop, SQL, J2EE, HTML5/XML
+ Project Management Tools:
+ Waterfall
+ Agile
+ Certification in application areas such as:
+ Java Developer
+ DB2, Cognos, PEGA, Sun Certified Enterprise Architect (SCEA), Project Management
**PHYSICAL, MENTAL DEMANDS and WORKING CONDITIONS**
**Position Type**
Office-based
+ Teaches / trains others regularly: Occasionally
+ Travels regularly from the office to various work sites or from site-to-site: Rarely
+ Works primarily out of the office selling products/services (sales employees): Never
+ Physical work site required: Yes
+ Lifting up to 10 pounds: Constantly
+ Lifting 10 to 25 pounds: Occasionally
+ Lifting 25 to 50 pounds: Never
**_Disclaimer:_** _The job description has been designed to indicate the general nature and essential duties and responsibilities of work performed by employees within this job title. It may not contain a comprehensive inventory of all duties, responsibilities, and qualifications required of employees to do this job._
**_Compliance Requirement_** _: This job adheres to the ethical and legal standards and behavioral expectations as set forth in the code of business conduct and company policies._
_As a component of job responsibilities, employees may have access to covered information, cardholder data, or other confidential customer information that must be protected at all times. In connection with this, all employees must comply with both the Health Insurance Portability Accountability Act of 1996 (HIPAA) as described in the Notice of Privacy Practices and Privacy Policies and Procedures as well as all data security guidelines established within the Company's Handbook of Privacy Policies and Practices and Information Security Policy._
_Furthermore, it is every employee's responsibility to comply with the company's Code of Business Conduct. This includes but is not limited to adherence to applicable federal and state laws, rules, and regulations as well as company policies and training requirements._
**Pay Range Minimum:**
$57,700.00
**Pay Range Maximum:**
$107,800.00
_Base pay is determined by a variety of factors including a candidate's qualifications, experience, and expected contributions, as well as internal peer equity, market, and business considerations. The displayed salary range does not reflect any geographic differential Highmark may apply for certain locations based upon comparative markets._
Highmark Health and its affiliates prohibit discrimination against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibit discrimination against all individuals based on any category protected by applicable federal, state, or local law.
We endeavor to make this site accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact the email below.
For accommodation requests, please contact HR Services Online at *****************************
California Consumer Privacy Act Employees, Contractors, and Applicants Notice
Req ID: J273116