Master Data Coordinator
Data analyst job in Orem, UT
Bucked Up is looking for a seasoned and self-directed Master Data Specialist to help chart the company's Master Data Management roadmap, enable our consumption strategy for master data, drive data quality, and support data governance processes. The Master Data Specialist's focus is to drive innovation and deliver solutions that change the way we do business and move Bucked Up into the future, working closely with technical and business counterparts. Successful candidates will be highly self-directed, excellent problem-solvers, and able to work with all levels of the organization. Attention to detail, a strong work ethic, and excellent communication skills are required. This role is hands-on and requires both technical depth and a strong ability to understand business concepts. Candidates should have exceptional functional and technical skills in master data implementations, data engineering, data stewardship, and application integration.
The Company
DAS Labs, the owner of Bucked Up supplements, energy drinks, shots, and protein bars, has built the #1 performance pre-workout supplement in the Vitamins & Supplements Channel. We help millions of elite athletes, gym rats, and fitness enthusiasts get more from their workouts and improve their performance.
We recently launched a line of performance energy drinks and shots that quickly achieved national distribution, and we are now one of the best-performing energy drinks in the c-store channel. Our rapidly growing team is aggressive, hungry, and driven to be the best at whatever they do. If you strive for excellence, thrive on competition, and don't settle for #2, you could be a fit for our team.
Master Data Management Specialist Job Responsibilities:
Responsible for data setup of materials and finished products.
Ensures quality of master data in key systems.
Conducts data cleaning to remove old, unused, or duplicate data for better management and quicker access.
Works with business units and process experts to resolve master data issues.
Participates in projects and initiatives across multiple functional areas and regions.
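As a loose illustration of the data-cleaning duty above, the sketch below flags duplicate material records with pandas; the column names and sample values are hypothetical, not Bucked Up's actual schema.

```python
# Hypothetical sketch: flag duplicate material master records with pandas.
# Column names and sample values are illustrative only.
import pandas as pd

materials = pd.DataFrame({
    "material_id": ["M-001", "M-002", "M-003"],
    "description": ["Whey Protein 2lb", "whey protein 2lb ", "Energy Drink 16oz"],
    "vendor": ["Acme", "Acme", "BevCo"],
})

# Normalize free-text fields before comparing so near-duplicates are caught.
materials["description_norm"] = materials["description"].str.strip().str.lower()

# Mark every record after the first occurrence of a (description, vendor) pair.
materials["is_duplicate"] = materials.duplicated(
    subset=["description_norm", "vendor"], keep="first"
)

print(materials[["material_id", "is_duplicate"]])
```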
Master Data Management Specialist Skills and Qualifications:
Fanatical attention to detail.
Good analytical and problem-solving skills.
Ability to work independently and as part of a team.
Excellent communication and collaboration skills.
Ability to work on multiple projects in various stages simultaneously.
Strong Microsoft Office suite skills, especially with Excel.
Able to understand and investigate topics related to multiple business areas.
Analytically minded and methodical problem solver.
Able to efficiently prioritize work and timely inform stakeholders on the progress.
Strong written and verbal communication skills in English.
Experience with product registration systems such as WERCSmart, GS1, 1WorldSync and Syndigo a strong plus.
Master Data Management Specialist Education, Experience, & Licensing Requirements:
Bachelor's degree in a related field.
Experience working with relational databases is a plus.
Additional Information
Pay is DOE
Full-time schedule
Insurance benefits are available for eligible full-time employees. Benefits include health insurance, dental, vision, basic life insurance, an HSA, and an Employee Assistance Program.
Additional voluntary benefits include accident insurance, pet coverage, MetLaw services, and additional life insurance coverage.
Paid Holidays
PTO Available for Full-time employees
Employee Discount on Bucked Up products and apparel
ERP Data Migration Consultant
Data analyst job in Lakewood, CO
Oscar is working with a leading ERP Advisory firm that is looking for an experienced ERP Data Migration Consultant to join their team.
As the ERP Data Migration Consultant, you will be responsible for extracting, transforming, and loading legacy data into modern ERP platforms such as NetSuite, Microsoft Dynamics, Acumatica, and others. The ideal candidate is skilled in ETL processes, data mapping, cleansing, and scripting, and is comfortable collaborating directly with clients and cross-functional teams.
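As a rough illustration of the mapping and cleansing work described above, the following Python/pandas sketch renames legacy fields to an assumed ERP import layout; the file names and field names are placeholders, not any specific platform's schema.

```python
# Hypothetical sketch: map a legacy customer extract to an ERP import layout.
# Source/target field names are illustrative; real mappings come from the ERP spec.
import pandas as pd

legacy = pd.read_csv("legacy_customers.csv")  # assumed extract from the legacy system

field_map = {
    "CUSTNO": "customer_id",
    "CUSTNAME": "company_name",
    "ST": "state",
}

erp_ready = (
    legacy.rename(columns=field_map)[list(field_map.values())]
    .assign(
        company_name=lambda df: df["company_name"].str.strip().str.title(),
        state=lambda df: df["state"].str.upper(),
    )
    .drop_duplicates(subset=["customer_id"])
)

erp_ready.to_csv("erp_customer_import.csv", index=False)
```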
Key Responsibilities:
Develop and maintain ETL scripts to extract, transform, and load data between legacy and ERP systems.
Access client legacy systems and convert raw data into structured database formats.
Map source data fields to target ERP data structures.
Cleanse, verify, and validate data using advanced SQL queries to ensure accuracy and quality.
Build SQL stored procedures to convert and prepare legacy data for new ERP environments.
Document and optimize data transformation steps and processes.
Automate data processing tasks using Microsoft SQL Server tools and scripting.
Load validated and transformed data into client ERP systems.
Coordinate with Accounting, Operations, and IT teams to ensure technical processes align with business objectives.
Deliver accurate, high-quality data migration results within project timelines.
Collaborate regularly with the EAG Data Migration team and client stakeholders.
Maintain clear communication with the consulting team to support seamless project execution.
Qualifications:
Bachelor's degree in Business Administration, Information Technology, Computer Information Systems, or a related discipline.
2-4+ years of hands-on experience with SQL Server or MySQL.
Experience with Microsoft Access and application development tools.
Exposure to leading ERP systems such as NetSuite, Microsoft Dynamics, Acumatica, Infor, Epicor, Sage, Oracle, Workday, etc.
Knowledge of business processes in Accounting, Manufacturing, Distribution, or Construction.
Advanced proficiency in Microsoft Office applications (Excel, Word, PowerPoint).
Professional, approachable, and confident communication style.
Recap:
Location: Lakewood, CO (Hybrid)
Type: Full-time, Permanent
Rate: $80k - $150k annual salary dependent on relevant experience
If you think you're a good fit for the role, we'd love to hear from you!
Business Analyst
Data analyst job in Denver, CO
We are looking for an experienced Business Analyst or Product Owner to support major Contact Center transformation initiatives. The ideal candidate will have strong analytical skills, experience in telecom or customer service environments, and exposure to modern AI-driven engagement solutions.
Mandatory Requirements
Contact Center domain experience.
Strong Business Analyst skills (not a Project Manager role).
Exposure to AI, IVR, SMS, and Chat-based customer engagement solutions.
Telecom industry experience.
Qualifications
5+ years of experience as a Product Owner or Business Analyst in telecom or contact center environments.
Understanding of AI technologies and their application in customer service.
Hands-on experience with omnichannel customer engagement strategies.
Strong communication, documentation, and stakeholder management skills.
Ability to work cross-functionally with business leaders, IT teams, and external vendors.
Strong analytical, critical-thinking, and problem-solving abilities.
Business Analyst
Data analyst job in Boulder, CO
Reports To: Program Manager
The Business Analyst partners closely with the Program Manager and cross-functional teams to guide successful delivery of client and internal initiatives. This role leads requirement discovery, documents clear solution direction, and drives alignment across stakeholders. The BA helps reduce ambiguity, maintains consistent communication, and supports project execution from concept through launch.
Core Competencies
Composure
Customer Focus
Informing
Listening
Organizing
Key Responsibilities
Business Requirements & Discovery
Lead discovery with customers and internal partners to gather, validate, and prioritize requirements.
Facilitate workshops, interviews, shadowing, and surveys to establish root needs and success criteria.
Produce well-structured artifacts including project plans, user stories, workflows, acceptance criteria, and process documentation.
Maintain requirements traceability through delivery and post-launch.
Technology Solution Documentation
Translate business requirements into actionable solution documentation in partnership with delivery teams.
Develop functional specs, system context diagrams, data mappings, workflows, and integration requirements.
Clarify constraints, dependencies, and edge cases for technical teams.
Validate solution proposals against requirements and business goals.
Customer Communication & Support
Serve as a customer-facing communicator for requirements and solution alignment.
Provide timely updates across email, video calls, and messaging platforms (Slack/Teams).
Document meetings, decisions, and action items; distribute clear summaries to stakeholders.
Act as a dependable contact for scope clarification and solution behavior.
Stakeholder Alignment & Project Enablement
Support the Program Manager by identifying risks, gaps, and misalignment early.
Document and manage stakeholder expectations throughout delivery.
Coordinate reviews and approvals of requirements and solution documents.
Assist in change-control by analyzing impacts on requirements, timelines, and outcomes.
Quality, Validation & Continuous Improvement
Ensure requirements and acceptance criteria support effective test planning.
Support UAT by helping customers understand expected outcomes and triaging feedback.
Recommend process and communication enhancements based on recurring gaps.
Maintain reusable templates and documentation standards.
Key Performance Indicators
Quality and completeness of requirements
Stakeholder satisfaction
Contribution to delivery efficiency
Communication clarity and consistency
Qualifications
3+ years as a Business Analyst, Systems Analyst, or similar role supporting technology initiatives.
Demonstrated experience gathering and documenting requirements across diverse stakeholder groups.
Proven ability to translate business needs into functional/technical documentation.
Strong written communication and structured documentation skills.
Proficiency communicating through email, Zoom/Meet, and Slack/Teams.
Familiarity with agile practices and artifacts (stories, epics, acceptance criteria, backlog refinement).
Comfortable with Excel and/or Google Sheets.
High attention to detail and ability to manage multiple concurrent workstreams.
Working knowledge of common SaaS platforms (Salesforce, Workday, etc.), integrations, and data flows is a plus.
Mission-driven mindset and a collaborative, service-oriented approach.
Data Scientist
Data analyst job in Draper, UT
Job Title: Data Scientist
Job-Type: Full-Time
We are seeking a Data Scientist focused on fraud detection and prevention to join a growing fraud detection team. In this role, you will use advanced analytics, machine learning, and statistical modeling to uncover hidden fraud patterns, monitor portfolio health, and design proactive solutions that protect the business, customers, and retail partners. Your work will directly strengthen defenses, reduce fraud losses, and build customer trust.
Duties & Responsibilities:
Develop and deploy fraud detection models and strategies using Python and SQL.
Engineer fraud-specific features (e.g., velocity checks, behavioral profiling, device/IP analysis).
Analyze portfolio trends, monitor fraud risks across customer and merchant segments, and design proactive controls.
Partner with the Fraud Prevention Manager and broader fraud/data science teams to close fraud gaps.
Share insights and recommendations with leadership to influence fraud strategy and decision-making.
Support hybrid rule and machine-learning based fraud prevention platforms (e.g., Kount, CyberSource, Signifyd).
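As a hedged illustration of the feature-engineering duty above, the sketch below computes a simple velocity feature (transactions per card over a trailing 24 hours) with pandas; the column names and sample data are invented.

```python
# Hypothetical velocity-check feature: transactions per card in the trailing 24 hours.
# Column names and sample data are illustrative, not a specific schema.
import pandas as pd

txns = pd.DataFrame({
    "card_id": ["c1", "c1", "c1", "c2"],
    "txn_ts": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 10:30",
        "2024-01-02 09:30", "2024-01-01 12:00",
    ]),
    "amount": [25.0, 90.0, 15.0, 40.0],
})

# Sort by time and index on the timestamp so a time-based rolling window can be used.
txns = txns.sort_values("txn_ts").set_index("txn_ts")

# Rolling 24-hour transaction count per card, a common fraud velocity signal.
# (Sample timestamps are unique, so index alignment back onto txns is safe.)
txns["txn_count_24h"] = (
    txns.groupby("card_id")["amount"]
    .rolling("24h")
    .count()
    .reset_index(level=0, drop=True)
)

print(txns.reset_index())
```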
Required Experience & Skills:
Degree in Data Science, Mathematics, Computer Science, Statistics, or related field.
3+ years' experience programming in Python (NumPy, Pandas, Scikit-learn, XGBoost).
2+ years' experience with SQL (Snowflake experience a plus).
Knowledge of fraud typologies, attack vectors, and vulnerabilities.
Understanding of the chargeback dispute/management process.
Strong problem-solving skills with the ability to balance fraud loss, customer experience, and portfolio performance.
Ability to work independently and take ownership of solutions.
Employment Eligibility: Gravity cannot transfer or sponsor a work visa for this position. Applicants must be eligible to work in the U.S. for any employer directly (we are not open to contract or “corp to corp” agreements).
Bilingual Data Scientist (Spanish)
Data analyst job in Denver, CO
Duration: 12 month contract
***must speak Spanish and English***
Must-haves
0-2 years of experience as a Data Scientist
Proficiency in Spanish
Strong Python coding experience
Familiarity with AI models
Day to Day
Insight Global is seeking a Bilingual Data Scientist/AI Engineer for one of our clients to sit in Denver, CO. This person will join a team that is focused on implementing AI to enhance customer engagement and business usage. They will work on 5-6 projects at once and be responsible for contributing to the design and development of different AI models for different uses. For example, a current project includes a model that serves as a real-time coaching tool for call center agents. This person will spend their time creating models, fine-tuning them, leading the team, and working closely with the VP and SVP of the group. The current project is implementing new languages for the model to respond in. The team has a daily stand-up to start the day and discuss the status of the project. The day consists of roughly 20% meetings and 80% coding in Python to prompt AI models. This role will be performed 5 days a week on-site in Denver, CO.
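For a sense of what prompt-driven Python work like this can look like, here is a minimal sketch that asks an LLM to suggest a coaching response for an agent, assuming the OpenAI Python client; the model name, prompts, and transcript are placeholders, not the client's actual stack.

```python
# Rough sketch of prompting an LLM to coach a call-center agent in real time.
# Assumes the OpenAI Python client; model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript_snippet = (
    "Customer: I've been overcharged twice this month and nobody calls me back."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You coach call-center agents. Suggest one empathetic, "
                "policy-compliant next response in both Spanish and English."
            ),
        },
        {"role": "user", "content": transcript_snippet},
    ],
)

print(response.choices[0].message.content)
```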
Data Engineer
Data analyst job in Denver, CO
Data Engineer
Compensation: $80 - $90/hour, depending on experience
Inceed has partnered with a great energy company to help find a skilled Data Engineer to join their team!
Join a dynamic team where you'll be at the forefront of data-driven operations. This role offers the autonomy to design and implement groundbreaking data architectures, working primarily remotely. This position is open due to exciting new projects. You'll be collaborating with data scientists and engineers, making impactful contributions to the company's success.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from various sources ensuring consistency and reliability
Develop automation workflows and BI solutions
Mentor others and advise on data process best practices
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of data engineering experience
Experience with PI
Experience with SCADA
Experience with Palantir
Experience with large oil and gas datasets
Proficiency in Python and SQL
Hands-on experience in cloud environments (Azure, AWS, GCP)
Nice to Have Skills & Experience:
Familiarity with Apache Kafka or Flink
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Life Actuarial Solutions Analyst Senior
Data analyst job in Colorado Springs, CO
Why USAA?
At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service and trusted advice. We seek to be the #1 choice for the military community and their families.
Embrace a fulfilling career at USAA, where our core values - honesty, integrity, loyalty and service - define how we treat each other and our members. Be part of what truly makes us special and impactful.
The Opportunity
As a dedicated Life Actuarial Solutions Analyst Senior, you will join the Life Company's Modeling Operations Team. The Life Modeling Operations Team is a diverse team that supports the complex life actuarial modeling ecosystem, which consumes data from multiple sources across USAA to support actuarial functions. Your role also supports Life/Annuity/Health actuarial work through one or more of the following activities: data extraction, data transformation, validation and analysis, and system functionality oversight and integration. You will be responsible for providing technical and analytical solutions for one or more of the following functions: pricing and product development, experience studies, actuarial assumption reviews, reserve calculations, financial reporting, asset liability management, or competitive analysis.
This role is remote eligible in the continental U.S. with occasional business travel. However, individuals residing within a 60-mile radius of a USAA office will be expected to work on-site four days per week.
What you'll do:
Independently extracts, integrates and transforms data from a multitude of sources, and may identify new sources.
Reconciles and validates data accuracy, and reasonability of actuarial or financial information.
Prepares reports, reserve estimates, journal entries, financial statements, industry surveys and/or special studies, analyzes data, and recommends solutions.
Develops comprehensive and innovative solutions that impact productivity to improve actuarial tools and processes.
Resolves unique and complex issues and navigates obstacles to deliver work product.
Develops cost benefit analysis.
Provides insight to management on issues and serves as a resource to team members on escalated issues of an unusual nature.
Leads projects related to actuarial solutions including automation, IT projects, or product development initiatives.
Oversees requirement development process through testing and implementation.
Demonstrates in-depth understanding to identify and resolve issues or potential defects.
Maintains processes, procedures and tools, and ensures all regulatory requirements and internal controls are adhered to.
Works with business partners to understand key regulatory implications that impact processes, and may develop processes to comply with new or changing regulations.
May respond to audit requests and oversee coordination of responses to internal and external audits, such as Department of Insurance examinations, as well as other audit reports.
Anticipates and analyzes trends or deviations from forecast, plan or other projections.
Presents recommendations and communicates solutions to business partners and management in a clear, concise, logical and organized manner.
Ensures risks associated with business activities are effectively identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures.
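As a hedged example of the reconciliation and validation work listed above, the sketch below compares policy counts and face amounts between two assumed extracts with pandas; the file names, columns, and tolerance are illustrative only.

```python
# Minimal sketch: reconcile policy counts between an admin-system extract and
# the actuarial model input. File and column names are invented for illustration.
import pandas as pd

admin = pd.read_csv("admin_system_extract.csv")         # assumed source extract
model_input = pd.read_csv("model_input_policies.csv")   # assumed model feed

summary = pd.DataFrame({
    "admin_policies": [len(admin)],
    "model_policies": [len(model_input)],
    "admin_face_amount": [admin["face_amount"].sum()],
    "model_face_amount": [model_input["face_amount"].sum()],
})
summary["count_diff"] = summary["admin_policies"] - summary["model_policies"]
summary["face_amount_diff"] = (
    summary["admin_face_amount"] - summary["model_face_amount"]
)

# Flag anything outside a small tolerance for analyst review.
tolerance = 0.001
summary["needs_review"] = (
    summary["face_amount_diff"].abs()
    > tolerance * summary["admin_face_amount"].abs()
)
print(summary)
```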
What you have:
Bachelor's degree; OR 4 years of related experience (in addition to the minimum years of experience required) may be substituted in lieu of degree.
6 or more years of technical experience as an analyst or other relevant technical work experience.
What sets you apart:
Bachelor's degree in mathematics, computer science, statistics, economics, finance, actuarial science, or other similar quantitative field
Experience with SQL or similar programming languages
Experience working in IT for a life insurance company
Experience supporting projects for actuarial or modeling functions
Excellent verbal and written communication skills, with the ability to tailor the content for varying audiences.
Strong aptitude for problem solving and technology
Quick learner, self-starter, and ability to work well autonomously and with others.
US military experience through military service or a military spouse/domestic partner
Compensation range: The salary range for this position is: $93,770 - $168,790.
USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.).
Compensation: USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data of the position. The actual salary for this role may vary by location.
Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors.
The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job.
Benefits: At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assists employees with their professional goals.
For more details on our outstanding benefits, visit our benefits page on USAAjobs.com.
Applications for this position are accepted on an ongoing basis, and this posting will remain open until the position is filled. Interested candidates are therefore encouraged to apply the same day they view this posting.
USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Data Engineer
Data analyst job in Colorado Springs, CO
Our client is seeking a Data Engineer for a contract opportunity (with the possibility of going permanent). The Data Engineer builds and optimizes the association's data and data pipeline architecture. This includes data flow and collection, ensuring consistent architecture throughout. The incumbent's work is varied, supporting multiple teams, systems, and projects. Staying up to date with data engineering tools and technologies, including cloud-based data services, is essential for this position.
DUTIES:
Data Storage
Designs, optimizes, and maintains databases for efficient data storage and retrieval.
Manages data warehouses or data lakes to ensure accessibility and reliability of data.
Develops and maintains data models and schemas that support analytics and reporting.
Manages our Snowflake instance to provide business and regulatory reporting on our portfolio and ancillary services, from initial contact through post-loan closure.
Data Architecture
Builds and maintains data pipelines to move, transform, and load data from various sources to a centralized repository.
Optimizes data infrastructure and pipelines for speed, scalability, and cost-effectiveness.
Designs, publishes, documents, monitors, secures, and analyzes Application Programming Interfaces (APIs).
Creates ETL (Extract, Transform, Load) processes to clean, transform, and prepare data for analysis.
Data Quality
Ensures data completeness, integrity, and security through validation, monitoring, and governance practices.
Normalizes data to eliminate duplication and ensure a single source of truth.
Data Collaboration
Works closely with stakeholders to understand data needs and provide access to relevant data.
Creates documentation and provides support to help others understand and use the data infrastructure effectively.
Data Security and Confidentiality
Appropriately protects the confidentiality, security, and integrity of the Association, its employees, borrowers, and other stakeholders.
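As one possible shape of the Snowflake work described in the duties above, here is a hedged sketch that deduplicates a raw table into a clean reporting table, assuming the snowflake-connector-python package; the connection details and table names are placeholders.

```python
# Hedged sketch: build a deduplicated reporting table in Snowflake using the
# snowflake-connector-python package. Credentials and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # placeholder
    user="etl_service",          # placeholder
    password="********",         # use a secrets manager in practice
    warehouse="REPORTING_WH",
    database="LOANS",
    schema="ANALYTICS",
)

# Keep only the latest record per borrower so reports have a single source of truth.
dedupe_sql = """
CREATE OR REPLACE TABLE borrowers_clean AS
SELECT *
FROM borrowers_raw
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY borrower_id ORDER BY updated_at DESC
) = 1
"""

cur = conn.cursor()
try:
    cur.execute(dedupe_sql)
finally:
    cur.close()
    conn.close()
```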
REQUIREMENTS:
Bachelor's degree in computer science, IT, or related field
5+ years of related experience as a Data Engineer or similar role
2+ years of experience working within Snowflake or an equivalent combination of education and experience sufficient to perform the essential functions of the job.
Technical expertise with data pipelines, API management, data models, and data warehouses
Working knowledge of programming languages (e.g. Java and Python)
Hands-on experience with SQL database design
Demonstrated analytical skills
Demonstrated skill in interacting and collaborating with others
Skill in oral and written communication, sufficient to discuss a variety of job-related topics, and to effectively communicate complex topics to a variety of audiences
Skill in utilizing a systematic approach to problem solving
Skill in researching information to gain knowledge to apply to business challenges
Skill in performing a variety of duties, often changing from one task to another of a different nature
Skill in advising and guiding individuals to achieve results
Kavaliro provides Equal Employment Opportunities to all employees and applicants. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Kavaliro is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Kavaliro will take the steps to assure that people with disabilities are provided reasonable accommodations. Accordingly, if reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please respond to this posting to connect with a company representative.
GCP Data Engineer
Data analyst job in Lehi, UT
Data Pipeline Development: Design and maintain scalable ETL/ELT pipelines that process time-series signals from thermostats, weather, schedules, and device analytics.
Core Data Products: Build verified HVAC and Energy data tables (e.g., Run & Drift, Thermal Coefficients, Efficiency Drift) to serve as trusted sources for analytics, modeling, and automation.
Modernization & Quality: Refactor legacy Scala/Akka processes into PySpark or Databricks jobs, improving observability, testing, and CI/CD coverage for upstream feeds.
Integration & Streaming: Manage data sourced from Mongo-based telemetry, Kafka or Pub/Sub streams, and cloud storage (GCS) to ensure reliability and consistency.
Model Enablement: Collaborate with data scientists to generate and operationalize features supporting HVAC runtime prediction, anomaly detection, and DR optimization.
Documentation & Governance: Promote best practices for data lineage, schema documentation, and change control to prevent regressions in production systems.
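As a minimal sketch of the verified-table work described above, the following PySpark job aggregates thermostat telemetry into a daily HVAC runtime table; the column names and storage paths are assumptions, not the team's actual schema.

```python
# Minimal PySpark sketch: aggregate thermostat telemetry into a daily HVAC runtime table.
# Column names (device_id, event_ts, hvac_on_seconds) and paths are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hvac_runtime").getOrCreate()

telemetry = spark.read.parquet("gs://example-bucket/thermostat_telemetry/")  # placeholder path

daily_runtime = (
    telemetry
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("device_id", "event_date")
    .agg(F.sum("hvac_on_seconds").alias("runtime_seconds"))
)

# Write an analytics-ready table that models and dashboards can rely on.
daily_runtime.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/verified/hvac_daily_runtime/"
)
```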
Required Qualifications
3+ years of data engineering or backend data systems experience
Strong proficiency in Python, SQL, and distributed data frameworks (PySpark, Databricks)
Hands-on experience with GCP, Kafka/Pub-Sub, and data lake architecture
Ability to read and modernize legacy Scala/Akka codebases
Proven track record building production-grade pipelines that deliver analytics-ready datasets
Strong problem-solving skills in ambiguous, under-documented environments
Preferred Qualifications
Experience with ML platforms and feature engineering workflows (e.g. Vertex AI)
AI/ML application experience (LLMs, computer vision, energy forecasting models)
Background in IoT applications, protocols, and telemetry
Familiarity with specialized databases:
Graph databases (e.g. Neo4j)
Vector databases (e.g. Pinecone)
Experience with data orchestration tools (e.g. Airflow)
Background in Demand Response or home energy automation
Experience implementing data quality metrics, observability, and alerting
Track record of significant cost optimization or performance improvements
ETL Data Engineer
Data analyst job in Salt Lake City, UT
Role: ETL Data Engineer
Employment Type: Full-time
Experience: 8+ Years
We are seeking an ETL Data Engineer with strong experience in building and supporting large-scale data pipelines. The role involves designing, developing, and optimizing ETL processes using tools like DataStage, SQL, Python, and Spark. You will work closely with architects, engineers, and business teams to create efficient data solutions. The job includes troubleshooting issues, improving performance, and handling data migration and transformation tasks. You will also support Test, QA, and Production environments while ensuring smooth deployments. Strong skills in databases, scripting, and version control are essential for this position.
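For illustration, a single stage of such a pipeline might look like the Python/pandas sketch below, which reads a delimited extract, standardizes types, and separates reject rows; file and column names are placeholders, not a specific DataStage job.

```python
# Illustrative Python/pandas step for one stage of a larger ETL pipeline:
# read a delimited extract, standardize types, and write a load-ready file.
# File names and columns are placeholders.
import pandas as pd

raw = pd.read_csv("orders_extract.txt", sep="|", dtype=str)

clean = (
    raw.assign(
        order_date=pd.to_datetime(raw["order_date"], errors="coerce"),
        amount=pd.to_numeric(raw["amount"], errors="coerce"),
    )
    .dropna(subset=["order_id"])
    .drop_duplicates(subset=["order_id"])
)

# Set aside rows that failed type conversion so they can be investigated separately.
rejects = clean[clean["order_date"].isna() | clean["amount"].isna()]
clean = clean.drop(rejects.index)

clean.to_csv("orders_load_ready.csv", index=False)
rejects.to_csv("orders_rejects.csv", index=False)
```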
Responsibilities
Collaborate with architects, engineers, analysts, and business teams to develop and deliver enterprise-level data platforms that support data-driven solutions.
Apply strong analytical, organizational, and problem-solving skills to design and implement technical solutions based on business requirements.
Develop, test, and optimize software components for data platforms, improving performance and efficiency.
Troubleshoot technical issues, identify root causes, and recommend effective solutions.
Work closely with data operations teams to deploy updates into production environments.
Provide support across Test, QA, and Production environments and perform additional tasks as needed.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or a related discipline.
Strong experience in Data Warehousing, Operational Data Stores, ETL tools, and data management technologies.
8+ years of hands-on expertise in ETL (IBM DataStage), SQL, UNIX/Linux scripting, and Big Data distributed systems.
4+ years of experience with Teradata (Vantage), SQL Server, Greenplum, Hive, and delimited text data sources.
3+ years of experience with Python programming, orchestration tools, and ETL pipeline development using Python/Pandas.
Deep understanding of data migration, data analysis, data transformation, large-volume ETL processing, database modeling, and SQL performance tuning.
Experience creating DDL scripts, stored procedures, and database functions.
Practical experience with Git for version control and release processes.
Familiarity with Spark framework, including RDDs using Python or Scala.
Data Interface Engineer
Data analyst job in Greenwood Village, CO
Are you interested in leading the transformation of cancer care through putting world-leading scientific data and knowledge in the hands of doctors and other members of the medical team? Do you have a passion for solutions that empower patients to take charge of their care and bring world-class solutions to winning the cancer battle? If so, join our growing team, the company that promises to revolutionize the way cancer care is delivered.
We are seeking a highly experienced and motivated Data Interface Engineer to join our growing team of expert interface engineers. In this role, you will be responsible for developing, monitoring, and maintaining data integration pipelines and interfaces to support healthcare data systems. This includes working with industry-standard protocols like HL7, FHIR, and RESTful APIs, and resolving complex data exchange challenges across EMRs and third-party systems. This role is ideal for someone with deep technical knowledge in data integration engines (preferably Mirth or Iguana), scripting, and healthcare interoperability standards.
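As a simplified illustration of the HL7 work involved, the sketch below pulls the patient ID and name out of an HL7 v2 ADT message with plain Python string handling; production interfaces would do this inside an engine such as Mirth, and the sample message is fabricated.

```python
# Simplified sketch: extract the patient ID (PID-3) from an HL7 v2 ADT message
# using plain string handling. Real interfaces would use Mirth or an HL7 library.
hl7_message = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "202401011230||ADT^A01|MSG00001|P|2.5\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F\r"
)

# Split the message into segments keyed by segment name (MSH, PID, ...).
segments = {line.split("|")[0]: line.split("|") for line in hl7_message.split("\r") if line}

pid_fields = segments["PID"]
patient_id = pid_fields[3].split("^")[0]        # PID-3, first component
patient_name = pid_fields[5].replace("^", " ")  # PID-5, name components

print(patient_id, patient_name)  # -> 123456 DOE JANE
```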
Key Responsibilities
· Analysis, design, development, and support of data integrations with EMRs and ancillary support systems.
· Leverage your knowledge of JavaScript to build & modernize data pipelines in Mirth for connecting data sources with VieCure.
· Interpret and implement HL7 interface specifications and EMR integration requirements.
· Interpret business rules and requirements for technical systems.
· Design and develop data exchange workflows using HL7 v2/v3, FHIR, JSON/XML, and APIs.
· Troubleshoot and resolve interface issues to maintain system stability and data accuracy.
· Collaborate with internal teams and external partners to resolve technical integration issues.
· Perform data extraction, transformation, and loading (ETL) tasks for integration projects.
· Build and maintain CCD/CCDA and HL7 data mappings and ensure compliance with business rules.
· Monitor and maintain interface health and operational performance.
· Participate in pre- and post-production support for interface validation and deployment.
· Maintain clear technical documentation for development, troubleshooting, and handoff purposes.
Skills & Experience Requirements
· Bachelor's Degree in Computer Science, Information Technology, or equivalent.
· 5+ years of hands-on experience in data integration, preferably in healthcare IT.
· Expertise with HL7 v2 messages (ADT, ORM, ORU, SIU) and FHIR protocols.
· Experience with Mirth Connect integration engine preferred. Experience with Iguana considered a bonus.
· Strong proficiency in RESTful API development and data exchange logic.
· Working knowledge of Linux/UNIX and Windows environments (file systems, scripting, SFTP/FTP/HTTP).
· Familiarity with EMR data structures, healthcare ontologies, and standard coding schemes.
· Understanding of HIPAA and healthcare data security requirements.
· Knowledge of cloud computing concepts and integration strategies.
· Strong analytical and problem-solving skills with attention to detail.
· Excellent written and verbal communication skills.
· Ability to handle multiple tasks under tight deadlines and resolve conflicts diplomatically.
Preferred Qualifications
· Experience in integration across specialties like Laboratory, Oncology, and other clinical domains.
· Project management experience in scoping, implementing, and documenting integration solutions.
· Ability to analyze and improve existing data workflows for better efficiency and scalability.
If you're passionate about healthcare technology and ready to play a pivotal role in integrating complex systems with precision and care, we'd love to hear from you!
Data Engineer
Data analyst job in Denver, CO
*** W2 Contract Only - No C2C - No 3rd Parties ***
The Ash Group is hiring a Data Engineer for our client (a specialized financial services subsidiary providing dedicated new home construction financing). This is a Direct Hire role with compensation of $100,000 annually, based in Denver, CO (Hybrid setting).
This role is crucial for transforming the organization into a data-driven environment by designing and optimizing data infrastructures, migrating large-scale data to the Microsoft Azure cloud (specifically Microsoft Fabric), and leveraging expertise in AI/ML to drive decision-making.
Role Details
Compensation: Annual base salary of $100,000. (Eligible for annual bonus based on performance objectives).
Benefits: Comprehensive package including Medical, Dental, and Vision coverage. Eligibility for 401(k) Plan, Company-paid disability/basic life insurance, parental leave, tuition reimbursement, and generous PTO (up to 17 days/year for less than 10 years of service).
Duration: Direct Hire.
Location: Hybrid in Denver, CO. (Requires 1 day per week in office).
What You'll Be Doing
Design new and migrate existing large-scale data stores (from on-premises SQL Server) to the modern Microsoft Fabric-based infrastructure, including the Lakehouse and data warehouses.
Develop, code, and optimize ETL/ELT solutions and data pipelines using SQL, Python, and PySpark, focusing on data acquisition and quality.
Collaborate with data scientists to productionize ML models and integrate them seamlessly into data pipelines to deliver business impact.
Utilize and optimize modern data engineering tools like Azure Data Factory, Synapse, and Jupyter Notebooks for processing and analysis.
Provide technical expertise during the full development lifecycle, ensuring adherence to data architecture and enterprise quality standards.
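As a hedged sketch of the model-productionizing work listed above, the following Python snippet batch-scores records with a previously trained scikit-learn model inside a pipeline step; the model file, feature names, and paths are assumptions.

```python
# Hedged sketch: batch-score new loan records with a previously trained model
# inside a pipeline step. Model file, feature names, and paths are assumptions.
import joblib
import pandas as pd

model = joblib.load("models/draw_risk_model.joblib")   # trained by the data science team

features = ["loan_amount", "draws_to_date", "days_since_last_inspection"]
batch = pd.read_parquet("lakehouse/loans_daily_snapshot.parquet")

# Probability of the positive (high-risk) class from the classifier.
batch["risk_score"] = model.predict_proba(batch[features])[:, 1]

# Persist scores back to the lakehouse for downstream reporting.
batch[["loan_id", "risk_score"]].to_parquet(
    "lakehouse/loan_risk_scores.parquet", index=False
)
```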
What We're Looking For
4+ years' software engineering experience with Python, PySpark, Spark, or equivalent notebook programming.
3+ years' experience with SQL, relational databases, and large data repositories, including advanced knowledge of writing SQL and optimizing query plans.
Hands-on experience with Azure Data Factory, Azure Synapse, Data Lake, and the Microsoft Fabric environment (or strong willingness to adopt new cloud-native data platforms).
Knowledge of AI/ML, Agents, and other automation tools, including experience with ML frameworks (e.g., scikit-learn, TensorFlow) is highly preferred.
Experience with CI/CD concepts, DataOps/MLOps, and general software deployment lifecycles.
Experience participating in Agile methodologies (Scrum), with strong verbal and written communication skills to effectively collaborate with technical and non-technical stakeholders.
Apply today to join a dynamic team supporting critical infrastructure projects.
#DataEngineer #AzureCloud #DataOps #AIML #DirectHire #DenverJobs #PySpark
RevOps Analyst
Data analyst job in Heber, UT
Job Title: RevOps Analyst
Employment Type: Full-Time
Mission: Why we exist, What we do and Why we need you
RevBlack exists to revolutionize revenue operations, turning CRM systems into engines of growth and efficiency. We specialize in optimizing Marketing Ops, Sales Ops, and RevOps using tools like HubSpot and Salesforce to deliver scalable, data-driven solutions for our clients.
RevBlack is not for everyone. It's for the curious, fast learners who take ownership and thrive on high standards. We work hard, move fast, and deliver work that makes a difference. If you're seeking an easy job, this isn't it. But if you're ready to push your limits and grow, we want you.
We need a RevOps Analyst to drive client success by implementing CRM solutions, optimizing processes, and providing insights that fuel revenue growth. You will apply your technical and analytical skills in the fast-paced world of SaaS and B2B operations. You'll become a specialist in Salesforce and HubSpot to streamline workflows, support data-driven decisions, and drive both operational efficiency and client success.
What you'll own
Drive Client Revenue Growth: Implement and manage CRM solutions that lead to measurable revenue increases.
Optimize Operational Processes: Streamline revenue operations to enhance efficiency and reduce costs.
Maintain Data Integrity: Ensure CRM data accuracy for informed decision-making.
Provide Strategic Recommendations: Analyze data to offer insights that boost client success.
What we're looking for
CRM Experience: Experience or strong interest in RevOps, Sales Operations, Marketing Operations, CRM Administration, or a related professional services delivery role.
Operational Expertise: Deep knowledge of sales, marketing, and customer success operations.
Analytical Prowess: Strong skills in data analysis and process optimization.
Business Knowledge: Strong business acumen with an understanding of the SaaS business model and B2B environments.
Adaptability: Thrive in a fast-paced, dynamic environment, and a willingness to deepen technical knowledge.
Why Join RevBlack?
Be part of a dynamic and innovative team that delivers high-quality, impactful work tailored to our clients' needs.
Enjoy a flexible work environment with a hybrid modality, competitive salary, and benefits like unlimited PTO.
Take advantage of significant opportunities for career growth, with a strong focus on continuous learning and skill development.
Work in a transparent and communicative culture where clarity and collaboration are key to success.
If you're ready to make a difference in the RevOps world, apply now to join RevBlack.
GCP Data Engineer
Data analyst job in Lone Tree, CO
We are looking for an experienced Data Engineer to build and maintain scalable data pipelines on Google Cloud Platform (GCP). In this role, you will be crucial in serving our Cyber Security data mart and supporting security analytics.
Must Have:
• Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or a related field.
• 5+ years of hands-on experience with data management: gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a consumable form for visualization and data analysis.
• Strong expertise in Google BigQuery, Google Cloud Storage, Cloud Composer, and related Google Cloud Platform (GCP) services.
• Proficiency in Python and SQL for data processing and automation.
• Experience with ETL processes and data pipeline design.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration.
What you're good at
• Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, and Cloud Composer.
• Develop and optimize SQL queries to support data extraction, transformation, and loading (ETL) processes.
• Collaborate with cross-functional teams, including business customers and Subject Matter Experts, to understand data requirements and deliver effective solutions.
• Implement best practices for data quality, data governance, and data security.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
• Contribute to data architecture decisions to provide recommendations for improving the data pipeline.
• Stay up to date with emerging trends and technologies in cloud-based data engineering and cybersecurity.
• Exceptional communication skills, including the ability to gather relevant data and information, actively listen, dialogue freely, and verbalize ideas effectively.
• Ability to work in an Agile work environment to deliver incremental value to customers by managing and prioritizing tasks.
• Proactively lead investigation and resolution efforts when data issues are identified, taking ownership to resolve them in a timely manner.
• Ability to interoperate and document processes and procedures for producing metrics.
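As a small illustration of this kind of pipeline work, the sketch below materializes a daily summary table in an assumed security data mart using the BigQuery Python client; the project, dataset, and table names are placeholders.

```python
# Small sketch: materialize a daily summary table in a Cyber Security data mart
# using the BigQuery Python client. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
CREATE OR REPLACE TABLE `example-project.security_mart.daily_failed_logins` AS
SELECT user_id, DATE(event_ts) AS event_date, COUNT(*) AS failed_logins
FROM `example-project.raw_logs.auth_events`
WHERE outcome = 'FAILURE'
GROUP BY user_id, event_date
"""

job = client.query(sql)   # runs as a BigQuery job
job.result()              # wait for completion
print(f"Job {job.job_id} finished.")
```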
Data Analyst Intern, application via RippleMatch
Data analyst job in Denver, CO
This role is with RippleMatch's partner companies. RippleMatch partners with hundreds of companies looking to hire top talent.
About RippleMatch
RippleMatch is your AI-powered job matchmaker. Our platform brings opportunities directly to you by matching you with top employers and jobs you are qualified for. Tell us about your strengths and goals - we'll get you interviews! Leading employers leverage RippleMatch to build high-performing teams and Gen Z job seekers across the country trust RippleMatch to launch and grow their careers.
Requirements for the role:
Currently pursuing a Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Economics, or a related field.
Strong foundational knowledge in statistical analysis, data modeling, and data mining techniques.
Proficiency in data analysis tools and programming languages such as Python, R, SQL, or similar.
Experience with data visualization tools and software (e.g., Tableau, Power BI, or similar).
Ability to interpret complex data sets and provide actionable insights.
Excellent problem-solving skills and attention to detail.
Effective organizational and time management skills, with the ability to prioritize tasks and manage multiple projects simultaneously.
Strong communication and interpersonal skills, with the ability to collaborate effectively with team members.
Eagerness to learn and apply new techniques and tools in the field of data analysis.
2022 Summer Intern: Business Analyst
Data analyst job in Greenwood Village, CO
Spectrum
Job Description
At a Glance
You're a motivated student, rising junior or above, with a 3.0 GPA or higher, seeking a degree in one of the following areas from an accredited college or university:
Business Analytics
Data Analytics
Strategic Planning
This is a learning-intensive program designed to give you essential business insights and hands-on experience in your field of choice. It's a full-time, 10-week commitment from June 1, 2022 through August 5, 2022.
Benefits include professional development sessions, networking opportunities, and mentorship.
The Spectrum Internship Experience
You have clear aspirations and are seeking a summer internship program that will help you meet them. Find it at Spectrum, named one of the Top 100 Internship Programs in the United States by WayUp.
Our internships are designed to provide:
Opportunities to gain new skills and elevate the ones you already have, all in a robust and forward-thinking business setting.
First-rate, hands-on experience in the telecommunications industry.
Opportunities to connect with people who can give you a better understanding of the industry and help you accomplish real goals you can add to your résumé; this includes being assigned a formal mentor and interacting with senior executives.
What you can expect in this role
As a Spectrum Intern, you'll be essential to two teams - your respective department and your Intern peer group. Department and team-focused projects account for about 80% of your schedule. You'll spend the other 20% on professional development sessions and networking activities, including the Kickoff Conference on June 2, webinars, community service, cross-functional project, and final presentations.
Internship responsibilities may include
Gather, analyze, refine, validate, document and maintain complex L&D data for various reporting needs
Oversee the import and export of data from all L&D data sources used for departmental reporting to ensure data integrity is maintained, including generation and extraction of custom data reports
Create scripted automations for data extracts and to notify recipients of updated reports via email or SharePoint uploads
Create dashboards and reports using data to tell a story
Participate in conference calls with learning leaders across the organization
Present data and findings to learning leaders
Support the organization with ad hoc or critical data needs as they arise
Be flexible to the changing needs of the organization while working efficiently to meet deadlines
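One possible shape of the scripted-automation work listed above is sketched below in Python, refreshing a summary file and emailing it to stakeholders; the data source, recipients, and SMTP host are placeholders.

```python
# One possible shape of a scripted report refresh plus email notification.
# Data source, recipient list, and SMTP host are placeholders.
import smtplib
from email.message import EmailMessage

import pandas as pd

report = pd.read_csv("ld_course_completions.csv")          # assumed L&D extract
summary = report.groupby("department")["completed"].sum()
summary.to_csv("ld_completion_summary.csv")

msg = EmailMessage()
msg["Subject"] = "Updated L&D completion report"
msg["From"] = "reports@example.com"
msg["To"] = "learning-leaders@example.com"
msg.set_content("The L&D completion summary has been refreshed. See attachment.")

with open("ld_completion_summary.csv", "rb") as f:
    msg.add_attachment(
        f.read(), maintype="text", subtype="csv",
        filename="ld_completion_summary.csv",
    )

with smtplib.SMTP("smtp.example.com") as server:
    server.send_message(msg)
```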
Here's what it takes to get started
Required qualifications
Must be currently enrolled in an accredited College or University completing a Bachelor's Degree or Advanced Degree
Data Analyst Summer Intern
Data analyst job in Englewood, CO
Company Details
At Verus Specialty Insurance, a proud member of the esteemed W.R. Berkley Corporation (NYSE: WRB), we stand as a leading Excess and Surplus Lines provider delivering comprehensive solutions across the United States. Backed by the formidable strength of a Fortune 500 titan and operating with the agility of a nimble startup, we blend the best of both worlds to foster innovation and excellence in everything we do.
Our nationwide operations are supported by a robust network of select wholesale producers, ensuring that our reach and capabilities are always close at hand. We are driven by a forward-thinking leadership that champions a dynamic culture where questioning the norm is not just welcomed but expected. This ethos empowers our team to consistently surpass customer expectations and drive the industry forward.
At Verus, we are more than just a company; we are a community that thrives on collaboration, growth, and taking ownership of our actions. We are constantly on the lookout for exceptional talent who are eager to contribute, innovate, and grow with us. If you are passionate about making a mark in the insurance industry and align with our vision, we eagerly await your application. Join us and be a part of a team where your contributions are valued, and your potential fully realized!
The Company is an equal employment opportunity employer.
Responsibilities
Come join the Verus team as a summer intern! Our interns work with many different people to learn and gain hands on experience in reporting work and data. This is a great opportunity for anyone interested in starting their career. We will be hiring an intern to learn and grow in a cohort environment.
Interns are responsible for performing data analyst-related support duties while gaining valuable real-world experience which can be utilized both personally and professionally. This internship is a résumé builder and a stepping stone to a future career in the insurance industry.
Assist with developing SQL queries needed for reports or analysis.
Analyze internal and external data.
Assist with projects.
Build or revise Power BI reports that are used throughout the organization.
Qualifications
Current college student (and/or recent graduate) working towards a degree program in a STEM-related major with a 3.0 GPA or higher.
Must demonstrate excellent oral and written communication skills.
Must be willing to work collaboratively and embrace innovative ideas and processes.
Must be technology-focused and proficient in the use of a computer and its applications.
Proficiency with Power BI and SQL preferred.
#LI-FL1 #LI-INTERNSHIP
Additional Company Details
We do not accept any unsolicited resumes from external recruiting firms.
This is a 10-week, paid internship - Monday, June 1, 2026 - Thursday, August 6, 2026.
The hourly rate range based on a 37.5-hour work week is $20.00 - $22.00 an hour.
This role does not offer a benefits package, as it is a temporary, summer internship position.
Additional Requirements
The application window for this role is estimated to be open through November 28, 2025, but may be extended, if necessary. Please submit your application as soon as possible prior to November 28, 2025.
Sponsorship Details
Sponsorship not offered for this role.
Senior Forest Analyst
Data analyst job in Grand Junction, CO
At TÜV SÜD we are passionate about technology. Innovations impact our daily lives in countless ways, and we are dedicated to being a part of that progress. We test, we audit, we inspect, we advise. We never stop challenging ourselves for the safety of society and its people. We breathe technology, we strive for professional excellence, and we leave a mark. We take the future into our hands. We are TÜV SÜD.
Your Tasks
* Conduct verification, validation, confirmation, and related audit activities for forest carbon projects across programs such as the California Air Resources Board, Climate Action Reserve, Climate Forward, Verified Carbon Standard, American Carbon Registry, CCB Standards, and SD VISta.
* Perform on-site fieldwork including forest inventory audits, mensuration, check-cruising, boundary verification, harvest and silviculture assessments, and stakeholder interviews.
* Review and audit carbon quantification data, growth and yield modeling, and project documentation for accuracy and protocol compliance.
* Use modeling tools such as FVS, CBM-CFS3, Remsoft Woodstock, and other approved systems to evaluate project modeling and quantification.
* Conduct GIS analysis, cartography, spatial modeling, and mobile or online GIS field data collection to support verification and reporting.
* Prepare verification and validation reports in alignment with registry requirements.
* Provide training, guidance, and quality review for Forest Analysts and contribute to internal training materials, templates, and process improvements.
* Support timberland management work including inventory design, field data collection, appraisals, spatial analysis, and reporting.
* Coordinate with internal teams and supervisors on scheduling, resource allocation, and technical quality standards.
* Represent TÜV SÜD professionally with clients, agencies, and stakeholders, and maintain strong relationships across the forestry and carbon community.
Your Qualifications
* B.S./B.A. in Forestry or a closely related field.
* Minimum 5 years of forestry or closely related experience.
* Minimum 2 years of experience in forest carbon project development, verification or validation, registry or regulatory oversight, or related experience.
* High proficiency in forest inventory measurement tools, sampling protocols, and mensuration techniques.
* High proficiency with ESRI GIS software and mobile or online GIS platforms.
* High proficiency with Microsoft Excel and experience with database tools such as Access and R.
* Experience with forest carbon modeling software including FVS, CBM-CFS3, Remsoft Woodstock, or comparable tools.
* Ability to work safely and effectively in steep, rugged, remote terrain and in adverse weather conditions.
* Ability to navigate using GPS, maps, and compass.
* Strong written and oral communication skills.
* Valid driver's license with a clear driving record.
* Ability to obtain a state Professional Forester or SAF Certified Forester credential within one year.
* Ability to obtain required verifier credentials within one year, including Climate Action Reserve, Climate Forward, ACR, ARB Accredited Offset Verifier, and US Forest Projects Specialist.
* Ability to manage multiple complex tasks, maintain confidentiality, and produce accurate, high-quality work.
What We Offer
* Opportunity to contribute to leading forest carbon verification and sustainability initiatives.
* Global collaboration and exposure to diverse project types and international work.
* Professional development, including verifier credentialing and forestry certifications.
* Supportive environment focused on safety, integrity, and continuous learning.
Additional Information
* The anticipated annual base pay range for this full-time position is $90,000 - $120,000. Actual base pay will be determined based on various factors, including years of relevant experience, training, qualifications, and internal equity. The compensation package may also include an annual bonus target, subject to eligibility and other requirements. Additionally, we offer a comprehensive benefits package to employees, including a 401(k) plan with employer match, up to 12 weeks of paid parental leave for birthing parents and 2 weeks for other parents, health plans (medical, dental, and vision), life insurance and disability, and generous paid time off.
* Remote role with required travel to remote project locations in the US, Mexico, Canada, and occasional international travel.
* Fieldwork may include travel using helicopters, float planes, ATVs, fan boats, snowmobiles, and other terrain-access vehicles.
* Work may involve exposure to wildlife, rugged terrain, extreme weather, pollen, dust, smoke, or pesticides.
* Requires the ability to lift and move up to 50 pounds.
* Adherence to all TÜV SÜD policies related to safety, confidentiality, compliance, and professional conduct is required.
Equal Opportunity Employer - Disability and Veteran
TÜV SÜD America, Inc. is an equal opportunity, affirmative action employer and considers qualified applicants for employment without regard to race, color, creed, religion, ancestry, marital status, genetics, national origin, sex, sexual orientation, gender identity and expression, age, physical or mental disability, veteran status and those laws, directives, and regulations of Federal, State, and Local governing bodies or agencies. We participate in the E-Verify Employment Verification Program.
Data Migration Specialist (Customer Support)
Data analyst job in Denver, CO
Why Housecall Pro?
Help us build solutions that build better lives. At Housecall Pro, we show up to work every day to make a difference for real people: the home service professionals that support America's 100 million homes.
We're all about the Pro, and dedicate our days to helping them streamline operations, scale their businesses, and, ultimately, save time so they can be with their families and live well. We care deeply about our customers and foster a culture where our company, employees, and Pros grow and succeed together. Leadership is as focused on growing team members' careers as they expect their teams to be on creating solutions for Pros.
Role Overview:
As a Specialist, Data Operations at Housecall Pro, you're a meticulous data steward, ensuring the precision and completeness of our data. You are self-motivated, with the ability to work autonomously. You're adept at identifying and resolving data anomalies, diving deep to tackle root causes. Your thirst for learning and commitment to accuracy make you an invaluable asset to our data operations team.
Our team is patient, empathetic, hard working, and above all else focused on improving the lives of our service professionals (our Pros). Our success is their success.
What you'll do each day:
Analyze source and quality of data, identify potential issues and develop custom data migration action plan
Resolve data migration issues and provide technical support for the data migration process
Communicate consistent trends and opportunities to our product/engineering team for future improvements
Create and maintain internal and external process documentation
Communicate client information, trends and feedback cross-functionally
Innovate on current processes and proactively seek ways to improve the Pro experience
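As a lightweight illustration of the source-data analysis described above, the sketch below profiles an assumed customer export for common migration issues with pandas; the file and column names are invented.

```python
# Lightweight sketch: profile a customer's source export before migration to spot
# issues early (missing fields, bad dates, duplicates). Names are invented.
import pandas as pd

source = pd.read_csv("pro_customer_export.csv", dtype=str)

profile = {
    "rows": len(source),
    "missing_phone": int(source["phone"].isna().sum()),
    "duplicate_emails": int(source["email"].duplicated().sum()),
    "unparseable_dates": int(
        pd.to_datetime(source["last_job_date"], errors="coerce").isna().sum()
    ),
}

for check, count in profile.items():
    print(f"{check}: {count}")
```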
Qualifications:
Bachelor's degree preferred
2-4 years of full-time customer success, implementation, engineering or data implementation experience
Intermediate knowledge and experience with Microsoft Office Suite with proficiency in Excel or Google Sheets
Experience with Python a plus
Experience using or developing with conversational AI platforms (such as ChatGPT, GPT-based tools, or other NLP models) a plus
Demonstrated experience exceeding customer success or sales metrics
Proven success working with cross-functional teams and building strong relationships internally and externally
What will help you succeed:
Meticulous attention to detail
Excellent written/verbal communication skills
Strong critical thinking and problem-solving skills
Adaptability, drive, and a self-starting attitude
Ability to excel in a fast-paced, team environment
Founded in 2013, Housecall Pro helps home service professionals (Pros) streamline every aspect of their business. With easy-to-use tools for scheduling, dispatching, payments, and more, Housecall Pro enables Pros to save time, grow profitably, and provide best-in-class service.
Housecall Pro's brand portfolio includes Business Coaching by Housecall Pro, a business coaching solution for home services businesses. Our brands are united by a singular mission to champion our Pros to success.
We support more than 40,000 businesses and have over 1,800 ambitious, mission-driven, genuinely fun-loving employees across the United States and all over the world. If you want to do work that impacts real people, supported by a team that will invest in you every step of the way, we'd love to hear from you.
Housecall Pro celebrates diversity and we are committed to creating an inclusive environment. We are an equal opportunity employer and do not discriminate on the basis of gender, race, religion, national origin, ethnicity, disability, gender identity/expression, sexual orientation, veteran or military status, or any other category protected under the law. #LI-remote
Location Dependent information
This role is open to candidates and the expected compensation range for this role is $21.55 - $25.35 / hour + 10% variable.
The specific hourly rate for the successful candidate will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is also eligible to participate in the following Housecall Pro benefits: health care insurance (medical, dental, vision, disability), employee assistance program, 401(k), flexible time off, paid parental leave, tech reimbursement, and other company benefits. Housecall Pro is growing fast, and we're scaling our team to help enable and accelerate our growth.
Privacy Notice for California Job Candidates - Housecall Pro
#LI-Remote