ERP Data Migration Consultant
Data engineer job in Lakewood, CO
Oscar is working with a leading ERP Advisory firm that is looking for an experienced ERP Data Migration Consultant to join their team.
As the ERP Data Migration Consultant, you will be responsible for extracting, transforming, and loading legacy data into modern ERP platforms such as NetSuite, Microsoft Dynamics, Acumatica, and others. The ideal candidate is skilled in ETL processes, data mapping, cleansing, and scripting, and is comfortable collaborating directly with clients and cross-functional teams.
Key Responsibilities:
Develop and maintain ETL scripts to extract, transform, and load data between legacy and ERP systems (a minimal sketch follows this list).
Access client legacy systems and convert raw data into structured database formats.
Map source data fields to target ERP data structures.
Cleanse, verify, and validate data using advanced SQL queries to ensure accuracy and quality.
Build SQL stored procedures to convert and prepare legacy data for new ERP environments.
Document and optimize data transformation steps and processes.
Automate data processing tasks using Microsoft SQL Server tools and scripting.
Load validated and transformed data into client ERP systems.
Coordinate with Accounting, Operations, and IT teams to ensure technical processes align with business objectives.
Deliver accurate, high-quality data migration results within project timelines.
Collaborate regularly with the EAG Data Migration team and client stakeholders.
Maintain clear communication with the consulting team to support seamless project execution.
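A minimal, illustrative sketch of the kind of ETL step described above, written in Python with pandas and SQLAlchemy; the connection strings, table names, and column mappings are all hypothetical, not the firm's actual schema:

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical legacy and ERP staging connections
legacy = create_engine("mssql+pyodbc://user:pass@LegacyDSN")
target = create_engine("mssql+pyodbc://user:pass@ErpStageDSN")

# Extract: pull raw customer records from the legacy system
df = pd.read_sql("SELECT cust_no, cust_nm, addr1 FROM dbo.AR_CUSTOMER", legacy)

# Transform: map legacy fields onto the target ERP's structure and cleanse values
df = df.rename(columns={"cust_no": "CustomerID",
                        "cust_nm": "CustomerName",
                        "addr1": "Address1"})
df["CustomerName"] = df["CustomerName"].str.strip().str.title()
df = df.drop_duplicates(subset=["CustomerID"])

# Load: write to a staging table that a stored procedure can validate further
df.to_sql("stg_Customer", target, if_exists="replace", index=False)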
Qualifications:
Bachelor's degree in Business Administration, Information Technology, Computer Information Systems, or a related discipline.
2-4+ years of hands-on experience with SQL Server or MySQL.
Experience with Microsoft Access and application development tools.
Exposure to leading ERP systems such as NetSuite, Microsoft Dynamics, Acumatica, Infor, Epicor, Sage, Oracle, Workday, etc.
Knowledge of business processes in Accounting, Manufacturing, Distribution, or Construction.
Advanced proficiency in Microsoft Office applications (Excel, Word, PowerPoint).
Professional, approachable, and confident communication style.
Recap:
Location: Lakewood, CO (Hybrid)
Type: Full-time, permanent
Rate: $80k-$150k annual salary, dependent on relevant experience
If you think you're a good fit for the role, we'd love to hear from you!
Data Engineer
Data engineer job in Colorado Springs, CO
Our client is seeking a Data Engineer for a contract opportunity (with the possibility of going permanent). The Data Engineer builds and optimizes the association's data and data pipeline architecture. This includes data flow and collection, ensuring consistent architecture throughout. The incumbent's work is varied, supporting multiple teams, systems, and projects. Staying up-to-date with data engineering tools and technologies, including cloud-based data services, is essential for this position.
DUTIES:
Data Storage
Designs, optimizes, and maintains databases for efficient data storage and retrieval.
Manages data warehouses or data lakes to ensure accessibility and reliability of data.
Develops and maintains data models and schemas that support analytics and reporting.
Manages our Snowflake instance to provide business and regulatory reporting on our portfolio and ancillary services, from initial contact to post-loan closure.
Data Architecture
Builds and maintains data pipelines to move, transform, and load data from various sources to a centralized repository.
Optimizes data infrastructure and pipelines for speed, scalability, and cost-effectiveness.
Designs, publishes, documents, monitors, secures, and analyzes Application Programming Interfaces (APIs).
Creates ETL (Extract, Transform, Load) processes to clean, transform, and prepare data for analysis.
Data Quality
Ensures data completeness, integrity, and security through validation, monitoring, and governance practices.
Normalizes data to eliminate duplication and ensure a single source of truth (a brief sketch follows this section).
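As a rough illustration of the normalization duty above, here is a minimal Python/pandas sketch; the column names and data are hypothetical:

import pandas as pd

df = pd.DataFrame({
    "borrower_id": [" 101", "101", "102"],
    "email": ["A@x.com", "a@x.com ", "b@y.com"],
})

# Standardize keys first, otherwise "101" and " 101" look like different borrowers
df["borrower_id"] = df["borrower_id"].str.strip()
df["email"] = df["email"].str.strip().str.lower()

# Collapse duplicates so each borrower appears exactly once (single source of truth)
deduped = df.drop_duplicates(subset=["borrower_id"], keep="first")
print(deduped)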
Data Collaboration
Works closely with stakeholders to understand data needs and provide access to relevant data.
Creates documentation and provides support to help others understand and use the data infrastructure effectively.
Data Security and Confidentiality
Appropriately protects the confidentiality, security, and integrity of information concerning the Association, employees, borrowers, and other stakeholders.
REQUIREMENTS:
Bachelor's degree in computer science, IT, or related field
5+ years of related experience as a Data Engineer or similar role
2+ years of experience working within Snowflake, or an equivalent combination of education and experience sufficient to perform the essential functions of the job.
Technical expertise with data pipelines, API management, data models, and data warehouses
Working knowledge of programming languages (e.g. Java and Python)
Hands-on experience with SQL database design
Demonstrated analytical skills
Demonstrated skill in interacting and collaborating with others
Skill in oral and written communication, sufficient to discuss a variety of job-related topics, and to effectively communicate complex topics to a variety of audiences
Skill in utilizing a systematic approach to problem solving
Skill in researching information to gain knowledge to apply to business challenges
Skill in performing a variety of duties, often changing from one task to another of a different nature
Skill in advising and guiding individuals to achieve results
Kavaliro provides Equal Employment Opportunities to all employees and applicants. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Kavaliro is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Kavaliro will take steps to ensure that people with disabilities are provided reasonable accommodations. Accordingly, if reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please respond to this posting to connect with a company representative.
Data Engineer
Data engineer job in Denver, CO
Data Engineer
Compensation: $80 - $90/hour, depending on experience
Inceed has partnered with a great company to help find a skilled Data Engineer to join their team!
Join a dynamic team as a contract Data Engineer, where you'll be the backbone of data-driven operations. This role offers the opportunity to work with a modern tech stack in a hybrid on-prem and cloud environment. You'll design and implement innovative solutions to complex challenges, collaborating with data scientists, location intelligence experts, and ML engineers. This exciting opportunity has opened due to a new project initiative and you'll be making a tangible impact.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from multiple sources, ensuring quality and reliability
Develop automation workflows and BI solutions
Mentor others and contribute to the knowledge base
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of experience in data engineering
Experience with large oil and gas datasets
Proficiency in SQL and Python
Hands-on experience in cloud environments (Azure, AWS, or GCP)
Familiarity with Apache Kafka, Apache Flink, or Azure Event Hubs (a minimal consumer sketch follows this list)
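As a hedged illustration of the streaming familiarity asked for above, here is a minimal Kafka consumer using the kafka-python client; the topic, broker address, and record fields are hypothetical:

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "readings",                              # hypothetical topic
    bootstrap_servers=["localhost:9092"],    # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # Simple quality gate before the record enters the pipeline
    if record.get("value") is not None:
        print(record["source_id"], record["value"])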
Nice to Have Skills & Experience:
Experience with Palantir Foundry
Knowledge of query federation platforms
Experience with modern data stack tools like dbt or Airflow
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing and direct placement firm that believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Data Governance Architect
Data engineer job in Denver, CO
Job Title: Data Governance Architect
Duration: Long Term
Rate: $105/hr. on 1099
Requirements:
Minimum of 15 years of experience in technology consulting or advisory roles, with at least 10 years in a Senior Solution Architect, Lead Architect, or similar senior architectural role.
Strong understanding of solution architecture, system integration, and IT security concepts.
Strong background in designing and implementing secure, scalable and high-performance cloud-based solutions.
In-depth knowledge of relevant state and federal regulations affecting government technology systems
Familiarity with technology governance frameworks and project management methodologies
Experience with data warehousing, big data and analytics platforms.
Experience working with government agencies is highly desirable
Soft Skills
The individual in this position must be an effective communicator, capable of managing necessary work and meeting objectives autonomously.
Excellent analytical and problem-solving skills.
Outstanding communication and interpersonal skills.
Ability to work independently and as part of a distributed team.
Excellent presentation and documentation skills.
Ability to manage multiple priorities and deadlines effectively.
Technical Skills
Experience with multiple programming languages (e.g., .Net, Python, Java, Node.js)
Expertise in database technologies to include relational databases and NoSQL
Proficient in the development and maintenance of CI/CD pipelines utilizing multiple tools
In-depth knowledge of AWS and GCP environments
Data Scientist
Data engineer job in Draper, UT
Job Title: Data Scientist
Job-Type: Full-Time
We are seeking a Data Scientist focused on fraud detection and prevention to join a growing fraud detection team. In this role, you will use advanced analytics, machine learning, and statistical modeling to uncover hidden fraud patterns, monitor portfolio health, and design proactive solutions that protect the business, customers, and retail partners. Your work will directly strengthen defenses, reduce fraud losses, and build customer trust.
Duties & Responsibilities:
Develop and deploy fraud detection models and strategies using Python and SQL.
Engineer fraud-specific features (e.g., velocity checks, behavioral profiling, device/IP analysis); a worked sketch follows this list.
Analyze portfolio trends, monitor fraud risks across customer and merchant segments, and design proactive controls.
Partner with the Fraud Prevention Manager and broader fraud/data science teams to close fraud gaps.
Share insights and recommendations with leadership to influence fraud strategy and decision-making.
Support hybrid rule and machine-learning based fraud prevention platforms (e.g., Kount, CyberSource, Signifyd).
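As an illustration of the feature-engineering duty above, here is a minimal velocity-check sketch in Python/pandas; the data and column names are hypothetical:

import pandas as pd

tx = pd.DataFrame({
    "card_id": ["c1", "c2", "c1", "c1"],
    "ts": pd.to_datetime(["2024-01-01 10:00", "2024-01-01 10:05",
                          "2024-01-01 10:20", "2024-01-01 10:40"]),
    "amount": [25.0, 40.0, 30.0, 99.0],
})

def tx_last_hour(group: pd.DataFrame) -> pd.Series:
    # Rolling one-hour transaction count, evaluated at each transaction time
    return group.rolling("60min", on="ts")["amount"].count()

# A sudden per-card spike in this count is a classic fraud signal
tx["tx_last_hour"] = (
    tx.sort_values("ts").groupby("card_id", group_keys=False).apply(tx_last_hour)
)
print(tx)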
Required Experience & Skills:
Degree in Data Science, Mathematics, Computer Science, Statistics, or related field.
3+ years' experience programming in Python (NumPy, Pandas, Scikit-learn, XGBoost).
2+ years' experience with SQL (Snowflake experience a plus).
Knowledge of fraud typologies, attack vectors, and vulnerabilities.
Understanding of the chargeback dispute/management process.
Strong problem-solving skills with the ability to balance fraud loss, customer experience, and portfolio performance.
Ability to work independently and take ownership of solutions.
Employment Eligibility: Gravity cannot transfer nor sponsor a work visa for this position. Applicants must be eligible to work in the U.S. for any employer directly (we are not open to contract or “corp to corp” agreements).
ETL Data Engineer
Data engineer job in Salt Lake City, UT
Role: ETL Data Engineer
Employment Type: Full-time
Experience: 8+ Years
We are seeking an ETL Data Engineer with strong experience in building and supporting large-scale data pipelines. The role involves designing, developing, and optimizing ETL processes using tools like DataStage, SQL, Python, and Spark. You will work closely with architects, engineers, and business teams to create efficient data solutions. The job includes troubleshooting issues, improving performance, and handling data migration and transformation tasks. You will also support Test, QA, and Production environments while ensuring smooth deployments. Strong skills in databases, scripting, and version control are essential for this position.
Responsibilities
Collaborate with architects, engineers, analysts, and business teams to develop and deliver enterprise-level data platforms that support data-driven solutions.
Apply strong analytical, organizational, and problem-solving skills to design and implement technical solutions based on business requirements.
Develop, test, and optimize software components for data platforms, improving performance and efficiency.
Troubleshoot technical issues, identify root causes, and recommend effective solutions.
Work closely with data operations teams to deploy updates into production environments.
Provide support across Test, QA, and Production environments and perform additional tasks as needed.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or a related discipline.
Strong experience in Data Warehousing, Operational Data Stores, ETL tools, and data management technologies.
8+ years of hands-on expertise in ETL (IBM DataStage), SQL, UNIX/Linux scripting, and Big Data distributed systems.
4+ years of experience with Teradata (Vantage), SQL Server, Greenplum, Hive, and delimited text data sources.
3+ years of experience with Python programming, orchestration tools, and ETL pipeline development using Python/Pandas.
Deep understanding of data migration, data analysis, data transformation, large-volume ETL processing, database modeling, and SQL performance tuning.
Experience creating DDL scripts, stored procedures, and database functions.
Practical experience with Git for version control and release processes.
Familiarity with the Spark framework, including RDDs using Python or Scala (a short sketch follows).
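As a brief, hedged illustration of the Spark RDD familiarity noted above, here is a small aggregation over delimited records in PySpark; the field layout is hypothetical:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()
sc = spark.sparkContext

# Each record: customer_id|amount, e.g. "42|19.95" (hypothetical layout)
lines = sc.parallelize(["42|19.95", "42|5.00", "7|100.00"])

totals = (
    lines.map(lambda line: line.split("|"))
         .map(lambda fields: (fields[0], float(fields[1])))
         .reduceByKey(lambda a, b: a + b)   # total amount per customer
)
print(totals.collect())
spark.stop()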
Bilingual Data Scientist (Spanish)
Data engineer job in Denver, CO
Duration: 12 month contract
***must speak Spanish and English***
Must-haves
0-2 years of experience as a Data Scientist
Proficiency in Spanish
Strong Python coding experience
Familiarity with AI models
Day to Day
Insight Global is seeking a Bilingual Data Scientist/AI Engineer for one of our clients, to sit in Denver, CO. This person will be joining a team focused on implementing AI to enhance customer engagement and business usage. They will work on 5-6 projects at once and contribute to designing and developing AI models for different uses; for example, a current project is a model serving as a real-time coaching tool for call center agents. They will spend their time creating and fine-tuning models, leading the team, and working closely with the VP and SVP of the group. The current project is implementing new languages for the model to respond in. The team has a daily stand-up to start the day and discuss the status of the project. The day would consist of roughly 20% of time in meetings and 80% of time coding in Python to prompt AI models (a minimal prompting sketch follows). This role will be performed 5 days a week on-site in Denver, CO.
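As a hedged sketch of that day-to-day work, here is a minimal bilingual prompting example using the OpenAI Python client (one common choice; the client, model name, and prompt are illustrative assumptions, not the client's actual stack):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system",
         "content": "Eres un asistente que da consejos breves y accionables "
                    "a agentes de un centro de llamadas. Responde en español."},
        {"role": "user",
         "content": "El cliente está molesto por un cargo duplicado."},
    ],
)
print(response.choices[0].message.content)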
Data Engineer
Data engineer job in Denver, CO
*** W2 Contract Only - No C2C - No 3rd Parties ***
The Ash Group is hiring a Data Engineer for our client (a specialized financial services subsidiary providing dedicated new home construction financing). This is a Direct Hire role with compensation of $100,000 annually, based in Denver, CO (Hybrid setting).
This role is crucial for transforming the organization into a data-driven environment by designing and optimizing data infrastructures, migrating large-scale data to the Microsoft Azure cloud (specifically Microsoft Fabric), and leveraging expertise in AI/ML to drive decision-making.
Role Details
Compensation: Annual base salary of $100,000. (Eligible for annual bonus based on performance objectives).
Benefits: Comprehensive package including Medical, Dental, and Vision coverage. Eligibility for 401(k) Plan, Company-paid disability/basic life insurance, parental leave, tuition reimbursement, and generous PTO (up to 17 days/year for less than 10 years of service).
Duration: Direct Hire.
Location: Hybrid in Denver, CO. (Requires 1 day per week in office).
What You'll Be Doing
Design new and migrate existing large-scale data stores (from on-premises SQL Server) to the modern Microsoft Fabric-based infrastructure, including the Lakehouse and data warehouses.
Develop, code, and optimize ETL/ELT solutions and data pipelines using SQL, Python, and PySpark, focusing on data acquisition and quality (see the sketch after this list).
Collaborate with data scientists to productionize ML models and integrate them seamlessly into data pipelines to deliver business impact.
Utilize and optimize modern data engineering tools like Azure Data Factory, Synapse, and Jupyter Notebooks for processing and analysis.
Provide technical expertise during the full development lifecycle, ensuring adherence to data architecture and enterprise quality standards.
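As a hedged sketch of the PySpark work described above, here is a read-filter-write step into a lakehouse table; the table and column names are hypothetical, and Fabric/Synapse notebooks expose a SparkSession in much the same way:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("loan-etl").getOrCreate()

loans = spark.read.table("staging.loans")   # hypothetical source table

clean = (
    loans.where(F.col("loan_amount") > 0)   # basic data-quality gate
         .withColumn("funded_month",
                     F.date_trunc("month", F.col("funded_date")))
)

clean.write.mode("overwrite").saveAsTable("lakehouse.loans_clean")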
What We're Looking For
4+ years' software engineering experience with Python, PySpark, Spark, or equivalent notebook programming.
3+ years' experience with SQL, relational databases, and large data repositories, including advanced knowledge of writing SQL and optimizing query plans.
Hands-on experience with Azure Data Factory, Azure Synapse, Data Lake, and the Microsoft Fabric environment (or strong willingness to adopt new cloud-native data platforms).
Knowledge of AI/ML, Agents, and other automation tools, including experience with ML frameworks (e.g., scikit-learn, TensorFlow) is highly preferred.
Experience with CI/CD concepts, DataOps/MLOps, and general software deployment lifecycles.
Experience participating in Agile methodologies (Scrum), with strong verbal and written communication skills to effectively collaborate with technical and non-technical stakeholders.
Apply today to join a dynamic team supporting critical infrastructure projects.
#DataEngineer #AzureCloud #DataOps #AIML #DirectHire #DenverJobs #PySpark
Data Engineer
Data engineer job in Denver, CO
We're seeking a highly skilled Data Engineer to join a high-performing organization that thrives on innovation and precision. You'll design, build, and optimize the robust data infrastructure that powers our investment insights, ensuring we can make smarter, faster decisions at scale.
What You'll Do
Design and optimize scalable, high-volume data pipelines that fuel business-critical analytics.
Orchestrate workflows using tools like Airflow or Dagster for maximum efficiency and reliability (see the DAG sketch after this list).
Partner with cross-functional teams to architect innovative, enterprise-grade data solutions.
Tackle complex, imperfect systems with creative, solution-driven approaches.
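As a hedged illustration of the orchestration work above, here is a minimal Airflow DAG using the TaskFlow API; the task bodies are stubs standing in for real pipeline steps:

from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def pipeline_sketch():
    @task
    def extract() -> list:
        return [{"id": 1, "value": 10}]

    @task
    def transform(rows: list) -> list:
        return [{**r, "value": r["value"] * 2} for r in rows]

    @task
    def load(rows: list) -> None:
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

pipeline_sketch()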
What We're Looking For
Bachelor's degree in Computer Science or related field (advanced degree a plus).
3+ years in data engineering with a proven record of delivering large-scale solutions.
2+ years of hands-on Python development in production environments.
Expertise in large, complex data environments within enterprise systems.
Proficiency with orchestration tools such as Airflow or Dagster.
Why Join Us
Immediate start with strong potential for extension.
High-impact role influencing mission-critical investment decisions.
Work within a mission-driven organization that values both speed and precision.
Direct opportunity to shape our core data platforms from the ground up.
Data Interface Engineer
Data engineer job in Greenwood Village, CO
Are you interested in leading the transformation of cancer care by putting world-leading scientific data and knowledge in the hands of doctors and other members of the medical team? Do you have a passion for solutions that empower patients to take charge of their care and bring world-class solutions to winning the battle against cancer? If so, join our growing team at a company that promises to revolutionize the way cancer care is delivered.
We are seeking a highly experienced and motivated Data Interface Engineer to join our growing team of expert interface engineers. In this role, you will be responsible for developing, monitoring, and maintaining data integration pipelines and interfaces to support healthcare data systems. This includes working with industry-standard protocols like HL7, FHIR, and RESTful APIs, and resolving complex data exchange challenges across EMRs and third-party systems. This role is ideal for someone with deep technical knowledge in data integration engines (preferably Mirth or Iguana), scripting, and healthcare interoperability standards.
Key Responsibilities
Analysis, design, development, and support of data integrations with EMRs and ancillary support systems.
Leverage your knowledge of JavaScript to build and modernize data pipelines in Mirth for connecting data sources with VieCure.
Interpret and implement HL7 interface specifications and EMR integration requirements.
Interpret business rules and requirements for technical systems.
Design and develop data exchange workflows using HL7 v2/v3, FHIR, JSON/XML, and APIs (a minimal parsing sketch follows this list).
Troubleshoot and resolve interface issues to maintain system stability and data accuracy.
Collaborate with internal teams and external partners to resolve technical integration issues.
Perform data extraction, transformation, and loading (ETL) tasks for integration projects.
Build and maintain CCD/CCDA and HL7 data mappings and ensure compliance with business rules.
Monitor and maintain interface health and operational performance.
Participate in pre- and post-production support for interface validation and deployment.
Maintain clear technical documentation for development, troubleshooting, and handoff purposes.
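As a small, hedged illustration of the HL7 v2 work above, here is field extraction from a contrived ADT message in plain Python (an engine like Mirth handles this declaratively; the message content is invented):

SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|202401011200||ADT^A01|MSG001|P|2.5",
    "PID|1||12345^^^MRN||DOE^JANE||19800101|F",
])

def segment(message: str, name: str) -> list:
    # HL7 v2 segments are CR-separated; fields within a segment are pipe-separated
    for seg in message.split("\r"):
        fields = seg.split("|")
        if fields[0] == name:
            return fields
    raise KeyError(name)

pid = segment(SAMPLE_ADT, "PID")
family, given = pid[5].split("^")[:2]   # PID-5 holds the patient name components
print(given, family)                    # -> JANE DOE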
Skills & Experience Requirements
Bachelor's Degree in Computer Science, Information Technology or equivalent.
5+ years of hands-on experience in data integration, preferably in healthcare IT.
Expertise with HL7 v2 messages (ADT, ORM, ORU, SIU) and FHIR protocols.
Experience with the Mirth Connect integration engine preferred; experience with Iguana considered a bonus.
Strong proficiency in RESTful API development and data exchange logic.
Working knowledge of Linux/UNIX and Windows environments (file systems, scripting, SFTP/FTP/HTTP).
Familiarity with EMR data structures, healthcare ontologies, and standard coding schemes.
Understanding of HIPAA and healthcare data security requirements.
Knowledge of cloud computing concepts and integration strategies.
Strong analytical and problem-solving skills with attention to detail.
Excellent written and verbal communication skills.
Ability to handle multiple tasks under tight deadlines and resolve conflicts diplomatically.
Preferred Qualifications
Experience in integration across specialties like Laboratory, Oncology and other clinical domains.
Project management experience in scoping, implementing, and documenting integration solutions.
Ability to analyze and improve existing data workflows for better efficiency and scalability.
If you're passionate about healthcare technology and ready to play a pivotal role in integrating complex systems with precision and care, we'd love to hear from you!
Advanced DevOps Engineer
Data engineer job in Boulder, CO
As an Advanced DevOps Engineer, you will:
Contribute to automation that instantiates and maintains container orchestration platforms
Automatically configure and deploy infrastructure services consisting of identity and access management, network architecture, application security, logging, monitoring, deployment and more
Work closely with our team to plan major updates and enhancements to our products
In this role, you will be working with critical internal and external platform consumers. Together with the team, you will support successful implementation of the platform in the Cloud through architecture guidance, best practices, data migration, implementation, troubleshooting, monitoring, and more.
Workplace Options: This position is flex but requires at least 2-3 days per week of onsite support in Boulder, CO.
Candidates should have demonstrated familiarity and experience in some of the following areas:
Experience working with, or familiarity with the differences between, public (AWS, Azure), private (RHOSP, VIO), and mixed cloud environments
Container technologies such as Docker or Kubernetes
Deployment or use of continuous integration and continuous deployment technologies such as GitLab, Jenkins, Puppet, or Chef
Experience supporting production-grade applications in virtualized environments (cloud computing platforms and services) using automation
Creation or use of pipelines, Ansible Playbooks, and/or Terraform configurations to automate administrative actions from resource creation to service configuration (a stand-in sketch follows this list)
Effective communication and attention to detail
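The list above names Ansible and Terraform for this kind of automation; since the sketches in this document use one language, here is the same idea (automated resource creation) expressed with Python and boto3 instead. The AMI ID and tags are hypothetical:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a tagged instance as one automated administrative action
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "managed-by", "Value": "automation"}],
    }],
)
print(response["Instances"][0]["InstanceId"])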
Other beneficial experience:
IT security practices, including encryption, certificates, key management, patch management, STIGs, RMF, security hardening, and auditing
Installation of Infrastructure as a Service (IaaS) technologies such as Amazon Web Services, OpenStack, RHOSP, or VIO
Management of Cloud Resources in Amazon Web Services, and/or Azure
Platform as a Service (PaaS) technologies such as Kubernetes, Rancher, or OpenShift
Cloud automation tooling such as Ansible, Terraform, Terragrunt, or Packer
Open source server, messaging, load balancing, and database software (such as RabbitMQ, Redis, Elasticsearch)
Scalable networking technologies such as Load Balancers (NGINX)/Firewalls and web standards (REST APIs, web security mechanisms)
EDUCATION REQUIREMENTS: Requires a Bachelor's degree in Software Engineering, or a related Science, Engineering, Technology or Mathematics field. Also requires 5+ years of job-related experience, or a Master's degree plus 3 years of job-related experience. Agile experience preferred.
CLEARANCE REQUIREMENTS: Must hold Department of Defense TS/SCI security clearance to be eligible for consideration. Due to the nature of work performed within our facilities, U.S. citizenship is required.
Theater Engineer
Data engineer job in Colorado Springs, CO
BlueWater Federal is looking for a Theater Engineer to support the analysis of user needs and develop the design and associated hardware and software recommendations to support the SEWS program.
Responsibilities
• Support the analysis of user needs and develop the design and associated hardware and software recommendations to support those needs.
• Collaborate with SEWS contractor and government personnel to plan routine and emergency trips.
• Provide rotating 24/7 on-call Tier 2 system support for remote users, to identify and resolve hardware, software, and communication issues, document solutions, and develop recommendations to reduce the frequency of repairs.
• Respond to system outages to ensure issues are resolved per contract requirements.
• Support foreign partner system and network installation, maintenance, and sustainment.
• Support Emergency On-Site Sustainment (EOSS) travel to customer locations as required.
• Respond to system component failures or change requests and plan system change or restoral implementation.
• Plan, develop and conduct user training for existing staff as well as new CCMD and FMS users.
• Travel up to 50% in a year to Foreign Partner locations.
• Perform planning and execution for a single or multi-team sustainment and training trip.
• Update Technical Data Package as required to document system.
• Perform on-site sustainment including but not limited to system operational check out, inventory, system updates, equipment firmware updates and documentation updates.
Qualifications
• 3+ years of experience in systems administration, Tactical Combat Operations, and GCCS
• Must have an active Top Secret clearance with SCI Eligibility
• Knowledge of virtualization concepts and products (VMware); Microsoft Active Directory (AD) for user and groups; Microsoft Operating Systems (Server & Workstation)
• Familiarity with Oracle/Sybase/Postgres database maintenance; Java application servers (Tomcat, JBoss)
• Familiarity with Linux/UNIX applications and services (NFS, SSH, NTP, LDAP, HTTP, Ansible)
• DoD 8570 IAT Level II certification (Security+, CCNA Security, CySA+, GICSP, GSEC, CND, SSCP)
• Partner and Allied nation exercise experience is desired
BlueWater Federal Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
We offer a competitive health and wellness benefits package, including medical, dental, and vision coverage. Our competitive compensation package includes generous 401(k) matching, an employee stock purchase program, life insurance options, and paid time off. Salary range: $135K-$145K.
DevOps Engineer
Data engineer job in Broomfield, CO
Job Title: DevOps Engineer - IAM Automation Development
Duration: 6+ months
Must Have Skills
Skills: Terraform, PowerShell scripting, Python coding
Expertise: Cloud CSP (Azure, AWS/GCP) and IAM implementation and best practice knowledge.
Experience: Large enterprise-scale projects and company experience (10,000+ employees), not small business.
U.S. citizenship and GSA certification required. Industry certifications optional; practical expertise prioritized.
Experience Level: 3+ years minimum in relevant roles.
Key Deliverables
Implement SLA (4-hour delivery) compliance for automation tickets.
Complete current and upcoming automation projects without disruption.
Implement RBAC and PIM automation as per requirements (a concept sketch follows this list).
Ensure quality and timely delivery of all milestones.
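As a library-free concept sketch of the RBAC idea behind the deliverables above (the real work would target Azure RBAC/PIM; role and permission names here are hypothetical):

ROLE_PERMISSIONS = {
    "reader":      {"storage:read"},
    "contributor": {"storage:read", "storage:write"},
    "owner":       {"storage:read", "storage:write", "iam:assign"},
}

ASSIGNMENTS = {"alice": {"contributor"}, "bob": {"reader"}}

def is_allowed(user: str, permission: str) -> bool:
    # A user is allowed if any role assigned to them grants the permission
    return any(permission in ROLE_PERMISSIONS[role]
               for role in ASSIGNMENTS.get(user, set()))

print(is_allowed("alice", "storage:write"))  # True
print(is_allowed("bob", "storage:write"))    # False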
DevOps Engineer (Only USC & GC)
Data engineer job in Greenwood Village, CO
Hi,
Greetings from Ampstek
Job Title: DevOps Engineer
Job Mode: On-site
Experience: 10+ years
Job Details:
Must Have Skills: Python, MongoDB
Nice to have skills: DevOps
Detailed Job Description:
Development operations (DevOps) engineers are responsible for the production and ongoing maintenance of a website platform. They also manage cloud infrastructure and system administration and work with teams to identify and repair issues on an as-needed basis, so strong communication skills are important in this position. They are generally expected to work well under pressure with tight deadlines for certain tasks, and a proactive demeanor and friendly disposition are also helpful. DevOps engineers may work with junior and senior engineers, project managers, and executives, as well as administrative assistants, executive assistants, and a receptionist. Experience in Python development, data processing, and MongoDB administration is required (a minimal sketch follows). Hours can be flexible, though they typically work during regular weekly business hours, and they are not usually responsible for customer/client interaction or supervising junior employees.
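As a minimal, hedged sketch of the Python and MongoDB work named above, using pymongo; the connection string, database, and collection names are hypothetical:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical connection
db = client["platform"]

# Record a deployment event, then query recent successful ones
db.deploy_events.insert_one({"service": "web", "status": "ok", "build": 42})
for event in db.deploy_events.find({"status": "ok"}).limit(5):
    print(event["service"], event["build"])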
Thanks & Regards
Thomas
******************
Software Engineer
Data engineer job in Colorado Springs, CO
As a software developer on an Internal Research and Development (IRAD) team, you will work in a collaborative environment to understand system requirements and to create and implement new capabilities and algorithms. Many of the algorithms and capabilities implemented by the team are mathematics- and physics-intensive, complex solutions that are highly critical to system performance. In addition to software development, you will also be expected to support reviews of requirements and test cases that are developed for the software capability.
Basic Qualifications:
Bachelor's degree in a STEM-related field, and 5 years of experience with a Bachelor of Science; 3 years with a Master's; 1 year with a PhD.
Applicants must have a current, active in-scope DoD-issued Secret security clearance at the time of application, which is required to start
Recent and extensive experience developing software in C++ or Java
Quick to learn and absorb new concepts and information
Recent MATLAB and/or Python experience
Must have an Interim or Active Secret Clearance
Unix/Linux operating system experience
Must be able to support an in-person / closed-area work environment
Preferred Qualifications:
Highly experienced with Linux, scripting, and operations
Experience with automated software requirements testing and analysis
Experience with battle management and/or fire control systems
Experience with containerization technologies (e.g., Docker, Kubernetes) and container orchestration.
Experience with Behavior Driven Development (BDD) using tools like Gherkin and Cucumber for automated acceptance testing
Experience with static and dynamic code analysis tools and fuzzing tools such as Coverity, Fortify, and/or SonarQube
Experience developing software in a Model-Based Systems Engineering (MBSE) environment.
Experience with CI/CD, containers, and pipelines.
Experience with Software Change Control, Change Management, Code Quality, Static Analysis, and CI/CD tools such as: Atlassian tool suite, Jira, GitHub, GitLab, SonarQube, Coverity, and Jenkins.
Very solid background in math and physics
Advanced degree in Mathematics or Physics or Computer Science.
DevOps Engineer
Data engineer job in Denver, CO
As a Senior DevOps Engineer at Vertex IT Systems, you will play a crucial role in designing, implementing, and maintaining infrastructure that supports our applications and services. You will work closely with our development, operations, and security teams to ensure scalable, reliable, and secure environments. This role is ideal for someone with a strong technical background in Linux OS, AWS, and DevOps tools like GitLab, Terraform, and Kubernetes.
Job Responsibilities:
Design, build, and maintain cloud infrastructure, ensuring high availability and performance.
Collaborate with development teams to streamline CI/CD pipelines using tools like GitLab.
Manage and optimize AWS environments (AWS certifications preferred but not required).
Implement Infrastructure as Code (IaC) solutions using Terraform (preferred) for provisioning infrastructure.
Automate system configuration, application deployment, and scaling using programming languages such as Python, Node.js, or Golang (preferred).
Manage Kubernetes clusters (preferred) to orchestrate containerized applications.
Monitor, troubleshoot, and enhance system performance and reliability.
Provide expertise on network fundamentals (TCP/IP layers), ensuring network security and efficiency.
Continuously improve infrastructure security, performance, and scalability.
Job Skills/Qualification Requirements:
Bachelor's Degree in Computer Science, Software Engineering, or a related field.
Strong knowledge of network fundamentals including TCP/IP layers.
Hands-on experience with Linux OS administration.
Proficiency in AWS with practical experience (AWS certifications are a plus but not required).
Experience with programming languages like Python, Node.js, and Golang (preferred).
Expertise in using GitLab for version control and CI/CD pipelines (required).
Familiarity with Infrastructure as Code (IaC) tools, especially Terraform (preferred).
Practical experience with Kubernetes for container orchestration (preferred).
Strong problem-solving skills and a passion for automation and process improvement.
Nice to Have:
Experience with other cloud platforms like Google Cloud Platform or Azure.
Understanding of monitoring tools (e.g., Prometheus, Grafana).
Knowledge of security best practices in DevOps and cloud environments.
Software Engineer
Data engineer job in Aurora, CO
Job Title: Software Engineer
Clearance: Active & transferable TS/SCI with Polygraph
Salary: $130,000 - $145,000
Benefits:
This role includes Medical, Dental, and Vision coverage, 6 weeks of paid PTO, Short-Term Disability, a 401(k) with 5% company match, and up to $8,000 in relocation assistance. A full benefits package can be sent upon request.
About the Company
This role is being managed by Talent Stack LLC, a woman-owned recruiting firm specializing in cleared and technical placements. We proudly represent a confidential defense-focused client supporting critical national security programs.
About the Role
We are seeking a Software Engineer to support mission-critical ground systems for a leading defense customer. In this role, you will evaluate operational software issues, provide rapid solutions, and support integration and verification activities for new software baselines. This position is hands-on and ideal for engineers who thrive in real-time, high-impact operational environments.
What You Will Do
Support troubleshooting, maintenance, and documentation of operational software applications.
Troubleshoot and isolate sources of system errors or unexpected performance.
Execute testing, integration, and delivery of new software baselines.
Perform in-depth analysis across complex, end-to-end operational systems.
Work onsite within a secure environment supporting mission operations.
Required Qualifications
Bachelor's degree in STEM.
2+ years of professional level experience at a government contractor or agency.
Experience coding in Java or C++.
Hands-on experience with software development and/or software integration.
Active, transferable TS/SCI with Polygraph required prior to start.
Preferred Qualifications
Experience with Git, Bitbucket, or similar configuration management tools.
Ability to work in a structured configuration-controlled O&M environment supporting real-time applications.
Experience with Linux/Unix, Git, and SQL.
Familiarity with JIRA, BitBucket, and Confluence.
This role requires an active and transferable DoD Top Secret security clearance, with the ability to obtain and maintain TS/SCI access and polygraph. Only U.S. citizens are eligible for security clearance. Candidates without the required clearance cannot be considered for this position.
Recruitment Agencies Disclaimer
This position is posted for direct applicants only. We kindly request that external recruiters, staffing agencies, or search firms refrain from submitting candidates. Unsolicited resumes will be treated as direct applications, and no agency fees will be honored.
Reference Number: X15
Staff Software Engineer - Filesystems and Data Protocols (C++)
Data engineer job in Denver, CO
Job Title: Staff Software Engineer - Filesystems and Data Protocols
Job Type: 6-month contract-to-hire (CTH) or direct FTE; both okay.
Responsibilities:
We are looking for a highly skilled Staff Software Engineer specializing in Filesystems and Data Protocols to join the team. In this pivotal role, you will design, develop, and optimize the core components of Infinia, the client's advanced distributed intelligent data platform.
You will leverage cutting-edge AI development and deployment tooling, including prompt engineering, to reduce time-to-insight and accelerate delivery velocity for better business outcomes.
Key Responsibilities:
Design and implement a data orchestration layer on top of the distributed filesystem; this module will integrate the Infinia on-premises solution with multi-cloud data services (a small illustration follows this list).
Design and enhance the core components of Infinia for high performance and scalability.
Own critical customer case escalations end-to-end, including deep root cause analysis and mitigation strategies.
Drive design discussions, build prototypes, and contribute to delivering high-quality products.
Utilize AI-powered debugging, log analysis, and system pattern recognition tools to accelerate resolution.
Conduct code reviews and improve the scalability, stability, reliability, and performance of Infinia solutions.
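As a small illustration of the multi-cloud integration responsibility above (the platform itself is C++; this Python/boto3 sketch only shows the cloud-side copy step, with hypothetical bucket and key names):

import boto3

s3 = boto3.client("s3")

# Back up an on-premises snapshot object to cloud storage
s3.upload_file("/data/snapshots/vol0.img",
               "infinia-backups", "vol0/2024-01-01.img")

# Restore it later
s3.download_file("infinia-backups",
                 "vol0/2024-01-01.img", "/data/restore/vol0.img")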
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
8+ years of software engineering experience in data storage and management.
Strong expertise in C++
Preferred Skills:
Experience with cloud Data Storage services like AWS S3 or Google Cloud Storage (GCS).
Data movement, copy workflows: remote replication, cloud backup/restore
Experience with real-time data processing and streaming frameworks such as Apache Kafka.
Experience with Distributed systems
About Maxonic:
Since 2002, Maxonic has been at the forefront of connecting candidate strengths to client challenges. Our award-winning, dedicated team of recruiting professionals is specialized by technology; they are great listeners and will seek to find a position that meets the long-term career needs of our candidates. We take pride in the over 10,000 candidates that we have placed and the repeat business that we earn from our satisfied clients.
Interested in Applying?
Please apply with your most current resume. Feel free to contact Nina Schindler (**************** / **************) for more details.
Graduate Software Engineer - AI Start-Up
Data engineer job in Colorado Springs, CO
Targeting Top Computer Science Universities
🚀
Start your career building production AI - not side projects.
We're a fast-growing, venture-backed company redesigning how the world moves goods through intelligent transportation systems. Our tech is deployed in operational environments - solving real, high-stakes problems in logistics and mobility at scale.
Join us as a Graduate Engineer and take true ownership from day one. You'll help build the backbone of a platform used globally, working alongside experienced engineers in a deeply technical environment.
What You'll Do
Build full-stack features using TypeScript (React/Next.js) and Python (FastAPI/Django)
Develop scalable APIs and backend services powering AI-driven decision systems (see the sketch after this list)
Productionise machine learning models into real-world workflows
Optimise performance, reliability, and system efficiency
Ship quickly in a collaborative, high-accountability engineering culture
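As a hedged sketch of the API work above, here is a minimal FastAPI endpoint wrapping a stubbed model prediction; the route, payload, and formula are invented for illustration:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Shipment(BaseModel):
    origin: str
    destination: str
    weight_kg: float

@app.post("/eta")
def predict_eta(shipment: Shipment) -> dict:
    # Stub standing in for a productionised ML model
    hours = 24 + shipment.weight_kg * 0.01
    return {"eta_hours": round(hours, 1)}

# Run with: uvicorn main:app --reload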
What We're Looking For
Final-year or recent CS graduates from top-ranked global universities
Strong fundamentals: algorithms, data structures, distributed systems
Hands-on experience in Python and/or TypeScript from internships or advanced projects
Evidence of excellence: open-source, research, hackathons, impactful university work
Curious, ambitious engineers who want to solve hard problems at scale
Ability to work onsite in Denver at least 3 days/week
Why Join
Work on production AI infrastructure
Immediate impact and early ownership
Mentorship from senior engineers solving complex system challenges
Build technology shaping the future of global transportation