Platform Engineer
Richmond, VA
We are seeking a highly skilled and experienced Software Systems Engineer who will be at the forefront of building our automation platform ecosystem, transforming the way we deliver IT infrastructure and services.
The successful candidate will be responsible for designing, implementing, and maintaining
our automation and orchestration platforms, ensuring their optimal performance, scalability, and reliability in a dynamic and fast-paced environment.
The candidate will also be a member of a larger platform team and will assist with managing and troubleshooting infrastructure issues related to server OS, virtualization, and container orchestration platforms.
This role is ideal for someone who thrives on building systems from the ground up, enjoys solving complex operational challenges, and has a passion for enabling others through automation.
Key Responsibilities
• Design, deploy, and administer automation platforms including but not limited to Terraform Enterprise, Ansible Automation Platform, Vault, and Packer.
• Collaborate with development, operations, security, and COE teams to ensure seamless integration and secure, consistent automation practices.
• Establish and develop operational standards, documentation, and lifecycle management processes.
• Integrate self-service, CMDB, platform security, secrets management, observability, and other solutions.
• Monitor system performance, troubleshoot issues, and optimize the platform for high availability and resilience.
• Implement and manage CI/CD pipelines and GitOps workflows using tools such as GitLab, Jenkins, etc.
• Provide guidance and training to other engineers on automation platforms and related technologies and develop related documentation.
• Stay current with industry trends, emerging technologies, and best practices related to automation platforms, VMs, containerization, and cloud-native architectures.
• Provide supplemental VMware and Kubernetes/container support: troubleshooting issues, deployment and configuration, storage and performance monitoring, and performing security updates.
• Participate in a 24/7 on-call rotation and respond to issues with systems and technologies supported by the team.
Required Skills
• Proven expertise in automation platform deployment and administration (Terraform, Ansible, Packer, Vault, etc.).
• Strong understanding of platform automation architecture, components, and ecosystem, backed by hands-on experience.
• Automation pipeline development and CI/CD integration.
• Scripting and troubleshooting proficiency (Python, PowerShell, Bash, etc.).
• System reliability and observability (Prometheus, Grafana, etc.).
• Security and access management (SSO, RBAC, PKI).
• Strong problem-solving skills, with a proactive and collaborative approach to troubleshooting and issue resolution.
• Background in infrastructure lifecycle management and capacity planning.
• Solid foundation in infrastructure, including an understanding of database, networking, DNS, load balancing, storage, and backup concepts and solutions.
• Excellent interpersonal, communication, organizational, and technical leadership skills.
Required Years of Experience
• 5-8 years of experience in software systems engineering, with a focus on infrastructure engineering, DevOps, or platform operations.
• Minimum of 2 years of hands-on experience administering automation or IaC platforms (Terraform, Ansible, etc.).
Endpoint Engineer
Denver, CO
The Endpoint Engineer is responsible for deploying, managing, and supporting endpoint technologies across clinical and administrative environments. This role ensures that all devices meet organizational standards for performance, security, and compliance while supporting the unique workflows of a healthcare setting.
Knowledge, Skills, and Abilities
Proficiency in Microsoft 365, Windows Operating Systems, Intune, SCCM, Active Directory, and Group Policy.
Experience with mobile device management (MDM/MAM), such as Intune, JAMF, or Citrix XenMobile.
Familiarity with clinical workflows and healthcare endpoint technologies.
Strong troubleshooting skills across hardware, operating systems, and application environments.
Essential Functions
Endpoint Deployment & Configuration
Build, configure, and deploy desktops, laptops, tablets, thin clients, and mobile devices using enterprise imaging and management tools.
Ensure devices meet hospital security, compliance, and operational standards prior to release.
Advanced Technical Support
Provide Tier II/III support for escalated incidents, service requests, and problem management activities.
Troubleshoot complex hardware, software, and peripheral issues in clinical and administrative environments.
Partner with EUC Support and Help Desk teams to ensure proper issue resolution and knowledge transfer.
Systems & Application Management
Administer and support Microsoft 365, Intune, SCCM, and other enterprise endpoint management solutions.
Package, test, and deploy applications, patches, and updates to the endpoint environment.
Maintain and optimize Group Policy Objects (GPOs), login scripts, and device configurations.
Healthcare Technology Integration
Support clinical workstations, Workstations on Wheels (WoWs), printers, and medical-grade peripherals.
Collaborate with clinical application teams (e.g., Epic Analysts, Nursing Informatics, Clinical Applications) to ensure endpoint compatibility and workflow support.
Security & Compliance
Enforce endpoint security policies, including encryption, EDR, conditional access, and mobile device management.
Participate in vulnerability remediation, patch management, and compliance reporting activities.
Support HIPAA and regulatory requirements by maintaining endpoint security and data protection measures.
Project Participation & Execution
Contribute to enterprise IT projects such as OS upgrades, device lifecycle refreshes, and mobility initiatives.
Assist with design, testing, and rollout of new endpoint standards, technologies, and configurations.
Document project deliverables, lessons learned, and standard operating procedures.
Documentation & Knowledge Sharing
Develop and maintain technical documentation, SOPs, and knowledge base articles.
Provide technical training and guidance to junior staff and end users as needed.
Continuous Improvement
Monitor system performance, identify recurring issues, and recommend improvements.
Stay current with emerging EUC technologies, industry best practices, and healthcare IT trends.
Cybersecurity Engineer
Glen Allen, VA
Compensation: $50-$60/hour, depending on experience
Inceed has partnered with a great company to help find a skilled Cybersecurity Engineer to join their team!
Join a dynamic team as a Cybersecurity Engineer, where you'll independently execute significant cybersecurity projects across the U.S. and Canada. This opportunity arises from the need for local talent in Virginia to drive innovative cybersecurity solutions at various client sites. Embrace the chance to work hands-on, ensuring the highest level of security integrity while collaborating with industry leaders.
Key Responsibilities & Duties:
Execute network penetration testing and vulnerability assessments
Design and implement secure systems and cybersecurity programs
Conduct detailed post-event analysis and recommend procedural changes
Develop policies for secure network design and firewall implementation
Collaborate with other divisions to enhance cybersecurity measures
Maintain confidentiality and security of client information
Resolve technical issues and communicate implications effectively
Perform cybersecurity compliance and regulatory standards planning
Compile technical documentation and network traffic explanations
Required Qualifications & Experience:
Bachelor's Degree in Cybersecurity, Computer Science, or related field
Minimum 3 years of relevant experience
Advanced knowledge of cybersecurity principles and technologies
Experience with vulnerability assessments and penetration tests
Strong communication and analytical skills
Ability to work under pressure and tight deadlines
Knowledge of control systems used by utilities and smart cities
Nice to Have Skills & Experience:
Industry-recognized IT certifications in ethical hacking or network engineering
Experience with SCADA systems and risk management frameworks
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Cybersecurity Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Cloud Engineer
Frankfort, KY
Cloud Engineer (Long-Term Contract)
One of our client's key agencies is embarking on a major IT modernization initiative to migrate data and systems from another agency into its own environment, moving away from mainframe environments and toward a secure, in-house cloud solution. We are seeking a highly skilled Cloud Engineer, with near-architect-level expertise, to join this transformation effort.
Key Responsibilities:
Collaborate with a system integrator and internal IT teams to design and implement cloud migration strategies.
Assess current AWS and Azure environments and recommend best-fit solutions for AGO's new cloud infrastructure.
Provide architectural insights and guidance to ensure scalability, security, and compliance.
Work autonomously to evaluate technologies and propose optimal configurations.
Support the transition from mainframe systems to modern cloud platforms.
Qualifications:
Extensive experience with AWS and Microsoft Azure cloud environments.
Strong background in cloud architecture, migration planning, and implementation.
Ability to work independently and communicate effectively with technical and non-technical stakeholders.
Familiarity with government IT environments and compliance standards is a plus.
Contract Details:
Duration: Multiyear Project
Start Date: Early 2026
DevOps Engineer | Machine Learning Platforms
Pittsburgh, PA
MLOps Engineer
Remote | Pittsburgh, PA area
On-site: 1 day/month
eNGINE builds Technical Teams. We are a Solutions and Placement firm shaped by decades of interaction with Technical professionals. Our inspiration is continuous learning and engagement with the markets we serve, the talent we represent, and the teams we build. Our Consulting Workforce is encouraged to enjoy career fulfillment in the form of challenging projects, schedule flexibility, and paid training/certifications. Successful outcomes start and finish with eNGINE.
Role Overview
eNGINE is seeking an MLOps Engineer to manage and scale machine learning workflows from development to production. This role ensures that models are robust, maintainable, and performant in real-world environments, while collaborating closely with Data Science and Engineering teams to integrate ML solutions into digital products.
Key Responsibilities
Implement end-to-end ML deployment strategies to move models from development to production reliably
Configure and manage scalable, cloud-based infrastructure for ML workloads
Track and analyze model behavior and operational metrics to ensure consistent performance
Establish automated processes for retraining, versioning, and releasing ML models
Work closely with cross-functional teams to embed machine learning capabilities into applications and platforms
Review and refine system architecture and pipelines to improve latency, throughput, and resource utilization
Maintain documentation and operational standards for reproducible, production-ready ML systems
Identify and apply new tools and technologies to streamline ML operations and reduce maintenance overhead
Required Qualifications
Bachelor's degree
Experience deploying machine learning solutions in production environments
Strong Python skills, including experience with numerical and ML libraries (NumPy, Pandas, scikit-learn) and at least one deep learning framework (PyTorch or TensorFlow)
Experience with containerization and orchestration technologies such as Docker and Kubernetes
Knowledge of cloud platforms (AWS, GCP, or Azure) and Infrastructure-as-Code tools
Familiarity with ML workflow management or experiment tracking tools (MLflow, Kubeflow, or similar)
Understanding of software engineering best practices, including version control, testing, and documentation
Preferred Experience
Prior involvement in building or supporting ML-driven digital products
Experience optimizing ML pipelines for cost, performance, and scalability
Collaborative experience with cross-functional engineering and data teams
Practical exposure to monitoring, alerting, and incident response for ML systems
Location & Work Environment
Fully remote, with monthly on-site meetings in the Pittsburgh, PA area
Next Steps
For further details on how eNGINE can enhance your career, apply today!
No C2C, third-party candidates, relocation assistance, or sponsorship available for this role.
Theater Engineer
Colorado Springs, CO
BlueWater Federal is looking for a Theater Engineer to support the analysis of user needs and develop the design and associated hardware and software recommendations to support the SEWS program.
Responsibilities
• Support the analysis of user needs and develop the design and associated hardware and software recommendations to support those needs.
• Collaborate with SEWS contractor and government personnel to plan routine and emergency trips.
• Provide rotating 24/7 on-call Tier 2 system support for remote users, to identify and resolve hardware, software, and communication issues, document solutions, and develop recommendations to reduce the frequency of repairs.
• Respond to system outages to ensure issues are resolved per contract requirements.
• Support foreign partner system and network installation, maintenance, and sustainment.
• Support Emergency On-Site Sustainment (EOSS) travel to customer locations as required.
• Respond to system component failures or change requests and plan system change or restoral implementation.
• Plan, develop and conduct user training for existing staff as well as new CCMD and FMS users.
• Travel up to 50% in a year to Foreign Partner locations.
• Perform planning and execution for a single or multi-team sustainment and training trip.
• Update Technical Data Package as required to document system.
• Perform on-site sustainment including but not limited to system operational check out, inventory, system updates, equipment firmware updates and documentation updates.
Qualifications
• 3+ years of experience in systems administration, Tactical Combat Operations, and GCCS
• Must have an active Top Secret clearance with SCI Eligibility
• Knowledge of virtualization concepts and products (VMware); Microsoft Active Directory (AD) for user and groups; Microsoft Operating Systems (Server & Workstation)
• Familiarity with Oracle/Sybase/Postgres database maintenance; Java application servers (Tomcat, JBoss)
• Familiarity with Linux/UNIX applications and services (NFS, SSH, NTP, LDAP, HTTP, Ansible)
• DoD 8570 IAT Level II certification (Security+, CCNA Security, CySA+, GICSP, GSEC, CND, SSCP)
• Partner and Allied nation exercise experience is desired
BlueWater Federal Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
We offer a competitive health and wellness benefits package, including medical, dental, and vision coverage. Our compensation package includes generous 401(k) matching, an employee stock purchase program, life insurance options, and paid time off. Salary range: $135,000-$145,000.
Data Engineer
Richmond, VA
Data Engineer - Distributed Energy Resources (DER)
Richmond, VA (Hybrid: 1 week on / 1 week off)
12-month contract (multi-year project)
$45-$55/hour, depending on experience
We are hiring a Data Integration Engineer to join one of our Fortune 500 utilities partners in the Richmond area! In this role, you will support our client's rapidly growing Distributed Energy Resources (DER) and Virtual Power Plant (VPP) initiatives. You will be responsible for integrating data across platforms such as Salesforce, GIS, SAP, Oracle, and Snowflake to build our client's centralized asset tracking system for thermostats, EV chargers, solar assets, home batteries, and more.
In this role, you will map data, work with APIs, support Agile product squads, and help design system integrations that enable our client to manage customer energy assets and demand response programs at scale. This is a highly visible position on a brand-new product team with the chance to work on cutting-edge energy and utility modernization efforts. If you are interested, please apply!
MINIMUM QUALIFICATIONS:
3-5 years of experience in system integration, data engineering, or data warehousing, plus a Bachelor's degree in Computer Science, Engineering, or a related technical discipline.
Hands-on experience working with REST APIs and integrating enterprise systems.
Strong understanding of data structures, data types, and data mapping.
Familiarity with Snowflake or similar data warehousing platform.
Experience connecting data across platforms and/or integrating data from a variety of sources (e.g., SAP, Oracle).
Ability to work independently and solve problems in a fast-paced Agile environment.
Excellent communication skills with the ability to collaborate across IT, business, engineering, and product teams.
RESPONSIBILITIES:
Integrate and map data across Salesforce, GIS, Snowflake, SAP, Oracle, and other enterprise systems
Link distributed energy asset data (EV chargers, thermostats, solar, home batteries, etc.) into a unified asset tracking database
Support API-first integrations: consuming, analyzing, and working with RESTful services
Participate in Agile ceremonies and work through user stories in Jira
Collaborate with product owners, BAs, data analysts, architects, and engineers to translate requirements into actionable technical tasks
Support architecture activities such as identifying data sources, formats, mappings, and integration patterns
Help design and optimize integration workflows across new and existing platforms
Work within newly formed Agile product squads focused on VPP/Asset Tracking and Customer Segmentation
Troubleshoot integration issues and identify long-term solutions
Contribute to building net-new systems and tools as the client expands DER offerings
NICE TO HAVES:
Experience with Salesforce.
Experience working with GIS systems or spatial data.
Understanding of customer enrollment systems.
Jira experience.
WHAT'S IN IT FOR YOU?
Joining our client provides you the opportunity to join a brand-new Agile product squad, work on high-impact energy modernization and DER initiatives, and gain exposure to new technologies and integration tools. This is a long-term contract with strong likelihood of extension in a stable industry and company.
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.
Senior Data Engineer
Bethlehem, PA jobs
Hybrid (Bethlehem, PA)
Contract
Applicants must be authorized to work in the U.S. without sponsorship
We're looking for a Senior Data Engineer to join our growing technology team and help shape the future of our enterprise data landscape. This is a hands-on, high-impact opportunity to make recommendations and to build and evolve a modern data platform using Snowflake and cloud-based EDW solutions.
How You'll Impact Results:
Drive the evolution and architecture of scalable, secure, cloud-native data platforms
Design, build, and maintain data models, pipelines, and integration patterns across the data lake, data warehouse, and consumption layers
Lead deployment of long-term data products and infuse data and analytics capabilities across business and IT
Optimize data pipelines and warehouse performance for accuracy, accessibility, and speed
Collaborate cross-functionally to deliver data, experimentation, and analytics solutions
Implement systems to monitor data quality and ensure reliability and availability of Production data for downstream users, leadership teams, and business processes
Recommend and implement best practices for query performance, storage, and resource efficiency
Test and clearly document data assets, pipelines, and architecture to support usability and scale
Engage across project phases and serve as a key contributor in strategic data architecture initiatives
Your Qualifications That Will Ensure Success:
Required:
10+ years of experience in Information Technology data engineering, including professional database and data warehouse development
Advanced proficiency in SQL, data modeling, and performance tuning
Experience in system configuration, security administration, and performance optimization
Deep experience with Snowflake and modern cloud data platforms (AWS, Azure, or GCP)
Familiarity with developing cloud data applications (AWS, Azure, Google Cloud) and/or standard CI/CD tools like Azure DevOps or GitHub
Strong analytical, problem-solving, and documentation skills
Proficiency with Microsoft Excel and common data analysis tools
Ability to troubleshoot technical issues and provide system support to non-technical users.
Preferred:
Experience integrating SAP ECC data into cloud-native platforms
Exposure to AI/ML, API development, or Boomi Atmosphere
Prior experience in consumer packaged goods (CPG), Food / Beverage industry, or manufacturing
Data Engineer
Lancaster, PA
Contract-to-Hire (6 months)
Lancaster, PA
We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and data infrastructure that support analytics, AI, and data-driven decision-making. This role is hands-on and focused on building reliable, well-modeled datasets across modern cloud and lakehouse platforms. You will partner closely with analytics, data science, and business teams to deliver high-quality data solutions.
Key Responsibilities
Design, build, and optimize batch and real-time data pipelines for ingestion, transformation, and delivery
Develop and maintain data models that support analytics, reporting, and machine learning use cases
Design scalable data architecture integrating structured, semi-structured, and unstructured data sources
Build and support ETL / ELT workflows using modern tools (e.g., dbt, Airflow, Databricks, Glue)
Ingest and integrate data from multiple internal and external sources, including APIs, databases, and cloud services
Manage and optimize cloud-based data platforms (AWS, Azure, or GCP), including lakehouse technologies such as Snowflake or Databricks
Implement data quality, validation, governance, lineage, and monitoring processes
Support advanced analytics and machine learning data pipelines
Partner with analysts, data scientists, and stakeholders to deliver trusted, well-structured datasets
Continuously improve data workflows for performance, scalability, and cost efficiency
Contribute to documentation, standards, and best practices across the data engineering function
Required Qualifications
3-7 years of experience in data engineering or a related role
Strong proficiency in SQL and at least one programming language (Python, Scala, or Java)
Hands-on experience with modern data platforms (Snowflake, Databricks, or similar)
Experience building and orchestrating data pipelines in cloud environments
Working knowledge of cloud services (AWS, Azure, or GCP)
Experience with version control, CI/CD, and modern development practices
Strong analytical, problem-solving, and communication skills
Ability to work effectively in a fast-paced, collaborative environment
Preferred / Nice-to-Have
Experience with dbt, Airflow, or similar orchestration tools
Exposure to machine learning or advanced analytics pipelines
Experience implementing data governance or quality frameworks
Familiarity with SAP data platforms (e.g., BW, Datasphere, Business Data Cloud)
Experience using LLMs or AI-assisted tooling for automation, documentation, or data workflows
Relevant certifications in cloud, data platforms, or AI technologies
Data Engineer
Denver, CO
Compensation: $80-$90/hour, depending on experience
Inceed has partnered with a great company to help find a skilled Data Engineer to join their team!
Join a dynamic team as a contract Data Engineer, where you'll be the backbone of data-driven operations. This role offers the opportunity to work with a modern tech stack in a hybrid on-prem and cloud environment. You'll design and implement innovative solutions to complex challenges, collaborating with data scientists, location intelligence experts, and ML engineers. This exciting opportunity has opened due to a new project initiative, and you'll be making a tangible impact.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from multiple sources ensuring quality and reliability
Develop automation workflows and BI solutions
Mentor others and contribute to the knowledge base
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of experience in data engineering
Experience with large oil and gas datasets
Proficiency in SQL and Python
Hands-on experience in cloud environments (Azure, AWS, or GCP)
Familiarity with Apache Kafka, Apache Flink, or Azure Event Hubs
Nice to Have Skills & Experience:
Experience with Palantir Foundry
Knowledge of query federation platforms
Experience with modern data stack tools like dbt or Airflow
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
Data Engineer
Denver, CO
Compensation: $80-$90/hour, depending on experience
Inceed has partnered with a great energy company to help find a skilled Data Engineer to join their team!
Join a dynamic team where you'll be at the forefront of data-driven operations. This role offers the autonomy to design and implement groundbreaking data architectures, working primarily remotely. This position is open due to exciting new projects. You'll be collaborating with data scientists and engineers, making impactful contributions to the company's success.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from various sources ensuring consistency and reliability
Develop automation workflows and BI solutions
Mentor others and advise on data process best practices
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of data engineering experience
Experience with PI
Experience with SCADA
Experience with Palantir
Experience with large oil and gas datasets
Proficiency in Python and SQL
Hands-on experience in cloud environments (Azure, AWS, GCP)
Nice to Have Skills & Experience:
Familiarity with Apache Kafka or Flink
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
Data Engineer
Hamilton, NJ
Key Responsibilities:
Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
Manage FTP/SFTP file transfers between internal systems and external vendors.
Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
Required Skills & Experience:
10+ years of experience in data engineering or production support within financial services or trading environments.
Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
Strong Python and SQL programming skills.
Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
Experience with Git, CI/CD pipelines, and Azure DevOps.
Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
Excellent communication, problem-solving, and stakeholder management skills.
IT Data Engineer
Lakewood, CO
Compensation: $125k-$155k (DOE)
Inceed has partnered with a great company to help find a skilled IT Data Engineer to join their team!
Join a dynamic team where innovation meets opportunity. This role is pivotal in advancing AI and data modernization initiatives, bridging traditional database administration with cutting-edge AI data infrastructure. The team thrives on collaboration and offers a hybrid work schedule.
Key Responsibilities & Duties:
Design and maintain scalable data pipelines.
Develop RAG workflows for AI information access.
Build secure connectors and APIs for data retrieval.
Monitor and optimize data flows for consistency.
Lead database administration and performance tuning.
Manage database upgrades and storage optimization.
Implement database security controls and standards.
Support application integrations and data migrations.
Define and maintain data models and metadata.
Collaborate with teams to ensure compliance requirements.
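The "RAG workflows for AI information access" responsibility above centers on a retrieval step: rank stored documents by embedding similarity to a query. A minimal stand-in, assuming toy hand-written vectors in place of a real embedding model and vector store:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, corpus, k=2):
    """Return the ids of the k documents closest to the query vector.

    corpus is a list of (doc_id, embedding) pairs. In a production RAG
    pipeline the embeddings would come from a model and live in a vector
    database; the ranking logic itself is the same.
    """
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

The retrieved ids would then feed the documents into a prompt for the generation step; that half is model-specific and omitted here.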
Required Qualifications & Experience:
Bachelor's degree in Computer Science or related field.
7+ years in database administration or data engineering.
Advanced SQL and data modeling skills.
Experience with AI and analytics data pipelines.
Familiarity with cloud-based data ecosystems.
Hands-on experience with RAG and vectorization.
Proficiency in scripting languages like Python.
Experience leading vendor-to-internal transitions.
Nice to Have Skills & Experience:
Experience integrating enterprise systems into data platforms.
Knowledge of data governance frameworks.
Understanding of semantic data modeling.
Experience with cloud migration of database workloads.
Perks & Benefits:
This opportunity includes a comprehensive and competitive benefits package; details will be shared during later stages of the hiring process.
Other Information:
Hybrid work schedule
This position requires a background check and drug test
If you are interested in learning more about the IT Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Senior Data Engineer
Indianapolis, IN jobs
Senior Data Engineer - Azure Data Warehouse (5-7+ Years Experience)
Long term renewing contract
You will support Azure-based data warehouse and dashboarding initiatives, working alongside architects, analysts, and researchers to build scalable, auditable, and business-aligned data assets using modern cloud tools and best practices.
Key Responsibilities
· Design and implement scalable data pipelines using ADF, Databricks, and Azure SQL Server
· Apply Medallion architecture principles and best practices for data lake and warehouse design
· Collaborate with Data Architects, Analysts, and Researchers to translate business needs into technical solutions
· Develop and maintain CI/CD pipelines for data workflows and dashboard deployments
· Lead troubleshooting and debugging efforts across ETL, SQL, and cloud environments
· Mentor junior team members and promote best practices in data modeling, cleansing, and promotion
· Support dashboarding initiatives with Power BI and wireframe collaboration
· Ensure auditability, lineage, and performance across SQL Server and Oracle environments
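The Medallion architecture referenced above layers data as bronze (raw), silver (cleansed), and gold (business-ready aggregates). As a hedged illustration only, here is the shape of that flow in plain Python; in practice these would be Databricks/Delta Lake jobs, and the field names here are invented:

```python
# Minimal Medallion-style flow: bronze (raw) -> silver (cleansed) ->
# gold (reporting aggregate). A stand-in for Databricks/Delta tables.

def to_silver(bronze_rows):
    """Cleanse raw rows: drop records missing required keys,
    normalize types, and default missing dimensions."""
    silver = []
    for row in bronze_rows:
        if not row.get("order_id") or row.get("amount") is None:
            continue  # a real pipeline would quarantine these with lineage
        silver.append({"order_id": str(row["order_id"]),
                       "region": (row.get("region") or "UNKNOWN").upper(),
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed rows into a reporting-ready total by region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals
```

Keeping cleansing (silver) separate from aggregation (gold) is what makes the lineage auditable: every gold figure traces back through deterministic silver rows to the untouched bronze source.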
Required Skills & Experience
· 5-7+ years in data engineering, data warehouse design, and ETL development
· Strong expertise in Azure Data Factory, Databricks, and Python
· Deep understanding of SQL Server, Oracle, PostgreSQL, and Cosmos DB, and of data modeling standards
· Proven experience with Medallion architecture and data lakehouse best practices
· Hands-on with CI/CD, DevOps, and deployment automation
· Agile mindset with ability to manage multiple priorities and deliver on time
· Excellent communication and documentation skills
Bonus Skills
· Experience with GCP or AWS
· Familiarity with Jira, Confluence, and AppDynamics
Data Engineer (Zero Trust)
Fort Belvoir, VA jobs
Kavaliro is seeking a Zero Trust Security Architect / Data Engineer to support a mission-critical program by integrating secure architecture principles, strengthening data security, and advancing Zero Trust initiatives across the enterprise.
Key Responsibilities
Develop and implement program protection planning, including IT supply chain security, anti-tampering methods, and risk management aligned to DoD Zero Trust Architecture.
Apply secure system design tools, automated analysis methods, and architectural frameworks to build resilient, least-privilege, continuously monitored environments.
Integrate Zero Trust Data Pillar capabilities: data labeling, tagging, classification, encryption at rest/in transit, access policy definition, monitoring, and auditing.
Analyze and interpret data from multiple structured and unstructured sources to support decision-making and identify anomalies or vulnerabilities.
Assess cybersecurity principles, threats, and vulnerabilities impacting enterprise data systems, including risks such as corruption, exfiltration, and denial-of-service.
Support systems engineering activities, ensuring secure integration of technologies and alignment with Zero Trust operational objectives.
Design and maintain secure network architectures that balance security controls, mission requirements, and operational tradeoffs.
Generate queries, algorithms, and reports to evaluate data structures, identify patterns, and improve system integrity and performance.
Ensure compliance with organizational cybersecurity requirements, particularly confidentiality, integrity, availability, authentication, and non-repudiation.
Evaluate impacts of cybersecurity lapses and implement safeguards to protect mission-critical data systems.
Structure, format, and present data effectively across tools, dashboards, and reporting platforms.
Maintain knowledge of enterprise information security architecture and database systems to support secure data flow and system design.
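The Data Pillar responsibilities above reduce to a simple rule: every access request is verified against the record's classification label and the caller's clearance, with deny as the default, and every decision audited. A toy sketch only; the labels and levels here are illustrative placeholders, not a DoD marking scheme:

```python
# Toy Zero Trust data-pillar check: never trust, always verify.
# Both clearance level and declared purpose must pass; default is deny.
LEVELS = {"PUBLIC": 0, "INTERNAL": 1, "CONFIDENTIAL": 2, "SECRET": 3}

def can_access(clearance, record_label, purpose, allowed_purposes):
    """Least-privilege check: unknown labels, insufficient clearance,
    or an undeclared purpose all result in denial."""
    if record_label not in LEVELS or clearance not in LEVELS:
        return False
    if LEVELS[clearance] < LEVELS[record_label]:
        return False
    return purpose in allowed_purposes

def audit_event(subject, record_id, decision):
    """Structured audit record supporting the monitoring/auditing
    capability; every decision is logged, permit or deny."""
    return {"subject": subject, "record": record_id,
            "decision": "permit" if decision else "deny"}
```

The continuous-monitoring piece of Zero Trust is the pairing: the policy check never runs without emitting an audit event.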
Requirements
Active TS/SCI security clearance (required).
Deep knowledge of Zero Trust principles (never trust, always verify; explicit authentication; least privilege; continuous monitoring).
Experience with program protection planning, IT supply chain risk management, and anti-tampering techniques.
Strong understanding of cybersecurity principles, CIA triad requirements, and data-focused threats (corruption, exfiltration, denial-of-service).
Proficiency in secure system design, automated systems analysis tools, and systems engineering processes.
Ability to work with structured and unstructured data, including developing queries, algorithms, and analytical reports.
Knowledge of database systems, enterprise information security architecture, and data structuring/presentation techniques.
Understanding of network design processes, security tradeoffs, and enterprise architecture integration.
Strong ability to interpret data from multiple tools to support security decision-making.
Familiarity with impacts of cybersecurity lapses on data systems and operational environments.
Kavaliro is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.
Senior Data Engineer
Indianapolis, IN jobs
Pinnacle Partners is assisting our client in the search for a Senior Data Engineer to join their team in the Indianapolis, IN area. The successful candidate will be responsible for supporting the large-scale data modernization initiative and operationalizing the platform moving forward.
RESPONSIBILITIES:
Design, develop, and refine BI focused data architecture and data platforms
Work with internal teams to gather requirements and translate business needs into technical solutions
Build and maintain data pipelines supporting transformation
Develop technical designs, data models, and roadmaps
Troubleshoot and resolve data quality and processing issues
Create and maintain detailed documentation for data warehouses, data stores, and end-to-end data flows
Mentor and support junior team members
REQUIREMENTS:
5+ years of hands-on experience with data warehousing, databases, and dimensional data modeling
5+ years of experience across end-to-end data analysis and development
Experience using Git version control
Advanced SQL skills
Strong experience with AWS cloud
PREFERRED SKILLS:
Experience with Snowflake
Experience with Python or R
Bachelor's degree in an IT-Related field
TERMS:
This is a direct hire opportunity with a salary up to $130K based on experience. They offer benefits including medical, dental, and vision along with generous PTO, 401K matching, wellness programs, and other benefits.
DevOps Engineer
Windsor, CT jobs
We are looking for a DevOps Engineer for our client with offices in Windsor, CT. This is a contract-to-hire opportunity and a hybrid schedule.
NO 3rd PARTIES, NO SPONSORSHIP, NO EXCEPTIONS. Candidates MUST be authorized to work in the United States without sponsorship.
The DevOps Engineer is responsible for maintaining and monitoring cloud infrastructure, as well as the processes necessary to support the tools and infrastructure used by our client. The DevOps Engineer will work with the other DevOps team members to enhance, improve, and extend the cloud infrastructure. The DevOps team is a blend of diligent, pragmatic operators and software craftspeople who apply sound IT principles, operational discipline, mature automation, and IT security to our client's cloud environments.
Education/Experience Required:
• Bachelor's Degree, or education and training normally associated with a Bachelor's Degree in Computer Science or related field.
• Strong knowledge of Linux, shell scripting and text-processing tools (bash, awk, sed), and package management (yum).
• Good knowledge of Python.
• Containerization and orchestration tools like Docker Swarm and Kubernetes.
• Good understanding of AWS cloud services, plus related technologies: JVM, Apache Tomcat, and IT automation tools (e.g., Ansible and Jenkins).
Senior DevOps Engineer
Englewood, CO jobs
Client located in Englewood, Colorado is seeking a Senior DevOps Engineer for a contract to hire position. This person will join a five-person team for DevOps in a mixed AWS / Azure environment.
HYBRID SCHEDULE: This role requires onsite work 3 days per week. Only local candidates will be considered.
Required:
-5+ years DevOps experience.
-Strong containerization skills (Docker, Kubernetes).
-Strong automation skills (PowerShell, Python, Bash).
-Terraform expertise.
-Leadership mentality - willing to mentor and help team with implementing best practices.
Desired:
-Certifications (DevOps Engineer, Terraform Associate)
-Computer Science degree.
Azure Data Engineer
Jersey City, NJ jobs
Title: Senior Azure Data Engineer
Client: Major Japanese Bank
Experience Level: Senior (10+ Years)
The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.
Key Responsibilities:
Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
Ensure data security, compliance, lineage, and governance controls.
Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
Troubleshoot performance issues and improve system efficiency.
Required Skills:
10+ years of data engineering experience.
Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
Azure certifications strongly preferred.
Strong SQL, Python, and cloud data architecture skills.
Experience in financial services or large enterprise environments preferred.
DevOps Engineer
McLean, VA jobs
The candidate should be able to drive implementation and improvement of tools and technologies for enterprise adoption in accordance with operational and security standards.
Practice and promote a Site Reliability Engineering (SRE) culture to improve and operate cloud platform offerings to the enterprise while working toward innovation, automation, and operational excellence.
Automation experience is a must for this position.
Ability to provide 24x7 operational support on a periodic basis and involvement in issue resolution is a must.
Must Have Qualifications:
Must have 5+ years of hands-on experience with AWS CloudFormation and Terraform. Automation through shell scripting and Python required (Ansible nice to have). 3+ years of experience with EKS and Kubernetes.
Technical expertise:
7+ years of overall information technology experience with an emphasis on integration and delivery of virtual/cloud platforms to enterprise applications.
At least 5 years of proven experience with AWS CloudFormation, Terraform, or similar tools.
3+ years of experience with engineering and supporting containerization technology (OpenShift, Kubernetes, AWS ECS/EKS, etc.) at scale.
Experience in Python, Ansible and shell scripting to automate routine operation tasks.
Experience with Tetrate, Rancher, and ArgoCD is highly preferred.
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ***********************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Aishwarya Chandra
Email: ****************************************
Job ID: 25-53450