Senior D365 F&O Support Engineer (455001)
Remote applications support engineer job
Senior D365 F&O Support Engineer | 455001 DETAILS 6M C2H Hourly / Salary: to $140K+ Vaco Technology is currently seeking a Senior D365 F&O Support Engineer for a 6M C2H that is 100% remote. The Senior D365 F&O Support Engineer will serve as a Tier-3 escalation resource, owning complex technical investigations and delivering permanent resolutions for high-impact production incidents across large-scale, mission-critical D365 F&O environments. The Senior D365 F&O Support Engineer will join an enterprise managed services practice, with a critical focus on providing advanced production support, incident resolution, and proactive stability improvements for global D365 F&O implementations.
Incident Management (P1 / P2) - Lead diagnosis / resolution of critical production incidents in D365 F&O | Coordinate cross-team response during outages and major disruptions
Root Cause Analysis - Perform deep-dive investigations into performance degradation / batch failures / integration errors / data inconsistencies / custom extension defects
Advanced Troubleshooting Tools - X++ Debugging / Trace Parser / SQL Server Profiler / LCS Environment Monitoring / Application Insights / Azure Diagnostics
Hotfix / Deployment Engineering - Develop / Test / Deploy Hotfixes and Deployable Packages to Production / Sandbox via LCS and Azure DevOps
Performance Optimization - Query Tuning / Index Management / Batch Framework Optimization / Application-Level Performance Improvements
Escalation Management - Collaborate Directly with Microsoft Premier Support / CSS on Escalated Cases | Serve as Primary Technical POC
Cross-Functional Collaboration - Partner with Functional Teams / Infrastructure Engineering / Client IT Stakeholders During Incidents / PIR Reviews
Post-Incident Reporting - Author detailed post-mortem reports | Develop and implement preventive controls to reduce recurrence
Monitoring / Runbooks - Build / maintain monitoring, alerting, operational runbooks, and support documents
Technical Leadership - Provide guidance / mentorship to mid-level engineers | Support process maturation
On-Call Rotation - Participate in structured on-call rotation with premium compensation
About the Project: Our client (MSP) has just won a flagship, enterprise-wide support contract with a Fortune 200 client that has fully replaced its direct Microsoft support with their services. To deliver immediate white-glove support, they are building a dedicated 6-person Microsoft Engineering Team, including 2 D365 F&O Administrators, 2 Modern Workplace / M365 Generalists, 1 Power Platform Engineer, and 1 Azure Engineer. These are all high-visibility roles that demand strong technical depth, exceptional customer-facing communication, composure under pressure, and the ability to multitask across high-volume tickets.
JOB REQUIREMENTS
Senior D365 F&O Support Engineer - Lead Diagnosis / Resolution of Critical Production Incidents | Coordinate Cross-Team Response for Outages / Major Disruptions
D365 F&O - D365 F&O Architecture (expert knowledge) / Metadata Models / Batch Framework / Data Entities / Security Configuration
X++ Engineering (advanced) - Hands-on X++ Development / Extensions Framework / Event Handlers / Chain-of-Command / Customizations / Refactoring
SQL / Performance - SQL Server Performance Tuning / Query Optimization / Azure SQL Troubleshooting
Production Support Expertise - Resolving Complex P1 / P2 Incidents in High-Volume D365 F&O Deployments (calm / methodical response under pressure)
Diagnostics / Monitoring Tools - LCS diagnostics / Trace Parser / PerfTimer / X++ Profiler / Azure Monitoring and Telemetry Tools
Azure Integration - Logic Apps / Service Bus / Azure Functions / DataFactory
ALM / DevOps - Solution Build / Deployment using Azure DevOps Pipelines / Branching / Code Reviews / Automated Builds
PREFERRED (not required)
Certifications - Strong preference for MB-500 / MB-700 / MB-920, etc.
MS Partner Managed Services Organizations / Enterprise-Level D365 F&O Support Teams
Enterprise Environment - Support for Global / High-Transaction-Volume D365 F&O Implementations | Complex Multi-Legal-Entity Architecture
Service Management - ITIL Practices / Incident / Problem / Change Management Frameworks (familiarity)
Determining compensation for this role (and others) at Vaco/Highspring depends upon a wide array of factors including but not limited to the individual's skill sets, experience and training, licensure and certifications, office location and other geographic considerations, as well as other business and organizational needs. With that said, as required by local law in geographies that require salary range disclosure, the salary range for this role is noted in this job posting. The individual may also be eligible for discretionary bonuses, and can participate in medical, dental, and vision benefits as well as the company's 401(k) retirement plan. Additional disclaimer: Unless otherwise noted in the job description, the position Vaco/Highspring is filling for is occupied. Please note, however, that Vaco/Highspring is regularly asked to provide talent to other organizations. By submitting to this position, you are agreeing to be included in our talent pool for future hiring for similarly qualified positions. Submissions to this position are subject to the use of AI to perform preliminary candidate screenings, focused on ensuring minimum job requirements noted in the position are satisfied. Candidates who pass this initial phase will be further assessed by Vaco/Highspring recruiters and hiring managers. Vaco/Highspring does not have knowledge of the tools used by its clients in making final hiring decisions and cannot opine on their use of AI products.
Infrastructure Engineer
Remote applications support engineer job
We are excited that you've taken the time to explore our business and potentially join us on this incredible journey. We are already the leader in Insider Risk Management, but our story doesn't stop there. We have serious growth plans, and that means serious growth opportunities for everyone on our team; whether you're looking to develop into a management position or establish yourself as an industry expert, we are here to support you.
DTEX Systems helps hundreds of organizations worldwide better understand their workforce, protect their data, and make human-centric operational investments. At DTEX, our philosophy towards our business is the same as our philosophy towards technology: people come first. Our future depends on bright, energetic, talented people who share a passion for building the next generation of user behavior intelligence. We invite you to bring your talent to one of our offices and help create our future, expanding our reach and influence worldwide. Learn more about DTEX Systems' mission to make businesses more secure through technology at ******************** and on LinkedIn.
Why you should choose DTEX as your next career:
Opportunity to be part of a disruptive high growth success story.
DTEX is a great place to work because of its mission-oriented culture and passion for protecting customers.
We offer exciting growth opportunities and an excellent platform for individuals to contribute to thought leadership as experts in their field.
We are uniquely positioned to solve highly relevant and complex risks and challenges associated with insider risk.
Opportunity to be part of a business that's passionate about creating first-of-a-kind solutions.
Best in class benefits
What is the Role:
DTEX is seeking an experienced Infrastructure Engineer with a strong software engineering background to help drive modernization of our infrastructure and operations. This is a high-impact role where you will design and implement automation solutions to manage customer environments and enable the business to scale beyond what manual operations allow. You will be instrumental in our efforts to transition from legacy operations to modern, automated Infrastructure as Code best practices, applying software engineering principles to solve complex operational problems and build resilient systems.
What You Will Do:
Design, write, and maintain software, primarily in Python, to automate the provisioning, deployment, and configuration management of our infrastructure
Contribute to the adoption and maturation of Terraform, establishing and maintaining best practices for state management, modularization, and version control.
Utilize Ansible and/or Saltstack to ensure consistency, repeatability, and standardization across all environments.
Develop robust CI/CD pipelines for both infrastructure and application deployments, replacing manual processes.
Implement and mature monitoring, logging, and alerting systems to proactively improve system reliability.
Participate in a “follow the sun” on-call rotation, focusing on sustainable incident response, blameless postmortems, and driving continuous improvement.
Champion SRE principles, automation, and coding best practices within the team and across the organization.
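The Python-driven automation the list above describes can be illustrated with a small, hypothetical sketch: a pure-Python "desired state" planner of the kind that Infrastructure-as-Code and configuration-management workflows are built on. The host names and state format here are invented for illustration and are not taken from the posting:

```python
import json

def plan_changes(current, desired):
    """Compare current host state against desired state and emit an
    idempotent change plan: hosts already in the desired state produce
    no action, so re-running the plan is safe."""
    plan = {}
    for host, want in desired.items():
        have = current.get(host, {"packages": []})
        missing = sorted(set(want["packages"]) - set(have["packages"]))
        if missing:
            plan[host] = {"install": missing}
    return plan

current = {"web-1": {"packages": ["nginx"]}}
desired = {
    "web-1": {"packages": ["nginx", "fail2ban"]},  # drifted host: needs fail2ban
    "web-2": {"packages": ["nginx"]},              # new host: needs nginx
}
# prints: {"web-1": {"install": ["fail2ban"]}, "web-2": {"install": ["nginx"]}}
print(json.dumps(plan_changes(current, desired), sort_keys=True))
```

Tools like Terraform, Ansible, and Saltstack generalize exactly this plan/apply pattern; the point of the sketch is the idempotency: applying the same desired state twice yields an empty plan the second time.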
We are looking for you if you have:
Essential Technical Skills:
3+ years of hands-on experience managing production environments in AWS and/or GCP.
Strong proficiency in Python. Demonstrated ability to write clean, maintainable, and testable code to solve infrastructure problems.
Experience with Terraform, including best practices for state management and modular design in complex environments.
Strong knowledge of Linux internals and high competency in Bash scripting and command-line operations.
Proficiency with Ansible and/or Saltstack as configuration management tools.
Expert-level understanding of Git and collaborative workflows, such as branching strategies and code review best practices.
Highly Desirable Skills:
Proven track record of transitioning legacy/manual operations environments to automated, IaC-driven approaches.
Experience with containerization using Docker or Kubernetes, and an understanding of how container orchestration is used in modern systems.
Experience building and managing CI/CD pipelines for infrastructure automation.
Familiarity with Zabbix, Prometheus, Grafana and other tools.
Experience operating and querying Opensearch/Elasticsearch.
...and these other things:
A strong desire to solve complex problems, the resilience to work through significant technical debt, and enthusiasm for driving cultural and technical change.
A desire to work in enterprise and government focused computing environments with robust security and reliability requirements.
MS/BS in Computer Science/Computer Engineering or related field of study (or equivalent experience)
Must meet all personnel screening requirements as specified by applicable federal contracts or agency regulations, which may include US citizenship.
This position is open to U.S. based candidates only. Unfortunately, we are unable to provide work visa sponsorship at this time.
We take good care of our people. Our benefits include:
Comprehensive health, vision, and dental coverage
Flexible time off
Company computer hardware of your choice
Work from home setup reimbursement
Wellness perks including access to mental healthcare, gym discounts and personal care concierge
Virtual events, happy hours, trivia, and fun
Monthly Internet & Phone Reimbursement
Opportunities to learn and grow
DTEX Systems is one of the most trusted and innovative brands in the cyber security market. We have received significant financial backing from leading VC firms and have just set a record-breaking year of growth. So why not trust DTEX with that all important next step in your career?
DTEX Systems is proud to provide equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, gender, religion, sex, national origin, age, disability, or genetics.
Exact compensation may vary based on skills, experience, and location.
Base salary range (SF Bay Area): $150k-$190k.
Senior Data Analytics Engineer
Applications support engineer job in Columbus, OH
We are seeking a highly skilled Analytics Data Engineer with deep expertise in building scalable data solutions on the AWS platform. The ideal candidate is a 10/10 expert in Python and PySpark, with strong working knowledge of SQL. This engineer will play a critical role in translating business and end-user needs into robust analytics products, spanning ingestion, transformation, curation, and enablement for downstream reporting and visualization.
You will work closely with both business stakeholders and IT teams to design, develop, and deploy advanced data pipelines and analytical capabilities that power enterprise decision-making.
Key Responsibilities
Data Engineering & Pipeline Development
Design, develop, and optimize scalable data ingestion pipelines using Python, PySpark, and AWS native services.
Build end-to-end solutions to move large-scale big data from source systems into AWS environments (e.g., S3, Redshift, DynamoDB, RDS).
Develop and maintain robust data transformation and curation processes to support analytics, dashboards, and business intelligence tools.
Implement best practices for data quality, validation, auditing, and error-handling within pipelines.
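The data quality and error-handling practices listed above often take the form of a validation gate that routes bad records aside rather than failing the whole batch. A minimal pure-Python sketch of that "dead-letter" pattern follows; the field names are invented for illustration:

```python
def validate_rows(rows, required=("id", "amount")):
    """Split incoming records into valid rows and a dead-letter list.
    Rejected records carry enough context (index, missing fields, the
    record itself) to support auditing and later replay."""
    valid, rejected = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            rejected.append({"index": i, "missing": missing, "record": row})
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 9.99},
    {"id": 2, "amount": None},   # fails validation: amount missing
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # prints: 1 1
```

In a PySpark pipeline the same split is typically expressed with two filtered DataFrames, with the rejected side written to an audit table.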
Analytics Solution Design
Collaborate with business users to understand analytical needs and translate them into technical specifications, data models, and solution architectures.
Build curated datasets optimized for reporting, visualization, machine learning, and self-service analytics.
Contribute to solution design for analytics products leveraging AWS services such as AWS Glue, Lambda, EMR, Athena, Step Functions, Redshift, Kinesis, Lake Formation, etc.
Cross-Functional Collaboration
Work with IT and business partners to define requirements, architecture, and KPIs for analytical solutions.
Participate in Daily Scrum meetings, code reviews, and architecture discussions to ensure alignment with enterprise data strategy and coding standards.
Provide mentorship and guidance to junior engineers and analysts as needed.
Engineering (Supporting Skills)
Employ strong skills in Python, PySpark, and SQL to support data engineering tasks, broader system integration requirements, and application layer needs.
Implement scripts, utilities, and microservices as needed to support analytics workloads.
Required Qualifications
5+ years of professional experience in data engineering, analytics engineering, or full-stack data development roles.
Expert-level proficiency (10/10) in:
Python
PySpark
Strong working knowledge of:
SQL and other programming languages
Demonstrated experience designing and delivering big-data ingestion and transformation solutions through AWS.
Hands-on experience with AWS services such as Glue, EMR, Lambda, Redshift, S3, Kinesis, CloudFormation, IAM, etc.
Strong understanding of data warehousing, ETL/ELT, distributed computing, and data modeling.
Ability to partner effectively with business stakeholders and translate requirements into technical solutions.
Strong problem-solving skills and the ability to work independently in a fast-paced environment.
Preferred Qualifications
Experience with BI/Visualization tools such as Tableau
Experience building CI/CD pipelines for data products (e.g., Jenkins, GitHub Actions).
Familiarity with machine learning workflows or MLOps frameworks.
Knowledge of metadata management, data governance, and data lineage tools.
Software Support Engineer
Applications support engineer job in Dublin, OH
As the Software Support Engineer, you will be the first line of support to troubleshoot and resolve customer reported technical and operational issues. This position is a unique blend of Software Development, Quality Assurance Testing, on-site implementation and installation, and support. Bachelor's degree in a related technical field is required, as well as a passion for software development and testing.
MINIMUM QUALIFICATIONS:
• Bachelor's degree in a related field (Computer Science, Computer Information Systems, Software Engineering, etc.)
• Experience designing software applications and a strong understanding of the Software Development Life Cycle
• Experience troubleshooting and problem solving
• Customer service-friendly personality
• Prefer familiarity with relational databases (MySQL preferred)
• Prefer experience with Python and C development
• Prefer experience in Linux environment
• Prior technical support experience is a plus
RESPONSIBILITIES:
• Software Testing and onsite implementations/installation
• Provide customer support to help assist with operational and system issues
• Design, maintain and modify existing applications
• Take responsibility for multiple tasks in multiple projects simultaneously
• Review existing code and make required modifications
• Submit all designs and code modifications for peer review
• Write, maintain and execute test plans
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Senior Data Engineer
Applications support engineer job in Columbus, OH
Immediate need for a talented Senior Data Engineer. This is a 6+ month contract opportunity with long-term potential and is located in Columbus, OH (Remote). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-95277
Pay Range: $70 - $71 /hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Work with Marketing data partners to build data pipelines that automate data feeds from the partners to internal systems on Snowflake.
Work with Data Analysts to understand their data needs and prepare datasets for analytics.
Work with Data Scientists to build the infrastructure to deploy models, monitor performance, and build the necessary audit infrastructure.
Key Requirements and Technology Experience:
Key skills: Snowflake, Python, and AWS
Experience with building data pipelines, data pipeline infrastructure, and related tools and environments used in analytics and data science (e.g., Python, Unix)
Experience in developing analytic workloads with AWS Services, S3, Simple Queue Service (SQS), Simple Notification Service (SNS), Lambda, EC2, ECR and Secrets Manager.
Strong proficiency in Python, SQL, Linux/Unix shell scripting, GitHub Actions or Docker, Terraform or CloudFormation, and Snowflake.
Order of Importance: Terraform, Docker, GitHub Actions OR Jenkins
Experience with orchestration tools such as Prefect, DBT, or Airflow.
Experience automating data ingestion, processing, and reporting/monitoring.
Experience with other relevant tools used in data engineering (e.g., SQL, GIT, etc.)
Ability to set up environments (Dev, QA, and Prod) using GitHub repositories and GitHub rules/methodologies, and to maintain them via SQL coding and proper versioning
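The automated ingestion and monitoring work called out in these requirements usually needs retry logic around flaky partner feeds, with the final failure surfaced to alerting rather than swallowed. A small, hypothetical stdlib-only sketch (the feed and its failure behavior are invented for illustration):

```python
import time

def with_retries(fetch, attempts=3, base_delay=0.0):
    """Run a flaky ingestion step, retrying with exponential backoff.
    The last failure is re-raised so it reaches monitoring/alerting
    instead of being silently dropped."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

calls = {"count": 0}

def flaky_feed():
    # Hypothetical partner feed that fails twice before succeeding.
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("feed unavailable")
    return "payload"

print(with_retries(flaky_feed))  # prints: payload
```

Orchestrators named in the posting (Prefect, Airflow) provide this retry/backoff behavior declaratively per task; the sketch just shows the underlying pattern.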
Our client is a leader in the insurance industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
By applying to our jobs, you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Senior Data Engineer
Applications support engineer job in Columbus, OH
Responsible for understanding, preparing, processing, and analyzing data to make it valuable and useful for operations decision support.
Accountabilities in this role include:
Partnering with Business Analysis and Analytics teams.
Demonstrating problem-solving ability for effective and timely resolution of system issues, including production outages.
Developing and supporting standard processes to harvest data from various sources and perform data blending to develop advanced data sets, analytical cubes, and data exploration.
Utilizing queries, data exploration and transformation, and basic statistical methods.
Creating Python scripts.
Developing Microsoft SQL Server Integration Services Workflows.
Building Microsoft SQL Server Analysis Services Tabular Models.
Focusing on SQL database work with a blend of strong technical and communication skills.
Demonstrating ability to learn and navigate in large complex environments.
Exhibiting Excel acumen to develop complex spreadsheets, formulas, create macros, and understand VBA code within the modules.
Required Skills:
Experience with MS SQL
Proficiency in Python
Desired Skills:
Experience with SharePoint
Advanced Excel Skills (formulas, VBA, Power Pivot, Pivot Table)
Java Software Engineer
Applications support engineer job in Columbus, OH
Title: Java Software Engineer
Hire Type: 12 month contract to start (potential extensions and full time hire)
Pay Range: $50/hr - $65/hr (contingent on years of experience, skills, and education)
Required Skills & Experience
Strong programming skills within Java
Jenkins experience for automating builds, CI/CD, and pipeline orchestration
Experience working in an AWS environment with some exposure to cloud development
Experience with event-driven architecture
Job Description
Insight Global is looking for a Java Software Engineer to sit in Columbus, Ohio. This candidate will be aligned to a platform automation project within their internal ERP system. Automation efforts will be assigned to internal developers, and this resource will be working within the middle tier of their internal system. The current code is written in the .NET framework, but the new code being developed will be Java based. Candidates will be working with various teams and specifically aligned to their Billing Portal within the internal system, focusing on the code for transitions from the middle tier to the customer/client-facing tier and back-office functions. Candidates need to have worked in an AWS environment and have some exposure to event-driven architecture (general structure).
Data Engineer- ETL/ELT - Hybrid/Remote
Remote applications support engineer job
Crown Equipment Corporation is a leading innovator in world-class forklift and material handling equipment and technology. As one of the world's largest lift truck manufacturers, we are committed to providing the customer with the safest, most efficient and ergonomic lift truck possible to lower their total cost of ownership.
Indefinite US Work Authorization Required.
Primary Responsibilities
Design, build and optimize scalable data pipelines and stores.
Clean, prepare and optimize data for consumption in applications and analytics platforms.
Participate in peer code reviews to uphold internal standards.
Ensure procedures are thoroughly tested before release.
Write unit tests and record test results.
Detect, define and debug programs whenever problems arise.
Provide training to users and knowledge transfer to support personnel and other staff members as required.
Prepare system and programming documentation in accordance with internal standards.
Interface with users to extract functional needs and determine requirements.
Conduct detailed systems analysis to define scope and objectives and design solutions.
Work with Business Analyst to help develop and write system requirements.
Establish project plans and schedules and monitor progress providing status reports as required.
Qualifications
Bachelor's degree in Computer Science, Software/Computer Engineering, Information Systems, or related field is required.
4+ years' experience in SQL, ETL, ELT and SAP Data is required.
Python, Databricks, and Snowflake experience preferred.
Strong written, verbal, analytical and interpersonal skills are necessary.
Remote Work: Crown offers hybrid remote work for this position. A reasonable commute is necessary as some onsite work is required. Relocation assistance is available.
Work Authorization:
Crown will only employ those who are legally authorized to work in the United States. This is not a position for which sponsorship will be provided. Individuals with temporary visas or who need sponsorship for work authorization now or in the future, are not eligible for hire.
No agency calls please.
Compensation and Benefits:
Crown offers an excellent wage and benefits package for full-time employees including Health/Dental/Vision/Prescription Drug Plan, Flexible Benefits Plan, 401K Retirement Savings Plan, Life and Disability Benefits, Paid Parental Leave, Paid Holidays, Paid Vacation, Tuition Reimbursement, and much more.
EOE Veterans/Disabilities
NextGen Applications Analyst
Remote applications support engineer job
NOTE: This role is NOT open to C2C companies
NextGen Applications Analyst - Regulatory Upgrade
Multiple Sites (Remote with Limited Travel)
Start: Mid/Late August | Orientation/Training ~30 days
Duration: Through 2027
About the Role
We're seeking experienced Applications Analysts (Tier 1 Apps Advisors) to support large and complex NextGen 8 regulatory upgrade rollouts nationwide.
Tier 1 analysts will handle large/jumbo clients and complex environments, while Tier 2 specialists will support smaller or mid-sized client projects. This is an opportunity to work on high-impact initiatives that modernize clinical workflows and enhance EHR usability across the country.
Key Responsibilities
Support the planning, configuration, and deployment of NextGen 8 regulatory upgrades.
Customize and optimize Adaptive Content Engine (ACE) templates to align with clinical documentation needs.
Collaborate with cross-functional technical and clinical teams to ensure smooth implementation.
Troubleshoot and resolve upgrade-related application issues.
Ensure compliance with regulatory, security, and infrastructure standards.
Contribute to readiness calls and go-live support, occasionally on weekends.
Required Experience
Hands-on experience with NextGen 8, including:
UI enhancements and navigation redesigns
Adaptive Content Engine (ACE) template configuration
APSO documentation workflows
Understanding of NextGen 8 infrastructure requirements and environment setup.
Experience supporting migrations of healthcare applications to AWS or similar environments.
Strong problem-solving, communication, and collaboration skills.
Travel Expectations
Travel requirements vary by client - some prefer fully remote support, while others may request onsite presence.
Weekend work may occasionally be needed (usually readiness calls; not always full 8-hour shifts).
If weekend hours are worked, a weekday off will be given to maintain a two-day weekend.
Data Engineer
Applications support engineer job in Columbus, OH
We're seeking a skilled Data Engineer based in Columbus, OH, to support a high-impact data initiative. The ideal candidate will have hands-on experience with Python, Databricks, SQL, and version control systems, and be comfortable building and maintaining robust, scalable data solutions.
Key Responsibilities
Design, implement, and optimize data pipelines and workflows within Databricks.
Develop and maintain data models and SQL queries for efficient ETL processes.
Partner with cross-functional teams to define data requirements and deliver business-ready solutions.
Use version control systems to manage code and ensure collaborative development practices.
Validate and maintain data quality, accuracy, and integrity through testing and monitoring.
Required Skills
Proficiency in Python for data engineering and automation.
Strong, practical experience with Databricks and distributed data processing.
Advanced SQL skills for data manipulation and analysis.
Experience with Git or similar version control tools.
Strong analytical mindset and attention to detail.
Preferred Qualifications
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with enterprise data lake architectures and best practices.
Excellent communication skills and the ability to work independently or in team environments.
Software Engineer
Applications support engineer job in Columbus, OH
hackajob has partnered with a global technology and management consultancy, specializing in driving transformation across the financial services and energy industries, and we're looking for Java & Python Developers!
Role: Software Engineer (Java & Python)
Mission: This role focuses on a large technology implementation with a major transition of a broker/dealer platform. These resources will support ETL development, API development, and conversion planning.
Location: On-site role in Columbus, OH.
Rates:
W2 - $32 per hour
1099 - $42 per hour
Work authorization: This role requires you to be authorized to work in the United States without sponsorship.
Qualifications (+4 years of experience):
Strong experience with Java, Spring Boot, and microservices architecture.
Proficiency in Python for ETL and automation.
Hands-on experience with API development.
Knowledge of data integration, ETL tools, and conversion workflows.
hackajob is a recruitment platform that matches you with relevant roles based on your preferences. To be matched with the roles, you need to create an account with us.
This role requires you to be based in the US.
Senior Data Engineer (only W2)
Applications support engineer job in Columbus, OH
Bachelor's Degree in Computer Science or related technical field AND 5+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java.
Proficiency with Azure data services, such as Azure Data Lake, Azure Data Factory and Databricks.
Expertise using Cloud Security (i.e., Active Directory, network security groups, and encryption services).
Proficient in Python for developing and maintaining data solutions.
Experience with optimizing or managing technology costs.
Ability to build and maintain a data architecture supporting both real-time and batch processing.
Ability to implement industry standard programming techniques by mastering advanced fundamental concepts, practices, and procedures, and having the ability to analyze and solve problems in existing systems.
Expertise with unit testing, integration testing and performance/stress testing.
Database management skills and understanding of legacy and contemporary data modeling and system architecture.
Demonstrated leadership skills, team spirit, and the ability to work cooperatively and creatively across an organization
Experience on teams leveraging Lean or Agile frameworks.
Data Engineer
Applications support engineer job in Dublin, OH
The Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently.
The ideal candidate is self-directed, thrives in a fast-paced project environment, and is comfortable making technical decisions and architectural recommendations. The ideal candidate has prior experience in modern data platforms, most notably Databricks and the “lakehouse” architecture. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals.
Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.
Responsibilities:
Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
Consume and analyze data from the data pipeline to infer, predict, and recommend actionable insights that inform operational and strategic decision-making and produce better results.
Empower departments and internal consumers with metrics and business intelligence to operate and direct our business, better serving our end customers.
Determine technical and behavioral requirements, identify candidate strategies as solutions, and select solutions based on resource constraints.
Work with the business, process owners, and IT team members to design solutions for data and advanced analytics solutions.
Perform data modeling and prepare data in databases for analysis and reporting through various analytics tools.
Play a technical specialist role in championing data as a corporate asset.
Provide technical expertise in collaborating with project and other IT teams, internal and external to the company.
Contribute to and maintain system data standards.
Research and recommend innovative and, where possible, automated approaches for system data administration tasks. Identify approaches that leverage our resources and provide economies of scale.
Engineer systems that balance and meet performance, scalability, recoverability (including backup design), maintainability, security, and high-availability requirements and objectives.
Skills:
Databricks and related - SQL, Python, PySpark, Delta Live Tables, Data pipelines, AWS S3 object storage, Parquet/Columnar file formats, AWS Glue.
Systems Analysis - The application of systems analysis techniques and procedures, including consulting with users, to determine hardware, software, platform, or system functional specifications.
Time Management - Managing one's own time and the time of others.
Active Listening - Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
Critical Thinking - Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems.
Active Learning - Understanding the implications of new information for both current and future problem-solving and decision-making.
Writing - Communicating effectively in writing as appropriate for the needs of the audience.
Speaking - Talking to others to convey information effectively.
Instructing - Teaching others how to do something.
Service Orientation - Actively looking for ways to help people.
Complex Problem Solving - Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
Troubleshooting - Determining causes of operating errors and deciding what to do about them.
Judgment and Decision Making - Considering the relative costs and benefits of potential actions to choose the most appropriate one.
Experience and Education:
High School Diploma (or GED or High School Equivalence Certificate).
Associate degree or equivalent training and certification.
5+ years of experience in data engineering including SQL, data warehousing, cloud-based data platforms.
Databricks experience.
2+ years Project Lead or Supervisory experience preferred.
Must be legally authorized to work in the United States. We are unable to sponsor or take over sponsorship at this time.
Software Engineer
Remote applications support engineer job
Front-leaning full-stack Software Engineer role (React, TypeScript, Node.js, AWS, data at scale)
100% Remote
Compensation: $170K-$200K + 10% bonus
Full-time W-2 Employment with medical benefits
Client: Late stage (10 years old) Adtech startup - 300+ employees, 65 Engineers
Core Qualifications
Minimum of 10 years' experience as a Software Engineer
Must have exposure to Object-Oriented Design, Analysis, and Programming in several of the following languages: JavaScript, TypeScript, Python, Node.js, AngularJS, React/React Native, and Vue, as well as knowledge of API, ORM, cloud (AWS), SOA, SaaS, messaging, stream-processing, and SQL data-store technologies.
Must be able to evaluate and modify complex database stored procedures, database structures, and have familiarity with containerization and scaling of SaaS platform services.
Must be able to deep-dive into various applications and data stores to produce meaningful insights, profiling and tracing, operational intelligence, customer experience visualizations, and proactive trend analyses.
Can quickly consume and understand business strategy and operating models; can apply gap analysis techniques to create long-term technical product strategy.
Can ensure technical product and social capabilities match business needs and goals.
Can effectively communicate goals, metrics, and value propositions across the Engineering Organization.
Can facilitate design, development, and support of existing and new products between cross-functional business stakeholders.
Assist team members in solving complex use cases and systems while leading technical change and transformation in parallel.
Must have knowledge around application system services, communication protocols, and standard industry technologies.
Must be passionate about creating solutions, and solving problems - in the right way, at the right time, and for the right reasons.
Must be teachable, give and receive feedback, and demonstrate success in their discipline on a consistent and transparent basis.
Education
Minimum of 10 years of experience in a product, engineering, development, or technical delivery position.
Bachelor of Science Degree in Computer Science or similar
Data Engineer
Remote applications support engineer job
We are looking for a Data Engineer in Austin, TX (fully remote - MUST work CST hours).
Job Title: Data Engineer
Contract: 12 Months
Hourly Rate: $75- $82 per hour (only on W2)
Additional Notes:
Fully remote - MUST work CST hours
SQL, Python, dbt. Utilize geospatial data tools (PostGIS, ArcGIS/ArcPy, QGIS, GeoPandas, etc.) to optimize and normalize spatial data storage, and run spatial queries and processes to power analysis and data products
Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python and pipeline and transformation tools like Airflow and dbt
Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shapefiles, polygons, and metadata)
Operationalize data products with detailed documentation, automated data quality checks, and change alerts
Support data access through various sharing platforms, including dashboard tools
Troubleshoot failures in data processes, pipelines, and products
Communicate and educate consumers on data access and usage, managing transparency in metric and logic definitions
Collaborate with other data scientists, analysts, and engineers to build full-service data solutions
Work with cross-functional business partners and vendors to acquire and transform raw data sources
Provide frequent updates to the team on progress and status of planned work
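The spatial queries described above would in practice run through the listed tools (PostGIS, GeoPandas), but the core predicate they implement can be sketched in pure Python. Below is a minimal ray-casting point-in-polygon test; the "service area" polygon and coordinates are illustrative, not real POI data.

```python
def point_in_polygon(x, y, polygon):
    """Return True if (x, y) falls inside `polygon`, a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle `inside` for every polygon edge crossed by a horizontal
        # ray extending rightward from the query point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical service area: a unit square.
service_area = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, service_area))  # True (inside)
print(point_in_polygon(2.0, 0.5, service_area))  # False (outside)
```

A production pipeline would delegate this to an indexed spatial engine (e.g. a PostGIS `ST_Contains` query), which also handles edge cases like holes and geographic projections.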
About us:
Harvey Nash is a national, full-service talent management firm specializing in technology positions. Our company was founded with a mission to serve as the talent partner of choice for the information technology industry.
Our company vision has led us to incredible growth and success in a relatively short period of time and continues to guide us today. We are committed to operating with the highest possible standards of honesty, integrity, and a passionate commitment to our clients, consultants, and employees.
We are part of Nash Squared Group, a global professional services organization with over forty offices worldwide.
For more information, please visit us at ******************************
Harvey Nash provides benefits; please review: 2025 Benefits -- Corporate
Regards,
Dinesh Soma
Recruiting Lead
Data Engineer (Databricks)
Applications support engineer job in Columbus, OH
ComResource is searching for a highly skilled Data Engineer with a background in SQL and Databricks who can design and construct scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition.
Requirements:
Design, construct, install, test and maintain data management systems.
Build high-performance algorithms, predictive models, and prototypes.
Ensure that all systems meet the business/company requirements as well as industry practices.
Integrate up-and-coming data management and software engineering technologies into existing data structures.
Develop set processes for data mining, data modeling, and data production.
Create custom software components and analytics applications.
Research new uses for existing data.
Employ an array of technological languages and tools to connect systems together.
Recommend different ways to constantly improve data reliability and quality.
Qualifications:
5+ years data quality engineering
Experience with Cloud-based systems, preferably Azure
Databricks and SQL Server testing
Experience with ML tools and LLMs
Test automation frameworks
Python and SQL for data quality checks
Data profiling and anomaly detection
Documentation and quality metrics
Healthcare data validation experience preferred
Test automation and quality process development
Plus:
Azure Databricks
Azure Cognitive Services integration
Databricks Foundation Model integration
Claude API implementation a plus
Python and NLP frameworks (spaCy, Hugging Face, NLTK)
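The "Python and SQL for data quality checks" and "data profiling and anomaly detection" items above can be illustrated with a small stdlib-only sketch: null-rate and range validation plus simple z-score anomaly flagging. Field names, sample values, and thresholds are illustrative assumptions, not this employer's rules.

```python
import statistics

def run_quality_checks(rows, field, lo, hi, max_null_rate=0.1, z_cutoff=3.0):
    """Profile `field` across a list of dict rows and return check results."""
    values = [r.get(field) for r in rows]
    null_rate = sum(v is None for v in values) / len(values)
    present = [v for v in values if v is not None]
    out_of_range = [v for v in present if not (lo <= v <= hi)]
    mean = statistics.mean(present)
    stdev = statistics.stdev(present) if len(present) > 1 else 0.0
    anomalies = [v for v in present if stdev and abs(v - mean) / stdev > z_cutoff]
    return {
        "null_rate_ok": null_rate <= max_null_rate,  # too many missing values?
        "out_of_range": out_of_range,                # values outside [lo, hi]
        "anomalies": anomalies,                      # statistical outliers
    }

# Hypothetical claim amounts: one null and one out-of-range value.
claims = [{"amount": a} for a in (120, 130, 125, None, 128, 9_999)]
print(run_quality_checks(claims, "amount", lo=0, hi=1000))
```

In a test-automation framework these checks would typically run as scheduled assertions against Databricks or SQL Server tables, failing the pipeline (or raising an alert) when a check does not pass.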
Data Engineer
Remote applications support engineer job
This is a fully remote 12+ month contract position. No C2C or 3rd party candidates will be considered.
Data Engineer (AI & Automation)
We are seeking a Data Engineer with hands-on experience using AI-driven tools to support automation, system integrations, and continuous process improvement across internal business systems. This role will focus on building and maintaining scalable data pipelines, enabling intelligent workflows, and improving data accessibility and reliability.
Key Responsibilities
Design, build, and maintain automated data pipelines and integrations across internal systems
Leverage AI-enabled tools to streamline workflows and drive process improvements
Develop and orchestrate workflows using Apache Airflow and n8n AI
Model, transform, and optimize data in Snowflake and Azure SQL Data Warehouse
Collaborate with business and technical teams to identify automation opportunities
Ensure data quality, reliability, and performance across platforms
Required Qualifications
Experience as a Data Engineer or similar role
Hands-on experience with Apache Airflow and modern workflow orchestration tools
Strong experience with Snowflake and Azure SQL Data Warehouse
Familiarity with AI-driven automation and integration tools (e.g., n8n AI)
Strong SQL skills and experience building scalable data pipelines
Preferred Qualifications
Experience integrating multiple internal business systems
Background in process improvement or operational automation
Experience working in cloud-based data environments (Azure preferred)
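Both Apache Airflow and n8n, named above, model pipelines as dependency graphs (DAGs) of tasks. As a hedged, stdlib-only sketch of the core idea those orchestrators implement, the runner below executes each task only after its upstream dependencies finish, using Kahn's topological sort; the three-step pipeline is a hypothetical example, not a real workflow.

```python
from collections import deque

def run_dag(tasks, deps):
    """tasks: {name: callable}; deps: {name: [upstream names]}. Returns run order."""
    indegree = {t: len(deps.get(t, [])) for t in tasks}
    downstream = {t: [] for t in tasks}
    for t, ups in deps.items():
        for u in ups:
            downstream[u].append(t)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()  # run the task body once all its upstreams are done
        order.append(t)
        for d in downstream[t]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(tasks):
        raise ValueError("cycle detected in DAG")
    return order

# Hypothetical three-step pipeline: extract -> transform -> load.
log = []
tasks = {n: (lambda n=n: log.append(n)) for n in ("extract", "transform", "load")}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, and observability on top of this ordering; the dependency declaration above corresponds to Airflow's `extract >> transform >> load` operator syntax.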
Workday Software Engineer
Remote applications support engineer job
Positions: Software Engineer, Workday
Duration: Full time position
Type: Remote work model.
A day in this role:
This fully remote role works extensively on Workday integration projects and is responsible for designing, developing, configuring, integrating, and maintaining Workday applications and solutions. Collaborates with cross-functional teams to support business needs. Operates independently with minimal supervision.
Must haves:
7+ years of Workday Integration experience.
Understanding of Workday data conversion patterns and tools.
Proficiency in Workday integration tools:
EIB
Connectors
Workday Studio
Familiarity with Workday Business Process Framework.
Experience with Workday modules: HCM, Benefits, Time Tracking, Payroll and Security
Workday certifications.
Working knowledge of:
Workday Extend
Workday Report Writer
Calculated fields
Prism Analytics
RaaS (Reports as a Service)
Strong understanding of:
Web technologies
Mobile platforms
APIs (WSDL, SOAP, REST)
SQL
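RaaS, listed above, exposes a Workday custom report over REST. The sketch below only assembles a request URL of the common `/ccx/service/customreport2/{tenant}/{owner}/{report}` form; the host, tenant, report owner, and report name are hypothetical placeholders, and a real call would add authentication (e.g. OAuth 2.0 or basic auth over HTTPS) and error handling.

```python
from urllib.parse import urlencode

def raas_url(host, tenant, owner, report, params=None):
    """Build a Workday RaaS custom-report URL (hypothetical names throughout)."""
    base = f"https://{host}/ccx/service/customreport2/{tenant}/{owner}/{report}"
    # Request JSON output; RaaS also supports formats such as csv and xml.
    query = urlencode({**(params or {}), "format": "json"})
    return f"{base}?{query}"

# Hypothetical tenant and report.
url = raas_url("impl.workday.com", "acme_tenant", "jdoe", "Active_Workers")
print(url)
```

The same report can also be consumed as XML for Workday Studio integrations or fed into EIB templates; the JSON form is convenient for lightweight downstream consumers.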
Responsibilities:
Works with constituent departments to fulfill design, application development, configuration, integration, support, and maintenance requests.
Assists in scope definition and estimation of work effort.
Contributes to the business requirements gathering process.
Works with the architecture team to ensure that design standards are followed.
Adheres to defined processes.
Develops application code to fulfill project requests.
Creates technical documentation as required.
Drives incremental improvements to team technical processes and practices.
Mentors development team members in technical complexities of assigned work.
Stays up to date with Workday releases, updates, and new features, and applies this knowledge to improve the design and performance of integration and Extend solutions.
Qualifications:
Bachelor's degree in computer science, a related field, or four years of related work experience is required.
Three to five years of professional experience is required.
Strong understanding of web, mobile, API, and SQL technologies.
Broad knowledge of software development practices and procedures.
Experience working with Workday modules such as HCM, Benefits, Time Tracking, Payroll and Security.
Good understanding of Workday Business Process Framework.
Good knowledge of Workday integration tools such as EIB, Connectors, Workday Studio.
Working knowledge of Workday Extend.
Working knowledge of Workday Report Writer, calculated fields, Prism.
Working knowledge of Web Services, APIs (WSDL, SOAP, REST) and RaaS.
Knowledge of Workday data conversion patterns and toolset.
Aptitude for continuous learning and improvement.
Strong teamwork skills.
Software Engineer (Remote)
Remote applications support engineer job
Remote (proximity to Chicago, Nashville or Manhattan would be a big plus)
Regular travel is not required, but the role will need to travel to the corporate office twice a year.
Our client is looking to add a Software Developer that will be responsible for designing, developing, and maintaining high-quality software solutions that support the Firm's digital platforms. This role ensures the stability, scalability, and performance of all applications and services, while collaborating with cross-functional teams to drive continuous improvement in development practices and operational efficiency.
Responsibilities
Design and implement stable, scalable, and extensible software solutions.
Ensure adherence to secure software development lifecycle (SDLC) best practices and standards.
Drive the design and development of services and applications to meet defined service level agreements (SLAs).
Work closely with end users and stakeholders to gather requirements and iterate on solutions that deliver business value.
Proactively identify and resolve any obstacles affecting operational efficiency and service continuity.
Provide ongoing support for developed applications and services, ensuring timely issue resolution.
Participate in the Firm's change and incident management processes, adhering to established protocols.
Software Development & Architecture
Develop and maintain features for web-enabled applications using C# .NET Core.
Write clean, scalable code with a focus on maintainability and performance.
Implement robust, efficient SQL-based solutions, preferably using MS SQL.
Develop and maintain user interfaces using modern frameworks, preferably Angular or Blazor.
Ensure solutions are designed with an emphasis on security, efficiency, and optimization.
Contribute to continuous integration and continuous delivery (CI/CD) pipelines, automating processes where possible.
Collaboration & Optimization
Collaborate closely with business analysts, quality assurance, and other developers to ensure solutions meet both functional and non-functional requirements.
Foster a culture of positive, open communication across diverse teams, with a focus on collaboration and shared goals.
Engage in regular reviews and feedback sessions to drive continuous improvement in development processes and practices.
Provide mentorship and guidance to junior developers where appropriate, supporting their professional growth.
Professional Conduct
Demonstrates commitment to the firm's core values, including Accountability, Integrity, Excellence, Grit, and Love.
Ensures all activities align with business objectives and project timelines.
Communicates effectively, openly exchanging ideas and listening with consideration.
Maintains a proactive, solution-oriented mindset when addressing challenges.
Takes ownership of responsibilities and holds others accountable for their contributions.
Continuously seeks opportunities to optimize processes, improve performance, and drive innovation.
Qualifications
1-3+ years of experience in C# .NET Core development
Competence in SQL, preferably MS SQL
Competence in UI work, preferably Angular and/or Blazor
Strong structured problem-solving skills, with a history of using systematic and fact-based processes to improve mission-critical services.
A focus on optimization and efficiency in processes.
Experience working in a financial services firm would be a big plus
Demonstrated expertise in fostering a culture of positive collaboration among cross-functional teams with diverse personalities, skill sets, and levels of experience.
Highly developed communication skills
A sense of urgency and a bias for action.
For all non-bonus, non-commission direct hire positions: The anticipated salary range for this position is ($95,000 - $120,000). Actual salary will be based on a variety of factors including relevant experience, knowledge, skills and other factors permitted by law. A range of medical, dental, vision, retirement, paid time off, and/or other benefits are available.
Data Entry Product Support - No Experience
Remote applications support engineer job
We're looking for Customer Support Product Testers across the US to work from home and help top brands improve their products before they hit the market.