
Data engineer jobs near me

5,211 jobs
  • Data Engineer

    Harvey Nash

    Remote data engineer job

    We are looking for a Data Engineer in Austin, TX (fully remote; MUST work CST hours).

    Job Title: Data Engineer
    Contract: 12 months
    Hourly Rate: $75-$82 per hour (W2 only)
    Additional Notes: Fully remote; MUST work CST hours. Core skills: SQL, Python, dbt.

    Responsibilities:
    • Utilize geospatial data tools (PostGIS, ArcGIS/ArcPy, QGIS, GeoPandas, etc.) to optimize and normalize spatial data storage, and run spatial queries and processes to power analysis and data products
    • Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python, and pipeline/transformation tools like Airflow and dbt
    • Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shapefiles, polygons, and metadata)
    • Operationalize data products with detailed documentation, automated data quality checks, and change alerts
    • Support data access through various sharing platforms, including dashboard tools
    • Troubleshoot failures in data processes, pipelines, and products
    • Communicate and educate consumers on data access and usage, managing transparency in metric and logic definitions
    • Collaborate with other data scientists, analysts, and engineers to build full-service data solutions
    • Work with cross-functional business partners and vendors to acquire and transform raw data sources
    • Provide frequent updates to the team on progress and status of planned work

    About us: Harvey Nash is a national, full-service talent management firm specializing in technology positions. Our company was founded with a mission to serve as the talent partner of choice for the information technology industry. Our company vision has led us to incredible growth and success in a relatively short period of time and continues to guide us today. We are committed to operating with the highest possible standards of honesty, integrity, and a passionate commitment to our clients, consultants, and employees. We are part of Nash Squared Group, a global professional services organization with over forty offices worldwide. For more information, please visit us at ******************************

    Harvey Nash will provide benefits; please review: 2025 Benefits -- Corporate

    Regards,
    Dinesh Soma, Recruiting Lead
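    The spatial-query work this posting describes would run on PostGIS or GeoPandas in practice, but the core idea can be illustrated with a dependency-free sketch: a ray-casting point-in-polygon test (the polygon and points below are made-up illustration data, not anything from the role):

    ```python
    # Minimal point-in-polygon test (ray casting), a toy stand-in for the
    # spatial queries a tool like PostGIS or GeoPandas would run at scale.
    # The polygon and test points below are made-up illustration data.

    def point_in_polygon(point, polygon):
        """Return True if `point` (x, y) lies inside `polygon`,
        a list of (x, y) vertices, using the even-odd ray-casting rule."""
        x, y = point
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            # Does a horizontal ray from (x, y) cross edge (x1,y1)-(x2,y2)?
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
    print(point_in_polygon((2.0, 2.0), square))   # inside -> True
    print(point_in_polygon((5.0, 2.0), square))   # outside -> False
    ```

    A real geospatial pipeline would push this predicate down into the database (e.g. a PostGIS `ST_Contains` query) rather than looping in Python.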
    $75-82 hourly 2d ago
  • Data Engineer II [79768]

    Onward Search (4.0 company rating)

    Remote data engineer job

    Onward Search is partnering with a leading global music streaming company to hire a Data Engineer II (ML Platform Engineer). You'll join the Hendrix ML Platform team, building a robust platform for training and serving ML models at scale. This is a 100% remote contract role (EST hours) with the possibility of extension.

    What You'll Do
    • Support migration to a new orchestration system while enabling data lineage across training pipelines.
    • Implement lineage features for training, inference, and feature generation.
    • Provide insights into training data usage to meet compliance requirements.
    • Add OpenLineage support for ML endpoints and connect feature lineage to upstream pipelines.
    • Help users transition to the new system, providing technical support and guidance.
    • Communicate progress, risks, and obstacles proactively.

    What We're Looking For
    • 3+ years building production ML infrastructure with Python or Go.
    • Experience with a public cloud platform (GCP preferred).
    • Strong background in PyTorch, TensorFlow, and Ray (Ray Tune a plus).
    • Understanding of ML pipelines, orchestration, and compliance processes.
    • Familiarity with Flyte, Kubeflow, or Hugging Face.
    • Experience with large-scale platform migrations is a plus.

    Perks & Benefits (via Onward Search)
    • Medical, Dental, and Vision Insurance
    • Life Insurance
    • 401k Program
    • Commuter Benefits
    • eLearning & Education Reimbursement
    • Ongoing Training & Development

    To qualify for benefits, you must work 30+ hours per week on assignments lasting at least 10 weeks.
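    For context on the lineage work mentioned above: OpenLineage models lineage as JSON run events tying a job run to its input and output datasets. Below is a hand-rolled sketch of that general event shape using only the stdlib; a real integration would use the official `openlineage-python` client, and the namespace, job, and dataset names here are made up:

    ```python
    # Hand-rolled sketch of an OpenLineage-style run event. A production
    # pipeline would emit these via the openlineage-python client; the
    # job/dataset names below are hypothetical illustration values.
    import json
    import uuid
    from datetime import datetime, timezone

    def make_run_event(event_type, job_name, inputs, outputs):
        """Build a dict following the general shape of an OpenLineage RunEvent."""
        return {
            "eventType": event_type,               # e.g. START / COMPLETE / FAIL
            "eventTime": datetime.now(timezone.utc).isoformat(),
            "run": {"runId": str(uuid.uuid4())},
            "job": {"namespace": "ml-platform", "name": job_name},
            "inputs": [{"namespace": "warehouse", "name": n} for n in inputs],
            "outputs": [{"namespace": "models", "name": n} for n in outputs],
            "producer": "https://example.com/lineage-demo",  # placeholder URI
        }

    event = make_run_event(
        "COMPLETE",
        job_name="train_ranking_model",
        inputs=["features.listening_history"],
        outputs=["ranking_model_v2"],
    )
    print(json.dumps(event, indent=2))
    ```

    Emitting one such event at the start and end of every training, inference, and feature-generation run is what makes "which data trained which model" queryable for compliance.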
    $105k-149k yearly est. 4d ago
  • Junior Data Engineer

    Brooksource (4.1 company rating)

    Data engineer job in Columbus, OH

    Columbus, OH (Hybrid). Brooksource is looking for an entry-level Data Engineer to join our Banking Client in January 2026. This role will be part of a professional development program where you will receive technical and professional training. If you are interested, please apply!

    The Data Engineer on the Enterprise Data and Analytics Operations Team will work as part of the larger team in the design, architecture, and optimization of Brooksource's Banking client's data pipelines. This is an entry-level position requiring the individual to be curious and have the aptitude to learn while supporting the team's overall efforts to optimize the data pipelines utilized within the bank. To be successful, the candidate must be driven and well-organized, with strong communication skills. It is essential for the individual to be self-motivated and tenacious and to thrive in a collaborative, fast-paced environment while completing tasks on agreed schedules.

    Primary Responsibilities:
    • Support the design, development, and optimization of data processing pipelines and workflows.
    • Aid in the implementation and management of data solutions using AWS services (S3, Glue, Athena, and Lambda).
    • Perform data wrangling and exploration to support the needs of the business.
    • Support the development and maintenance of robust ETL processes.
    • Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
    • Prepare and present design and implementation documentation to multiple stakeholders.
    • Aid in production support efforts.
    • Support the identification of areas where automation can be implemented or improved.
    • Schedule and facilitate meetings as needed.
    • Perform other duties as assigned.

    Job Requirements:
    Minimum Requirements: 1 year of experience in a software development or automation development role with a bachelor's degree, or equivalent transferable experience through coursework, internships, or work experience in lieu of participation in the Elevate Program.

    Skills:
    • Experience with ETL development.
    • Experience with or exposure to AWS cloud services (e.g., S3, Glue, Athena, or Lambda).
    • Ability to investigate and analyze information and draw conclusions based upon data and facts.
    • Strong technical skills and aptitude with a willingness to learn new languages and technologies.
    • Excellent verbal and written communication.
    • Strong technical writing capability.
    • Ability to translate technology into relatable concepts for business partners and stakeholders.
    • Highly motivated with strong organizational, analytical, decision-making, and problem-solving skills.
    • Ability to build strong partnerships and to work collaboratively with all business and IT areas.
    • Ability to effectively handle multiple priorities and to prioritize and execute tasks in a high-pressure environment.
    • High level of professionalism, confidence, and ability to build credibility with team members and business partners.

    Preferred Requirements:
    • Experience with or exposure to Agile methodologies with an understanding of DataOps principles.
    • Exposure to Snowflake.
    • Exposure to MS SQL Server.
    • Experience developing CI/CD workflows and tools.
    • Experience with automation scripting.
    • Experience with or understanding of configuration management, test-driven development, and release management.
    • Experience in the Financial Services Industry.
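    The "data wrangling" and ETL work this role centers on boils down to steps like the following stdlib-only toy: parse raw records, normalize types, and drop rows that fail a quality check (all data below is made up; a bank would do this in Glue/Spark, not a Python loop):

    ```python
    # Toy example of ETL-style wrangling: parse raw CSV records, normalize
    # types, and reject rows failing a data-quality check. Data is made up.
    import csv
    import io

    raw = """account_id,balance,opened
    A1001, 2500.00 ,2024-01-15
    A1002,,2024-02-07
    A1003, 913.50 ,2024-03-22
    """

    def extract_transform(text):
        """Yield cleaned rows, skipping records with a missing balance."""
        for row in csv.DictReader(io.StringIO(text)):
            balance = row["balance"].strip()
            if not balance:            # quality check: reject missing balances
                continue
            yield {"account_id": row["account_id"].strip(),
                   "balance": float(balance),
                   "opened": row["opened"].strip()}

    rows = list(extract_transform(raw))
    print(len(rows))   # A1002 is rejected, leaving 2 clean rows
    ```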
Brooksource provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
    $86k-117k yearly est. 2d ago
  • Cloud Data Engineer

    GHR Healthcare (3.7 company rating)

    Data engineer job in Columbus, OH

    This is a 6-month contract and could be two different positions, or one if someone has experience in both. This is a hybrid onsite role, so candidates need to be local.

    Cloud Data Engineer at S4 level
    · Minimum 5 years of hands-on cloud data engineering experience (especially Azure, Databricks, and MS Fabric) and 10 to 15 years overall.
    · Hands-on experience with ELT and ETL pipeline development, data modeling, AI/ML pipeline development, and Unity Catalog and Purview engineering.
    · Azure cloud certifications preferred.
    $93k-135k yearly est. 2d ago
  • Senior Data Engineer (W2 only)

    CBTS (4.9 company rating)

    Data engineer job in Columbus, OH

    • Bachelor's Degree in Computer Science or related technical field AND 5+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java.
    • Proficiency with Azure data services, such as Azure Data Lake, Azure Data Factory, and Databricks.
    • Expertise using cloud security (i.e., Active Directory, network security groups, and encryption services).
    • Proficient in Python for developing and maintaining data solutions.
    • Experience with optimizing or managing technology costs.
    • Ability to build and maintain a data architecture supporting both real-time and batch processing.
    • Ability to implement industry-standard programming techniques by mastering advanced fundamental concepts, practices, and procedures, and the ability to analyze and solve problems in existing systems.
    • Expertise with unit testing, integration testing, and performance/stress testing.
    • Database management skills and understanding of legacy and contemporary data modeling and system architecture.
    • Demonstrated leadership skills, team spirit, and the ability to work cooperatively and creatively across an organization.
    • Experience on teams leveraging Lean or Agile frameworks.
    $68k-95k yearly est. 4d ago
  • Data Engineer- ETL/ELT - Hybrid/Remote

    Crown Equipment Corporation (4.8 company rating)

    Remote data engineer job

    Crown Equipment Corporation is a leading innovator in world-class forklift and material handling equipment and technology. As one of the world's largest lift truck manufacturers, we are committed to providing the customer with the safest, most efficient, and ergonomic lift truck possible to lower their total cost of ownership. Indefinite US work authorization required.

    Primary Responsibilities
    • Design, build, and optimize scalable data pipelines and stores.
    • Clean, prepare, and optimize data for consumption in applications and analytics platforms.
    • Participate in peer code reviews to uphold internal standards.
    • Ensure procedures are thoroughly tested before release; write unit tests and record test results.
    • Detect, define, and debug programs whenever problems arise.
    • Provide training to users and knowledge transfer to support personnel and other staff members as required.
    • Prepare system and programming documentation in accordance with internal standards.
    • Interface with users to extract functional needs and determine requirements.
    • Conduct detailed systems analysis to define scope and objectives and design solutions.
    • Work with Business Analysts to help develop and write system requirements.
    • Establish project plans and schedules and monitor progress, providing status reports as required.

    Qualifications
    • Bachelor's degree in Computer Science, Software/Computer Engineering, Information Systems, or related field is required.
    • 4+ years' experience in SQL, ETL, ELT, and SAP data is required.
    • Python, Databricks, and Snowflake experience preferred.
    • Strong written, verbal, analytical, and interpersonal skills are necessary.

    Remote Work: Crown offers hybrid remote work for this position. A reasonable commute is necessary as some onsite work is required. Relocation assistance is available.

    Work Authorization: Crown will only employ those who are legally authorized to work in the United States. This is not a position for which sponsorship will be provided. Individuals with temporary visas or who need sponsorship for work authorization now or in the future are not eligible for hire. No agency calls, please.

    Compensation and Benefits: Crown offers an excellent wage and benefits package for full-time employees including Health/Dental/Vision/Prescription Drug Plan, Flexible Benefits Plan, 401K Retirement Savings Plan, Life and Disability Benefits, Paid Parental Leave, Paid Holidays, Paid Vacation, Tuition Reimbursement, and much more. EOE Veterans/Disabilities
    $87k-109k yearly est. 3d ago
  • Senior Data Engineer

    Outcomes (3.7 company rating)

    Data engineer job in Dublin, OH

    The Senior Data Engineer is responsible for modeling complex problems, building pipelines, maintaining ETL processes, and troubleshooting issues within a cloud environment. They utilize cloud databases and Databricks to support a robust infrastructure which drives large, revenue-generating data strategies. Additionally, the Senior Data Engineer suggests architecture ideas, works closely with architects, and assists junior engineers with their tasks and approaches.

    Essential Duties and Responsibilities
    • Participate in use case feasibility discussions and translate business ideas and problems into use cases.
    • Provide support as needed to maintain and update models running in the production environment.
    • Develop and maintain complex ETL processes and algorithms.
    • Own and enhance ETL processes to move data from PMS to cloud databases.
    • Monitor and troubleshoot processes on a daily basis.
    • Document new and existing processes.
    • Proactively and independently identify performance issues and recommend enhancements.
    • Modernize legacy data models and pipelines using ETL tools and cloud database capabilities.
    • Work closely with the Chief Architect to make improvements and larger architectural changes.
    • Recommend or suggest improvements to the existing architecture.
    • Support junior team members with technical questions, code reviews, and code enhancements.
    • Understand business problems and needs and propose solutions.

    Required Qualifications
    • Work well with people of various backgrounds and education levels and establish cooperative working relationships with all coworkers.
    • Timely and effectively communicate information to, and consult with, others in order to complete work assignments.
    • Act in a responsible, trustworthy, and ethical manner that considers the impact and consequences of one's actions or decisions.
    • Communicate ideas, thoughts, and facts in writing through the use of proper grammar, spelling, document formatting, and sentence structure.
    • Identify and respond to current and future clients' needs; provide excellent client service.
    • Evaluate and analyze problems or tasks from multiple perspectives; adaptively employ problem-solving methods to find creative or novel solutions; use logical, systematic, and sequential processes to solve problems.
    • Complete assigned job tasks in an accurate and timely manner.
    • Carefully prepare for meetings and presentations; follow up with others to ensure that agreements, tasks, or commitments have been fulfilled.
    • Demonstrate commitment to achieving the Company's core business objectives of increasing the role of pharmacy and improving patient health in America.

    Desirable Qualifications
    • Experience working with healthcare technology.
    • Experience resolving issues that do not have clear answers.
    • Experience with machine learning and applied statistics.
    • Experience with Java and Spring Boot.
    • Thorough experience with Databricks, Glue, Talend, Informatica, or other similar ETL tools.
    • Experience working in cloud databases (Redshift, BigQuery).
    • Highly motivated, with excellent interpersonal, problem-solving, and technical skills.
    • High sense of urgency and accountability.
    • Adaptable, friendly, and able to work with a team.
    • Passion for data and digging into the minutiae of datasets.
    • Takes calculated risks based on data-driven analytics.
    • A self-starter who enjoys working in a fast-paced environment.

    Education & Experience Requirements
    • Minimum 7 years of work experience, including 6+ years of data analysis experience.
    • Expertise in algorithm design, machine learning, and applied statistics.
    • Proven track record in the use of SQL, specifically in cloud databases, and in working with data, including extracting information, validating data, and creating and maintaining custom data structures.
    • Bachelor's degree in Business, Computer Science, Information Systems, or an equivalent combination of education and experience.
    • Excellent attendance.

    Physical Requirements
    The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job: frequent sitting in a stationary position at a desk; occasional standing, walking, stooping, kneeling, squatting, and climbing stairs; occasional twisting of the body; occasional reaching by extending hands and arms in any direction; occasional lifting, pulling, or pushing.

    What's In It For You?
    Medical, Dental and Vision Plans; Voluntary Benefits; HSA & FSA; Fertility & Family Planning Benefits; Paid Parental Leave; Adoption Assistance Program; Employee Resource Groups; Flex PTO for Exempt Associates and up to 15 PTO days in the first year of employment for non-exempt associates; 11 Paid Holidays; Corporate Wellness Program; 401k Employer Match & Roth Option Available (immediate eligibility).
    $82k-115k yearly est. 3d ago
  • Big Data Developer

    Psi (Proteam Solutions) (3.9 company rating)

    Data engineer job in Columbus, OH

    The Big Data Developer will play a critical role on the Big Data engineering team, designing and implementing large-scale data processing systems that power scientific research and innovation. The ideal candidate has hands-on experience building enterprise-grade data pipelines, working with distributed systems, and optimizing data processing workflows. This is a long-term project (12+ months) with high visibility, cutting-edge tools, and opportunities to influence technical direction.

    **U.S. CITIZENSHIP OR PERMANENT RESIDENCY REQUIRED. VISA OR EAD STATUS NOT ACCEPTABLE FOR THIS POSITION. NO EXCEPTIONS**

    What You'll Do
    ✔ Data Pipeline Design & Development: Design, build, and deploy scalable data pipelines for ingesting, processing, transforming, and storing high-volume datasets. Implement streaming and batch-processing solutions using Hadoop, Spark, and cloud-based tools.
    ✔ Data Architecture & Engineering: Develop and maintain data architecture and data flow models. Ensure data reliability, accuracy, and integrity across all environments. Support data warehousing strategies and best practices.
    ✔ Data Quality, Security & Compliance: Implement automated data validation, error handling, and monitoring. Ensure compliance with internal security controls and regulatory standards. Partner with governance teams to enforce data quality and security guidelines.
    ✔ Cross-Functional Collaboration: Work closely with data scientists, analysts, product teams, and application developers. Translate business requirements into robust technical solutions. Participate in Agile ceremonies and contribute to technical design discussions.
    ✔ Performance Optimization: Tune Spark applications, Hadoop jobs, and distributed data systems for performance and cost efficiency. Troubleshoot bottlenecks and implement improvements to system performance.
    ✔ Technical Leadership: Provide mentorship to junior developers and contribute to coding standards, best practices, and technical documentation.

    Required Skills & Qualifications
    • 4+ years of Big Data development experience in Hadoop ecosystems
    • 2+ years of hands-on development with Apache Spark
    • Proficiency in Java, Scala, or Python
    • Strong understanding of distributed systems, ETL, data warehousing, and data modeling concepts
    • Experience with large-scale datasets, performance tuning, and troubleshooting
    • Strong problem-solving, communication, and collaboration skills
    • Bachelor's degree in Computer Science, Engineering, or related discipline

    Preferred Skills
    • Experience working with AWS cloud services (EMR, S3, Lambda, Glue, etc.)
    • Experience with Spark 3.x or 4.x
    • Exposure to Kubernetes, Airflow, or similar orchestration tools
    • Familiarity with CI/CD and DevOps automation for data engineering

    Why This Opportunity Stands Out
    • Long-term project stability (12+ months, likely extension)
    • Ability to work on high-impact scientific and research-driven datasets
    • Hands-on cloud modernization (AWS) and next-generation big data tooling
    • Collaborative and innovative engineering culture
    $82k-106k yearly est. 2d ago
  • Data Engineer

    Iqventures

    Data engineer job in Dublin, OH

    The Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently. The ideal candidate is self-directed, thrives in a fast-paced project environment, and is comfortable making technical decisions and architectural recommendations. The ideal candidate has prior experience in modern data platforms, most notably Databricks and the "lakehouse" architecture. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals. Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.

    Responsibilities:
    • Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
    • Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
    • Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
    • Consume and analyze data from the data pipeline to infer, predict, and recommend actionable insight, which will inform operational and strategic decision-making to produce better results.
    • Empower departments and internal consumers with metrics and business intelligence to operate and direct our business, better serving our end customers.
    • Determine technical and behavioral requirements, identify strategies as solutions, and select solutions based on resource constraints.
    • Work with the business, process owners, and IT team members to design solutions for data and advanced analytics.
    • Perform data modeling and prepare data in databases for analysis and reporting through various analytics tools.
    • Play a technical specialist role in championing data as a corporate asset.
    • Provide technical expertise in collaborating with project and other IT teams, internal and external to the company.
    • Contribute to and maintain system data standards.
    • Research and recommend innovative, and where possible automated, approaches for system data administration tasks; identify approaches that leverage our resources and provide economies of scale.
    • Engineer systems that balance and meet performance, scalability, recoverability (including backup design), maintainability, security, and high availability requirements and objectives.

    Skills:
    • Databricks and related: SQL, Python, PySpark, Delta Live Tables, data pipelines, AWS S3 object storage, Parquet/columnar file formats, AWS Glue.
    • Systems Analysis: The application of systems analysis techniques and procedures, including consulting with users, to determine hardware, software, platform, or system functional specifications.
    • Time Management: Managing one's own time and the time of others.
    • Active Listening: Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
    • Critical Thinking: Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems.
    • Active Learning: Understanding the implications of new information for both current and future problem-solving and decision-making.
    • Writing: Communicating effectively in writing as appropriate for the needs of the audience.
    • Speaking: Talking to others to convey information effectively.
    • Instructing: Teaching others how to do something.
    • Service Orientation: Actively looking for ways to help people.
    • Complex Problem Solving: Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
    • Troubleshooting: Determining causes of operating errors and deciding what to do about them.
    • Judgment and Decision Making: Considering the relative costs and benefits of potential actions to choose the most appropriate one.

    Experience and Education:
    • High School Diploma (or GED or High School Equivalence Certificate).
    • Associate degree or equivalent training and certification.
    • 5+ years of experience in data engineering, including SQL, data warehousing, and cloud-based data platforms.
    • Databricks experience.
    • 2+ years of Project Lead or Supervisory experience preferred.
    • Must be legally authorized to work in the United States. We are unable to sponsor or take over sponsorship at this time.
    $76k-103k yearly est. 4d ago
  • Data Scientist with Hands On development experience with R, SQL & Python

    Central Point Partners (3.7 company rating)

    Data engineer job in Columbus, OH

    *Per the client, no C2Cs!* Central Point Partners is currently interviewing candidates in the Columbus, OH area for a large client. Only GCs and USCs. This position is hybrid (4 days onsite)! Only candidates who are local to Columbus, OH will be considered.

    Data Scientist with hands-on development experience with R, SQL & Python

    Summary: Our client is seeking a passionate, data-savvy Senior Data Scientist to join the Enterprise Analytics team to fuel our mission of growth through data-driven insights and opportunity discovery. This dynamic role uses a consultative approach with the business segments to dive into our customer, product, channel, and digital data to uncover opportunities for consumer experience optimization and customer value delivery. You will also enable stakeholders with actionable, intuitive performance insights that provide the business with direction for growth. The ideal candidate will have a robust mix of technical and communication skills, with a passion for optimization, data storytelling, and data visualization. You will collaborate with a centralized team of data scientists as well as teams across the organization including Product, Marketing, Data, Finance, and senior leadership. This is an exciting opportunity to be a key influencer in the company's strategic decisions and to learn and grow with our Analytics team.

    Notes from the manager: The critical skills will be Python or R and a firm understanding of SQL, along with a foundational understanding of what data is needed to perform studies now and in the future. As a high-level summary, this person will balance analysis with development, knowing when to jump in and when to step back to lend their expertise.

    Feature & Functional Design: Data scientists are embedded in the teams designing the feature. Their main job here is to define the data tracking needed to evaluate the business case: things like event logging, Adobe tagging, third-party data ingestion, and any other tracking requirements. They also consult on and outline if/when the business should be bringing data into the bank, and will help connect the business with CDAO and IT warehousing and data engineering partners should new data need to be brought forward.

    Feature Engineering & Development: The same data scientists stay involved as the feature moves into execution. They support all necessary functions (Amigo, QA, etc.) to ensure data tracking is in place when the feature goes live. They also begin preparing to support launch evaluation and measurement against the experimentation design or business case success criteria.

    Feature Rollout & Performance Evaluation: They own tracking the rollout, running A/B tests, and conducting impact analysis for all features in which they have been involved during the Feature & Functional Design and Feature Engineering & Development stages. They provide an unbiased view of how the feature performs against the original business case, along with making objective recommendations that will provide direction for the business. They roll off once the feature has matured through business case/experiment design and evaluation.

    In addition to supporting feature rollouts, data scientists on the team are also encouraged to pursue self-driven initiatives during periods when they are not actively supporting other projects. These initiatives may include designing experiments, conducting exploratory analyses, developing predictive models, or identifying new opportunities for impact.

    For more information about this opportunity, please contact Bill Hart at ************ AND email your resume to **********************************!
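    The A/B-test impact analysis this role describes often starts with something as simple as a two-proportion z-test on conversion counts. A stdlib-only sketch (the counts are made up; in practice you'd use `scipy.stats` or statsmodels and a pre-registered experiment design):

    ```python
    # Back-of-the-envelope two-proportion z-test, the kind of check behind
    # an A/B rollout impact analysis. The conversion counts are made up.
    import math

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """z-statistic for H0: the two conversion rates are equal."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_b - p_a) / se

    # Variant B converts 5.6% vs. control's 4.8% on 10k users each.
    z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
    print(round(z, 2))   # |z| > 1.96 suggests significance at the 5% level
    ```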
    $58k-73k yearly est. 1d ago
  • Exceptional Data Scientists

    Mercor

    Remote data engineer job

    Mercor is seeking data scientists to support one of the world's leading AI labs in building robust, high‑performance systems for next‑generation machine learning applications. In this role, you will focus on hands‑on data science tasks, such as designing experiments, gathering and preprocessing data, building and evaluating models, and collaborating closely with engineering teams to deploy production‑ready solutions. Ideal candidates should be proficient in Python (Jupyter Notebooks), familiar with machine learning frameworks like TensorFlow or PyTorch, and experienced in analyzing large datasets and building predictive models. In addition, you will write, review, and validate prompt‑based questions used to train AI systems.

    * * *

    **You are a good fit if you:**
    - Have 3+ years of professional experience in data science or applied analytics.
    - Are **highly skilled in Python and Jupyter notebooks.**
    - **Have experience using libraries including numpy, pandas, scipy, sympy, scikit-learn, torch, and tensorflow.**
    - Have a bachelor's degree in data science, statistics, computer science, or a related field in the U.S., Canada, New Zealand, UK, or Australia.
    - Have a strong background in one or more of the following areas: exploratory data analysis and statistical inference, machine learning workflows and model evaluation, feature engineering/data preprocessing/data wrangling, or A/B testing/experimentation/causal inference.
    - Demonstrate excellent verbal and written communication skills.
    - Have strong attention to detail.

    * * *

    ### **More About the Opportunity**
    - **Commitment:** 20-40 hours per week, with potential to scale up to 80 hours.
    - **Structure:** Fully remote and asynchronous - set your own schedule.
    - **Duration:** Minimum 1 month, with possible extension based on project scope and performance.
    - **Start Date:** Early September, with applications reviewed on a rolling basis.

    * * *

    ### **Compensation & Contract Terms**
    - Classified as an **independent contractor** for Mercor.
    - Hourly payments for services rendered, processed weekly via **Stripe Connect**.
    - Flexible terms based on location and expertise level.

    * * *

    ### **Application Process**
    - Submit your resume or project portfolio to get started.
    - Complete a brief form outlining your technical background and agent usage experience.
    - Applications are reviewed continuously, and selected professionals will be contacted with next steps.

    * * *

    ### **About Mercor**
    - Mercor is a talent marketplace connecting top experts with leading AI research labs and organizations.
    - Our investors include **Benchmark**, **General Catalyst**, **Adam D'Angelo**, **Larry Summers**, and **Jack Dorsey**.
    - Thousands of technical professionals partner with Mercor to contribute to frontier AI projects shaping the next era of intelligent systems.
    $83k-117k yearly est. 35d ago
  • Data Engineer

    Agility Partners

    Data engineer job in Columbus, OH

    We're seeking a skilled Data Engineer based in Columbus, OH, to support a high-impact data initiative. The ideal candidate will have hands-on experience with Python, Databricks, SQL, and version control systems, and be comfortable building and maintaining robust, scalable data solutions.

    Key Responsibilities

    - Design, implement, and optimize data pipelines and workflows within Databricks.
    - Develop and maintain data models and SQL queries for efficient ETL processes.
    - Partner with cross-functional teams to define data requirements and deliver business-ready solutions.
    - Use version control systems to manage code and ensure collaborative development practices.
    - Validate and maintain data quality, accuracy, and integrity through testing and monitoring.

    Required Skills

    - Proficiency in Python for data engineering and automation.
    - Strong, practical experience with Databricks and distributed data processing.
    - Advanced SQL skills for data manipulation and analysis.
    - Experience with Git or similar version control tools.
    - Strong analytical mindset and attention to detail.

    Preferred Qualifications

    - Experience with cloud platforms (AWS, Azure, or GCP).
    - Familiarity with enterprise data lake architectures and best practices.
    - Excellent communication skills and the ability to work independently or in team environments.
    $95k-127k yearly est. 1d ago
  • Data Scientist

    Neudesic, An IBM Company

    Remote data engineer job

    About Neudesic

    Passion for technology drives us, but it's innovation that defines us. From design to development and support to management, Neudesic offers decades of experience, proven frameworks, and a disciplined approach to quickly deliver reliable, quality solutions that help our customers go to market faster. What sets us apart from the rest is an amazing collection of people who live and lead with our core values. We believe that everyone should be Passionate about what they do, Disciplined to the core, Innovative by nature, committed to a Team, and conduct themselves with Integrity. If these attributes mean something to you, we'd like to hear from you.

    Role Summary

    Neudesic is currently seeking a Senior Consultant - Data Scientist for our Data & AI Practice. This role reports to the Regional Director of Data & AI and requires the perfect mix of being a brilliant technologist and having a deep appreciation for how technology drives business value. You will have broad and deep technology knowledge and the ability to architect solutions by mapping client business problems to end-to-end technology solutions. Demonstrated ability to engage senior-level technology decision makers in data management, real-time analytics, predictive analytics, and data visualization is a must. To be successful, you must exhibit the strong leadership qualities necessary for building trust with clients and our technologists, with the ability to deliver ML/DL projects to successful completion. You will partner with solution architects to drive client success by providing practical guidance based on your years of experience in data management and visualization solutions. You will also partner with a diverse sales unit to professionally represent Neudesic's experience and ability to drive business results. In addition, you will assist in creating sales assets that clearly communicate our value proposition to technical decision makers.
Requirements:

- 3+ years of hands-on experience building and deploying machine learning and/or generative AI solutions in a business environment
- Bachelor's degree in a quantitative field (e.g., Computer Science, Engineering, Business Analytics, Statistics, Mathematics); advanced degree preferred
- Proven expertise with Databricks and Microsoft Fabric for data engineering and ML workflows
- Proficient in Python (including ML/AI libraries such as scikit-learn, PyTorch, TensorFlow) and SQL; experience with R is a plus
- Strong understanding of statistical modeling, machine learning, MLOps, and large language models (LLMs)
- Demonstrated ability to work with large, complex datasets and relational/cloud-based databases (e.g., Azure SQL, Delta Lake)
- Experience operationalizing ML models and pipelines in production environments

Preferred Qualifications

- Experience with cloud platforms, especially Microsoft Azure (Azure Machine Learning, Azure Data Lake, etc.)
- Familiarity with modern data engineering tools and orchestration frameworks (e.g., dbt, Apache Airflow, Delta Live Tables)
- Experience with data visualization and BI tools (e.g., Power BI, Tableau)
- Ability to communicate complex analytical and technical concepts to both technical and non-technical audiences
- Strong data storytelling skills and ability to drive insights from exploratory and visual analysis
- Intellectual curiosity, strong problem-solving skills, and a collaborative, team-oriented mindset
- Demonstrated leadership, relationship-building, and mentoring abilities

More about our Predictive Enterprise Service Line:

The digital business uses data as a competitive differentiator. The explosion of big data, machine learning, and cloud computing power creates an opportunity to make a quantum leap forward in business understanding and customer engagement.
The availability of massive amounts of information, massive computing power, and advancements in artificial intelligence allows the digital business to more accurately predict, plan for, and capture opportunity unlike ever before. The Predictive Enterprise service line represents the evolution from using data strictly as a reporting mechanism of what's happened to leveraging the latest in advanced analytics to predict and prescribe future business action. Our services include:

- Data Management Solutions: We build architectures, policies, practices, and procedures that manage the full data lifecycle of an enterprise. We bring internal and exogenous datasets together to formulate new perspectives and drive data-thinking.
- Self-Service Data Solutions: We create classic self-service and modern data-blending solutions that enable end users to enrich pre-authored analytic reports by blending them with additional data sources.
- Real-Time Analytic Solutions: We build real-time analytics solutions on data-in-motion that eliminate the dependency on stale, static data sets, enabling diverse data sets to be queried and analyzed immediately.
- Machine Learning Solutions: We build machine-learning solutions that support the most complex decision support systems.

Accommodations currently remain in effect for Neudesic employees to work remotely, provided that remote work is consistent with the work patterns and requirements of their team's management and client obligations. Subject to business needs, employees may be required to perform work or attend meetings on-site at a client or Neudesic location.

Phishing Scam Notice

Please be aware of phishing scams involving fraudulent career recruiting and fictitious job postings; visit our Phishing Scams page to learn more.
Neudesic is an Equal Opportunity Employer

All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status, or any other basis protected by federal, state, or local law.

Neudesic has been acquired by IBM and will be integrated into the IBM organization. Neudesic will be the hiring entity. By proceeding with this application, you understand that Neudesic will share your personal information with other IBM companies involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here: ***************************************************
    $71k-101k yearly est. 1d ago
  • Senior Data Architect

    Intelliswift-An LTTS Company

    Data engineer job in Marysville, OH

    4 days onsite - Marysville, OH

    Skillset:

    - Bachelor's degree in computer science, data science, engineering, or a related field
    - Minimum of 10 years of relevant experience in the design and implementation of data models (Erwin) for enterprise data warehouse initiatives
    - Experience leading projects involving cloud data lakes, data warehousing, data modeling, and data analysis
    - Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS), real-time data distribution (Kinesis, Kafka, Dataflow), and modern data warehouse tools (Redshift, Snowflake, Databricks)
    - Experience with various database platforms, including DB2, MS SQL Server, PostgreSQL, Couchbase, MongoDB, etc.
    - Understanding of entity-relationship modeling, metadata systems, and data security and quality tools and techniques
    - Ability to design traditional/relational and modern big-data architectures based on business needs
    - Experience with business intelligence tools and technologies such as Informatica, Power BI, and Tableau
    - Exceptional communication and presentation skills
    - Strong analytical and problem-solving skills
    - Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
    - Ability to guide solution design and architecture to meet business needs
    $93k-125k yearly est. 2d ago
  • Software Engineer

    Impower.Ai

    Data engineer job in Columbus, OH

    Software Engineer - Internal Product Team
    Division: Impower Solutions (Agility Partners)

    About Impower

    Impower is the technology consulting division of Agility Partners, specializing in automation & AI, data engineering & analytics, software engineering, and digital transformation. We deliver high-impact solutions with a focus on innovation, efficiency, and client satisfaction.

    Role Overview

    We're building a high-performing internal product team to scale our proprietary tech stack. As a Software Engineer, you'll contribute to the development of internal platforms using modern technologies. You'll collaborate with product and engineering peers to deliver scalable, maintainable solutions that drive Impower's consulting capabilities.

    Key Responsibilities

    Development & Implementation
    - Build scalable APIs using TypeScript and Bun for high-performance backend services.
    - Develop intelligent workflows and AI agents leveraging Temporal, enabling robust orchestration and automation.
    - Move and transform data using Python and DBT, supporting analytics and operational pipelines.
    - Contribute to full-stack development of internal websites using Next.js (frontend), Elysia (API layer), and Azure SQL Server (database).
    - Implement CI/CD pipelines using GitHub Actions, with a focus on automated testing, secure deployments, and environment consistency.
    - Deploy and manage solutions in Azure, including provisioning and maintaining infrastructure components such as App Services, Azure Functions, Storage Accounts, and SQL databases.
    - Monitor and troubleshoot production systems using SigNoz, ensuring observability across services with metrics, traces, and logs to maintain performance and reliability.
    - Write clean, testable code and contribute to unit, integration, and end-to-end test suites.
    - Collaborate in code reviews, sprint planning, and backlog grooming to ensure alignment and quality across the team.

    Innovation & Strategy
    - Stay current with emerging technologies and frameworks, especially in the areas of agentic AI, orchestration, and scalable infrastructure.
    - Propose improvements to internal platforms based on performance metrics, developer experience, and business needs.
    - Contribute to technical discussions around design patterns, tooling, and long-term platform evolution.
    - Help evaluate open-source tools and third-party services that could accelerate development or improve reliability.

    Delivery & Collaboration
    - Participate in agile ceremonies including sprint planning, standups, and retrospectives.
    - Collaborate closely with product managers, designers, and other engineers to translate requirements into working solutions.
    - Communicate progress, blockers, and technical decisions clearly and proactively.
    - Take ownership of assigned features and enhancements from ideation through deployment and support.

    Leadership
    - Demonstrate ownership and accountability in your work, contributing to a culture of reliability and continuous improvement.
    - Share knowledge through documentation, pairing, and informal mentoring of junior team members.
    - Engage in code reviews to uphold quality standards and foster team learning.
    - Actively participate in team discussions and help shape a collaborative, inclusive engineering culture.

    Qualifications
    - 2-4 years of experience in software engineering, ideally in a product-focused or platform engineering environment.
    - Proficiency in TypeScript and Python, with hands-on experience in full-stack development.
    - Experience building APIs and backend services using Bun, Elysia, or similar high-performance frameworks (e.g., Fastify, Express, Flask).
    - Familiarity with Next.js for frontend development and Azure SQL Server for relational data storage.
    - Experience with workflow orchestration tools such as Temporal, Airflow, or Prefect, especially for building intelligent agents or automation pipelines.
    - Proficiency in data transformation using DBT, with a solid understanding of analytics engineering principles.
    - Strong understanding of CI/CD pipelines using GitHub Actions, including automated testing, environment management, and secure deployments.
    - Exposure to observability platforms such as SigNoz, Grafana, Prometheus, or OpenTelemetry, with a focus on metrics, tracing, and log aggregation.
    - Solid grasp of software testing practices and version control (Git).
    - Excellent communication skills, a collaborative mindset, and a willingness to learn and grow within a team.

    Why Join Us?
    - Build impactful internal products that shape the future of Impower's consulting capabilities.
    - Work with cutting-edge technologies in a collaborative, innovation-driven environment.
    - Enjoy autonomy, growth opportunities, and a culture that values excellence and people.
    $57k-75k yearly est. 4d ago
  • Java Software Engineer

    Stellar Consulting Solutions, LLC

    Data engineer job in Columbus, OH

    - 10+ years of software development experience
    - Strong hands-on experience in Java and Spring Boot
    - Design and implement scalable and robust architecture solutions using Java and the Spring Framework
    - Oversee the development and integration of applications using Spring Boot and related technologies
    - Provide technical guidance and mentorship to the development team to ensure best practices are followed
    - Collaborate with cross-functional teams to gather requirements and translate them into technical specifications
    - Ensure code quality and maintainability
    - Experience implementing Java Spring Boot; excellent expertise in Java 8, multithreading, and microservices
    - Ensure data integrity and efficient data handling using JSON and XML
    - Utilize Git for version control and collaborate with team members on code reviews
    - Troubleshoot and resolve technical issues in a timely manner to minimize downtime
    - Stay updated with the latest industry trends and technologies to continuously improve architecture and development processes
    $64k-85k yearly est. 23h ago
  • Java Software Engineer

    People Consultancy Services (PCS)

    Data engineer job in Columbus, OH

    Core Java Developer with AWS

    Local/nearby candidates required. W2 candidates only.

    A) Junior Profile
    - 3 to 5 years' experience
    - Must be strong in Java/Microservices/Spring/AWS/Kafka
    - Should be strong with AWS deployment - the team is using ECS/EKS within AWS
    - Terraform is a huge plus
    - Docker is a huge plus

    B) Senior Profile
    - 6+ years' experience
    - Must be strong in Java/Microservices/Spring/AWS/Kafka
    - Should be strong with AWS deployment - the team is using ECS/EKS within AWS
    - Terraform is a huge plus
    - Docker is a huge plus
    $64k-85k yearly est. 2d ago
  • Software Engineer - Data Platform (Remote)

    Mastercard

    Remote data engineer job

    Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

    Software Engineer II

    Provides support of applications software through programming, analysis, design, development, and delivery of software solutions. Responsible for programming, testing, implementation, documentation, maintenance, and support of systems application software in adherence with Mastercard standards, processes, and best practices.

    - Develop high-quality, secure, scalable software solutions based on technical requirements specifications and design artifacts within expected time and budget.
    - Perform feasibility studies, logic designs, detailed systems flowcharting, analysis of input-output flow, and cost and time analysis.
    - Work with the project team to meet scheduled due dates while identifying emerging issues and recommending solutions for problems; independently perform assigned tasks and perform production incident management. Participate in the on-call pager support rotation.
    - Document software programs per Software Development Best Practices. Follow Mastercard Quality Assurance and Quality Control processes.
    - Assist senior team members in modifying documentation templates per the needs of the project and technology.
    - Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.
    - Support collection and reporting of project and software metrics.
    Education:
    - Bachelor's degree in Information Technology, Computer Science, or Management Information Systems, or equivalent work experience.

    Knowledge / Experience:
    - Thorough knowledge and understanding of software engineering concepts and methodologies is required.
    - 1 to 3 years of experience in the software engineering field.
    - Ability to work as a member of a matrix-based, diverse, and geographically distributed project team.

    Mastercard is a merit-based, inclusive, equal opportunity employer that considers applicants without regard to gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law. In the US or Canada, if you require accommodations or assistance to complete the online application process or during the recruitment process, please contact reasonable_accommodation@mastercard.com and identify the type of accommodation or assistance you are requesting. Do not include any medical or health information in this email.

    All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard's security policies and practices; and report any suspected information security violation or breach.

    In line with Mastercard's total compensation philosophy and assuming that the job will be performed in the US, the successful candidate will be offered a competitive base salary and may be eligible for an annual bonus or commissions depending on the role.
    Mastercard benefits for full-time (and certain part-time) employees generally include: insurance (including medical, prescription drug, dental, vision, disability, and life insurance); flexible spending account and health savings account; 80 hours of Paid Sick and Safe Time; 25 days of vacation time and 5 personal days, pro-rated based on date of hire; U.S. observed holidays; fitness reimbursement or on-site fitness facilities; and eligibility for tuition reimbursement.
    $75k-97k yearly est. 2d ago
  • Workday Software Engineer

    Decca Recruiting

    Remote data engineer job

    Position: Software Engineer, Workday
    Duration: Full-time position
    Type: Remote work model

    A day in this role: As a fully remote position, this role works extensively on Workday integration projects. Responsible for designing, developing, configuring, integrating, and maintaining Workday applications and solutions. Collaborates with cross-functional teams to support business needs. Operates independently with minimal supervision.

    Must haves:
    - 7+ years of Workday integration experience
    - Understanding of Workday data conversion patterns and tools
    - Proficiency in Workday integration tools: EIB, Connectors, Workday Studio
    - Familiarity with the Workday Business Process Framework
    - Experience with Workday modules: HCM, Benefits, Time Tracking, Payroll, and Security
    - Workday certifications
    - Working knowledge of: Workday Extend, Workday Report Writer, calculated fields, Prism Analytics, RaaS (Reports as a Service)
    - Strong understanding of: web technologies, mobile platforms, APIs (WSDL, SOAP, REST), SQL

    Responsibilities:
    - Works with constituent departments to fulfill design, application development, configuration, integration, support, and maintenance requests.
    - Assists in scope definition and estimation of work effort.
    - Contributes to the business requirements gathering process.
    - Works with the architecture team to ensure that design standards are followed.
    - Adheres to defined processes.
    - Develops application code to fulfill project requests.
    - Creates technical documentation as required.
    - Drives incremental improvements to team technical processes and practices.
    - Mentors development team members in the technical complexities of assigned work.
    - Stays up to date with Workday releases, updates, and new features, and applies this knowledge to improve integration/Extend solutions, design, and performance.

    Qualifications: A bachelor's degree in computer science, a related field, or four years of related work experience is required. Three to five years of professional experience is required.
    - Strong understanding of web, mobile, API, and SQL technologies.
    - Broad knowledge of software development practices and procedures.
    - Experience working with Workday modules such as HCM, Benefits, Time Tracking, Payroll, and Security.
    - Good understanding of the Workday Business Process Framework.
    - Good knowledge of Workday integration tools such as EIB, Connectors, and Workday Studio.
    - Working knowledge of Workday Extend.
    - Working knowledge of Workday Report Writer, calculated fields, and Prism.
    - Working knowledge of Web Services, APIs (WSDL, SOAP, REST), and RaaS.
    - Knowledge of Workday data conversion patterns and toolset.
    - Aptitude for continuous learning and improvement.
    - Strong teamwork skills.
    $69k-94k yearly est. 2d ago
  • Software Engineer (Remote)

    It Associates

    Remote data engineer job

    Remote (proximity to Chicago, Nashville, or Manhattan would be a big plus). Regular travel is not required, but the role will need to travel to the corporate office twice a year.

    Our client is looking to add a Software Developer who will be responsible for designing, developing, and maintaining high-quality software solutions that support the Firm's digital platforms. This role ensures the stability, scalability, and performance of all applications and services, while collaborating with cross-functional teams to drive continuous improvement in development practices and operational efficiency.

    Responsibilities
    - Design and implement stable, scalable, and extensible software solutions.
    - Ensure adherence to secure software development lifecycle (SDLC) best practices and standards.
    - Drive the design and development of services and applications to meet defined service level agreements (SLAs).
    - Work closely with end users and stakeholders to gather requirements and iterate on solutions that deliver business value.
    - Proactively identify and resolve any obstacles affecting operational efficiency and service continuity.
    - Provide ongoing support for developed applications and services, ensuring timely issue resolution.
    - Participate in the Firm's change and incident management processes, adhering to established protocols.

    Software Development & Architecture
    - Develop and maintain features for web-enabled applications using C# .NET Core.
    - Write clean, scalable code with a focus on maintainability and performance.
    - Implement robust, efficient SQL-based solutions, preferably using MS SQL.
    - Develop and maintain user interfaces using modern frameworks, preferably Angular or Blazor.
    - Ensure solutions are designed with an emphasis on security, efficiency, and optimization.
    - Contribute to continuous integration and continuous delivery (CI/CD) pipelines, automating processes where possible.
Collaboration & Optimization
- Collaborate closely with business analysts, quality assurance, and other developers to ensure solutions meet both functional and non-functional requirements.
- Foster a culture of positive, open communication across diverse teams, with a focus on collaboration and shared goals.
- Engage in regular reviews and feedback sessions to drive continuous improvement in development processes and practices.
- Provide mentorship and guidance to junior developers where appropriate, supporting their professional growth.

Professional Conduct
- Demonstrates commitment to the firm's core values, including Accountability, Integrity, Excellence, Grit, and Love.
- Ensures all activities align with business objectives and project timelines.
- Communicates effectively, openly exchanging ideas and listening with consideration.
- Maintains a proactive, solution-oriented mindset when addressing challenges.
- Takes ownership of responsibilities and holds others accountable for their contributions.
- Continuously seeks opportunities to optimize processes, improve performance, and drive innovation.

Qualifications
- 1-3+ years of expertise in C# .NET Core development
- Competence in SQL, preferably MS SQL
- Competence in UI work, preferably Angular and/or Blazor
- Strong structured problem-solving skills, with a history of using systematic and fact-based processes to improve mission-critical services
- A focus on optimization and efficiency in processes
- Experience working in a financial services firm would be a big plus
- Demonstrated expertise in fostering a culture of positive collaboration among cross-functional teams with diverse personalities, skill sets, and levels of experience
- Highly developed communication skills
- A sense of urgency and a bias for action

For all non-bonus, non-commission direct hire positions: The anticipated salary range for this position is $95,000 - $120,000.
Actual salary will be based on a variety of factors including relevant experience, knowledge, skills and other factors permitted by law. A range of medical, dental, vision, retirement, paid time off, and/or other benefits are available.
    $95k-120k yearly 4d ago
