Senior Full Stack Developer
Data engineer job in Dublin, OH
This position is located at our Dublin, OH campus with hybrid flexibility.
Who we are
Founded in 1999 and headquartered in Central Ohio, we're a privately owned, independent healthcare navigation organization. We believe that no one should have to navigate the cost and complexity of healthcare alone, and we're on a mission to make healthcare simpler and more effective for our millions of members. Our big-hearted, tech-savvy team fights to ensure that our members get the care they need, when they need it, at the most affordable cost - that's why we call ourselves Healthcare Warriors.
We're committed to building diverse and inclusive teams - more than 2,000 of us and counting - so if you're excited about this position, we encourage you to apply - even if your experience doesn't match every requirement.
About the role
We are seeking a Senior Full Stack Developer to play a pivotal role in advancing our enterprise analytics platform and transforming the healthcare navigation experience. This role serves as both a technical leader and a hands-on contributor, driving architectural decisions, mentoring peers, and shaping how we integrate modern technologies, including LLMs (Large Language Models), into our analytics solutions.
What you'll do (Essential Responsibilities)
Lead the design, architecture, and development of complex full stack features for our analytics platform.
Provide technical direction for integrating LLM-driven capabilities to enhance data interpretation and user interaction.
Establish and enforce engineering best practices around scalability, performance, and security.
Build and refine user interfaces in Vue.js, ensuring intuitive, accessible, and performant experiences.
Design and optimize back-end services in Node.js, Python, or similar, with strong API and integration patterns.
Architect and manage integrations with AWS QuickSight, Snowflake, PostgreSQL, and other enterprise data sources.
Partner with product, design, and data teams to define the technical roadmap and deliver high-value features.
Mentor junior developers, fostering technical growth and knowledge sharing within the team.
Participate in and lead code reviews, promoting a culture of quality and continuous improvement.
Explore, prototype, and implement AI/ML and LLM capabilities that unlock new insights and client value.
Stay ahead of emerging trends in full stack and AI technologies, bringing forward recommendations that shape product strategy.
All other duties as assigned.
What you'll bring (Qualifications)
Experience: 7+ years of professional full stack development experience, with demonstrated progression into senior or lead roles.
Expertise in Vue.js and modern JavaScript/TypeScript frameworks.
Strong back-end development experience with Node.js, Python, or similar languages.
Advanced proficiency with SQL, data modeling, and performance optimization in Snowflake and PostgreSQL.
Deep knowledge of AWS services, including QuickSight, and familiarity with scalable cloud-native architectures.
Experience architecting and deploying secure, high-performance applications in enterprise environments.
Strong understanding of RESTful APIs, authentication, and security best practices.
Excellent communication skills, with the ability to translate complex technical concepts for business stakeholders.
Commitment to data security, privacy, and compliance standards.
Trustworthy and accountable behavior, capable of viewing and maintaining confidential information daily.
Preferred Qualifications
Prior experience in healthcare, healthtech, or other regulated industries.
Hands-on experience enabling LLM-powered features through APIs (e.g., OpenAI, AWS Bedrock); see the illustrative sketch after this list.
Familiarity with AI/ML integration in analytics environments.
Knowledge of CI/CD pipelines, Docker, and modern DevOps practices.
Understanding of regulatory frameworks such as HIPAA.
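For illustration only (not part of this posting): a minimal sketch of what enabling an LLM-powered feature through an API might look like, assuming the official OpenAI Python SDK; the model name, helper function, and metric are placeholders rather than anything specified by the employer.

```python
# Illustrative sketch only (not from the posting). Assumes the official OpenAI
# Python SDK is installed and OPENAI_API_KEY is set; the model name and helper
# function are placeholders.
from openai import OpenAI

client = OpenAI()

def summarize_metric(metric_name: str, value: float) -> str:
    """Ask an LLM to explain an analytics metric in plain language."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You explain analytics metrics to non-technical users."},
            {"role": "user", "content": f"{metric_name} is {value}. Explain what this means."},
        ],
    )
    return response.choices[0].message.content

print(summarize_metric("30-day member engagement rate", 0.87))
```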
--
#LI-AK1 #LI-Hybrid
What's in it for you
Compensation: Competitive base and incentive compensation
Coverage: Health, vision and dental featuring our best-in-class healthcare navigation services, along with life insurance, legal and identity protection, adoption assistance, EAP, Teladoc services and more.
Retirement: 401(k) plan with up to 4% employer match and full vesting on day one.
Balance: Paid Time Off (PTO), 7 paid holidays, parental leave, volunteer days, paid sabbaticals, and more.
Development: Tuition reimbursement up to $5,250 annually, certification/continuing education reimbursement, discounted higher education partnerships, paid trainings and leadership development.
Culture: Recognition as a Best Place to Work for 15+ years, dedication to diversity, philanthropy and sustainability, and people-first values that drive every decision.
Environment: A modern workplace with a casual dress code, open floor plans, full-service dining, free snacks and drinks, complimentary 24/7 fitness center with group classes, outdoor walking paths, game room, notary and dry-cleaning services and more!
What you should know
Internal Associates: Already a Healthcare Warrior? Apply internally through Jobvite.
Process: Application > Phone Screen > Online Assessment(s) > Interview(s) > Offer > Background Check.
Diversity, Equity and Inclusion: Quantum Health welcomes everyone. We value our diverse team and suppliers, we're committed to empowering our ERGs, and we're proud to be an equal opportunity employer.
Tobacco-Free Campus: To further enable the health and wellbeing of our associates and community, Quantum Health maintains a tobacco-free environment. The use of all types of tobacco products is prohibited in all company facilities and on all company grounds.
Compensation Ranges: Compensation details published by job boards are estimates and not verified by Quantum Health. Details surrounding compensation will be disclosed throughout the interview process. Compensation offered is based on the candidate's unique combination of experience and qualifications related to the position.
Sponsorship: Applicants must be legally authorized to work in the United States on a permanent and ongoing future basis without requiring sponsorship.
Agencies: Quantum Health does not accept unsolicited resumes or outreach from third-parties. Absent a signed MSA and request/approval from Talent Acquisition to submit candidates for a specific requisition, we will not approve payment to any third party.
Reasonable Accommodation: Should you require reasonable accommodation(s) to participate in the application/interview/selection process, or in order to complete the essential duties of the position upon acceptance of a job offer, click here to submit a recruitment accommodation request.
Recruiting Scams: Unfortunately, scams targeting job seekers are common. To protect our candidates, we want to remind you that authorized representatives of Quantum Health will only contact you from an email address ending **********************. Quantum Health will never ask for personally identifiable information such as Date of Birth (DOB), Social Security Number (SSN), banking/direct/tax details, etc. via email or any other non-secure system, nor will we instruct you to make any purchases related to your employment. If you believe you've encountered a recruiting scam, report it to the Federal Trade Commission and your state's Attorney General.
Junior Data Engineer
Data engineer job in Columbus, OH
Contract-to-Hire
Columbus, OH (Hybrid)
Our healthcare services client is looking for an entry-level Data Engineer to join their team. You will play a pivotal role in maintaining and improving inventory and logistics management programs. Your day-to-day work will include leveraging machine learning and open-source technologies to drive improvements in data processes.
Job Responsibilities
Automate key processes and enhance data quality
Improve injection processes and enhance machine learning capabilities
Manage substitutions and allocations to streamline product ordering
Work on logistics-related data engineering tasks
Build and maintain ML models for predictive analytics
Interface with various customer systems
Collaborate on integrating AI models into customer service
Qualifications
Bachelor's degree in related field
0-2 years of relevant experience
Proficiency in SQL and Python
Understanding of GCP/BigQuery (or any cloud experience, basic certifications a plus).
Knowledge of data science concepts.
Business acumen and understanding (corporate experience or internship preferred).
Familiarity with Tableau
Strong analytical skills
Aptitude for collaboration and knowledge sharing
Ability to present confidently in front of leaders
Why Should You Apply?
You will be part of custom technical training and professional development through our Elevate Program!
Start your career with a Fortune 15 company!
Access to cutting-edge technologies
Opportunity for career growth
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Lead Data Scientist
Data engineer job in Columbus, OH
Candidates MUST go on-site at one of the following locations:
Columbus, OH
Cincinnati, OH
Cleveland, OH
Indianapolis, IN
Hagerstown, MD
Chicago, IL
Detroit, MI
Minnetonka, MN
Houston, TX
Charlotte, NC
Akron, OH
Experience:
· Master's degree and 5+ years of related work experience using statistics and machine learning to solve complex business problems; experience conducting statistical analysis with advanced statistical software, scripting languages, and packages; experience with big data analysis tools and techniques; and experience building and deploying predictive models, web scraping, and scalable data pipelines
· Expert understanding of statistical methods and skills such as Bayesian network inference, linear and non-linear regression, and hierarchical/mixed (multi-level) modeling
Python, R, or SAS, along with SQL and some lending experience (e.g., HELOC, mortgage), is most important
Excellent communication skills
If a candidate has credit card experience (e.g., Discover or Bread Financial), they are an excellent fit!
Education:
Master's degree or PhD in computer science, statistics, economics or related fields
Responsibilities:
Prioritizes analytical projects based on business value and technological readiness
Performs large-scale experimentation and builds data-driven models to answer business questions
Conducts research on cutting-edge techniques and tools in machine learning/deep learning/artificial intelligence
Evangelizes best practices to analytics and products teams
Acts as the go-to resource for machine learning across a range of business needs
Owns the entire model development process, from identifying the business requirements, data sourcing, model fitting, presenting results, and production scoring
Provides leadership, coaching, and mentoring to team members and develops the team to work with all areas of the organization
Works with stakeholders to ensure that business needs are clearly understood and that services meet those needs
Anticipates and analyzes trends in technology while assessing the emerging technology's impact(s)
Coaches individuals through change and serves as a role model
Skills:
Up-to-date knowledge of machine learning and data analytics tools and techniques
Strong knowledge in predictive modeling methodology
Experienced at leveraging both structured and unstructured data sources
Willingness and ability to learn new technologies on the job
Demonstrated ability to communicate complex results to technical and non-technical audiences
Strategic, intellectually curious thinker with focus on outcomes
Professional image with the ability to form relationships across functions
Ability to train more junior analysts regarding day-to-day activities, as necessary
Proven ability to lead cross-functional teams
Strong experience with Cloud Machine Learning technologies (e.g., AWS Sagemaker)
Strong experience with machine learning environments (e.g., TensorFlow, scikit-learn, caret)
Demonstrated Expertise with at least one Data Science environment (R/RStudio, Python, SAS) and at least one database architecture (SQL, NoSQL)
Financial Services background preferred
Senior Data Engineer (W2 only)
Data engineer job in Columbus, OH
Bachelor's Degree in Computer Science or related technical field AND 5+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java.
Proficiency with Azure data services, such as Azure Data Lake, Azure Data Factory and Databricks.
Expertise using cloud security (e.g., Active Directory, network security groups, and encryption services).
Proficient in Python for developing and maintaining data solutions.
Experience with optimizing or managing technology costs.
Ability to build and maintain a data architecture supporting both real-time and batch processing.
Ability to implement industry standard programming techniques by mastering advanced fundamental concepts, practices, and procedures, and having the ability to analyze and solve problems in existing systems.
Expertise with unit testing, integration testing and performance/stress testing.
Database management skills and understanding of legacy and contemporary data modeling and system architecture.
Demonstrated leadership skills, team spirit, and the ability to work cooperatively and creatively across an organization
Experience on teams leveraging Lean or Agile frameworks.
Data Scientist with Hands On development experience with R, SQL & Python
Data engineer job in Columbus, OH
*Per the client, No C2C's!*
Central Point Partners is currently interviewing candidates in the Columbus, OH area for a large client.
Only GCs and USCs will be considered.
This position is hybrid (4 days onsite). Only candidates who are local to Columbus, OH will be considered.
Data Scientist with Hands On development experience with R, SQL & Python
Summary:
Our client is seeking a passionate, data-savvy Senior Data Scientist to join the Enterprise Analytics team to fuel our mission of growth through data-driven insights and opportunity discovery. This dynamic role uses a consultative approach with the business segments to dive into our customer, product, channel, and digital data to uncover opportunities for consumer experience optimization and customer value delivery. You will also enable stakeholders with actionable, intuitive performance insights that provide the business with direction for growth. The ideal candidate will have a robust mix of technical and communication skills, with a passion for optimization, data storytelling, and data visualization. You will collaborate with a centralized team of data scientists as well as teams across the organization including Product, Marketing, Data, Finance, and senior leadership. This is an exciting opportunity to be a key influencer to the company's strategic decisions and to learn and grow with our Analytics team.
Notes from the manager
The skills that will be critical will be Python or R and a firm understanding of SQL along with foundationally understanding what data is needed to perform studies now and in the future. For a high-level summary that should help describe what this person will be asked to do alongside their peers:
I would say this person will balance analysis with development, knowing when to jump in and knowing when to step back to lend their expertise.
Feature & Functional Design
Data scientists are embedded in the teams designing the feature. Their main job here is to define the data tracking needed to evaluate the business case: things like event logging, Adobe tagging, third-party data ingestion, and any other tracking requirements. They also consult on and outline if/when the business should bring data into the bank, and they help connect the business with CDAO and IT warehousing and data engineering partners should new data need to be brought forward.
Feature Engineering & Development
The same data scientists stay involved as the feature moves into execution. They support all necessary functions (Amigo, QA, etc.) to ensure data tracking is in place when the feature goes live. They also begin preparing to support launch evaluation and measurement against experimentation design or business case success criteria.
Feature Rollout & Performance Evaluation
Owns tracking the rollout, running A/B tests, and conducting impact analysis for all features they supported during the Feature & Functional Design and Feature Engineering & Development stages. They provide an unbiased view of how the feature performs against the original business case and make objective recommendations that give the business direction. They roll off once the feature has matured through business case/experiment design and evaluation.
In addition to supporting feature rollouts…
Data scientists on the team are also encouraged to pursue self-driven initiatives during periods when they are not actively supporting other projects. These initiatives may include designing experiments, conducting exploratory analyses, developing predictive models, or identifying new opportunities for impact.
For more information about this opportunity, please contact Bill Hart at ************ AND email your resume to **********************************!
Data Engineer (Databricks)
Data engineer job in Columbus, OH
ComResource is searching for a highly skilled Data Engineer with a background in SQL and Databricks who can handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition.
Requirements:
Design, construct, install, test and maintain data management systems.
Build high-performance algorithms, predictive models, and prototypes.
Ensure that all systems meet the business/company requirements as well as industry practices.
Integrate up-and-coming data management and software engineering technologies into existing data structures.
Develop set processes for data mining, data modeling, and data production.
Create custom software components and analytics applications.
Research new uses for existing data.
Employ an array of technological languages and tools to connect systems together.
Recommend different ways to constantly improve data reliability and quality.
Qualifications:
5+ years of data quality engineering experience
Experience with Cloud-based systems, preferably Azure
Databricks and SQL Server testing
Experience with ML tools and LLMs
Test automation frameworks
Python and SQL for data quality checks
Data profiling and anomaly detection
Documentation and quality metrics
Healthcare data validation experience preferred
Test automation and quality process development
Plus:
Azure Databricks
Azure Cognitive Services integration
Databricks Foundation Model integration
Claude API implementation
Python and NLP frameworks (spaCy, Hugging Face, NLTK)
Data Engineer
Data engineer job in Dublin, OH
The Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently.
The ideal candidate is self-directed, thrives in a fast-paced project environment, and is comfortable making technical decisions and architectural recommendations. The ideal candidate has prior experience with modern data platforms, most notably Databricks and the “lakehouse” architecture. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals.
Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.
Responsibilities:
Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
Consume and analyze data from the data pipeline to infer, predict, and recommend actionable insights that inform operational and strategic decision-making and produce better results.
Empower departments and internal consumers with metrics and business intelligence to operate and direct our business, better serving our end customers.
Determine technical and behavioral requirements, identify strategies as solutions, and select solutions based on resource constraints.
Work with the business, process owners, and IT team members to design data and advanced analytics solutions.
Perform data modeling and prepare data in databases for analysis and reporting through various analytics tools.
Play a technical specialist role in championing data as a corporate asset.
Provide technical expertise in collaborating with project and other IT teams, internal and external to the company.
Contribute to and maintain system data standards.
Research and recommend innovative and, where possible, automated approaches to system data administration tasks. Identify approaches that leverage our resources and provide economies of scale.
Engineer systems that balance and meet performance, scalability, recoverability (including backup design), maintainability, security, and high-availability requirements and objectives. (A minimal pipeline sketch follows this list.)
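For illustration only (not part of this posting): a minimal PySpark sketch of the kind of ETL/ELT pipeline described above, assuming a Databricks-style environment where the Delta format is available (plain PySpark rather than Delta Live Tables, for brevity); the bucket path, table name, and column names are invented.

```python
# Illustrative sketch only; paths, table, and column names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-elt").getOrCreate()

# Extract: read raw Parquet files landed in S3 object storage.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: de-duplicate, type, and validate as stand-ins for real business rules.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned Delta table for downstream analytics and reporting.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders_clean"))
```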
Skills:
Databricks and related technologies - SQL, Python, PySpark, Delta Live Tables, data pipelines, AWS S3 object storage, Parquet/columnar file formats, AWS Glue.
Systems Analysis - The application of systems analysis techniques and procedures, including consulting with users, to determine hardware, software, platform, or system functional specifications.
Time Management - Managing one's own time and the time of others.
Active Listening - Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
Critical Thinking - Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems.
Active Learning - Understanding the implications of new information for both current and future problem-solving and decision-making.
Writing - Communicating effectively in writing as appropriate for the needs of the audience.
Speaking - Talking to others to convey information effectively.
Instructing - Teaching others how to do something.
Service Orientation - Actively looking for ways to help people.
Complex Problem Solving - Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
Troubleshooting - Determining causes of operating errors and deciding what to do about it.
Judgment and Decision Making - Considering the relative costs and benefits of potential actions to choose the most appropriate one.
Experience and Education:
High School Diploma (or GED or High School Equivalence Certificate).
Associate degree or equivalent training and certification.
5+ years of experience in data engineering including SQL, data warehousing, cloud-based data platforms.
Databricks experience.
2+ years Project Lead or Supervisory experience preferred.
Must be legally authorized to work in the United States. We are unable to sponsor or take over sponsorship at this time.
Data Engineer
Data engineer job in Columbus, OH
We're seeking a skilled Data Engineer based in Columbus, OH, to support a high-impact data initiative. The ideal candidate will have hands-on experience with Python, Databricks, SQL, and version control systems, and be comfortable building and maintaining robust, scalable data solutions.
Key Responsibilities
Design, implement, and optimize data pipelines and workflows within Databricks.
Develop and maintain data models and SQL queries for efficient ETL processes.
Partner with cross-functional teams to define data requirements and deliver business-ready solutions.
Use version control systems to manage code and ensure collaborative development practices.
Validate and maintain data quality, accuracy, and integrity through testing and monitoring.
Required Skills
Proficiency in Python for data engineering and automation.
Strong, practical experience with Databricks and distributed data processing.
Advanced SQL skills for data manipulation and analysis.
Experience with Git or similar version control tools.
Strong analytical mindset and attention to detail.
Preferred Qualifications
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with enterprise data lake architectures and best practices.
Excellent communication skills and the ability to work independently or in team environments.
Senior Data Architect
Data engineer job in Marysville, OH
4 days onsite - Marysville, OH
Skillset:
Bachelor's degree in computer science, data science, engineering, or related field
Minimum of 10 years of relevant experience in the design and implementation of data models (Erwin) for enterprise data warehouse initiatives
Experience leading projects involving cloud data lakes, data warehousing, data modeling, and data analysis
Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS), real-time data distribution (Kinesis, Kafka, Dataflow), and modern data warehouse tools (Redshift, Snowflake, Databricks)
Experience with various database platforms, including DB2, MS SQL Server, PostgreSQL, Couchbase, MongoDB, etc.
Understanding of entity-relationship modeling, metadata systems, and data security, quality tools and techniques
Ability to design traditional/relational and modern big-data architecture based on business needs
Experience with business intelligence tools and technologies such as Informatica, Power BI, and Tableau
Exceptional communication and presentation skills
Strong analytical and problem-solving skills
Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
Ability to guide solution design and architecture to meet business needs.
Software Engineer
Data engineer job in Columbus, OH
hackajob has partnered with a global technology and management consultancy, specializing in driving transformation across the financial services and energy industries, and we're looking for Java & Python Developers!
Role: Software Engineer (Java & Python)
Mission: This role focuses on a large technology implementation with a major transition of a broker/dealer platform. These resources will support ETL development, API development, and conversion planning.
Location: On-site role in Columbus, OH.
Rates:
W2 - $32 per hour
1099 - $42 per hour
Work authorization: This role requires you to be authorized to work in the United States without sponsorship.
Qualifications (4+ years of experience):
Strong experience with Java, Spring Boot, and microservices architecture.
Proficiency in Python for ETL and automation.
Hands-on experience with API development.
Knowledge of data integration, ETL tools, and conversion workflows.
hackajob is a recruitment platform that matches you with relevant roles based on your preferences. To be matched with the roles, you need to create an account with us.
This role requires you to be based in the US.
Senior Software Engineer
Data engineer job in Columbus, OH
Job Title: Spark 3 Developer
Who We Are:
Vernovis is a Total Talent Solutions company specializing in Technology, Cybersecurity, Finance & Accounting functions. At Vernovis, we help professionals achieve their career goals by matching them with innovative projects and dynamic contract opportunities across Ohio and the Midwest.
Client Overview:
Vernovis is partnering with a leading organization in scientific data management and innovation to modernize its big data platform. This initiative involves transitioning legacy systems, such as Cascading, Hadoop, and MapReduce, to Spark 3, optimizing for scalability and efficiency. As part of this well-established organization, your work will contribute to transforming how big data environments are managed and processed.
What You'll Do:
Legacy Workflow Migration: Lead the conversion of existing Cascading, Hadoop, and MapReduce workflows to Spark 3, ensuring seamless transitions.
Performance Optimization: Utilize Spark 3 features like Adaptive Query Execution (AQE) and Dynamic Partition Pruning to optimize data pipelines (see the sketch after this list).
Collaboration: Work closely with infrastructure teams and stakeholders to ensure alignment with modernization initiatives.
Big Data Ecosystem Integration: Develop solutions that integrate with platforms like Hadoop, Hive, Kafka, and cloud environments (AWS, Azure).
Support Modernization Goals: Contribute to key organizational initiatives focused on next-generation data optimization and modernization.
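For illustration only (not part of this posting): a minimal PySpark sketch of how the Spark 3 features named above are typically switched on; AQE is enabled by default in recent Spark 3 releases, and the bucket paths and column names here are invented.

```python
# Illustrative sketch only; bucket paths and column names are invented.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("spark3-migration-example")
    # Adaptive Query Execution re-optimizes query plans at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
    # Dynamic Partition Pruning skips partitions that a join cannot touch.
    .config("spark.sql.optimizer.dynamicPartitionPruning.enabled", "true")
    .getOrCreate()
)

# A join that can benefit from dynamic partition pruning: a large, date-partitioned
# fact table filtered through a small dimension table.
facts = spark.read.parquet("s3://example-bucket/facts/")  # assumed partitioned by event_date
dims = spark.read.parquet("s3://example-bucket/dims/").filter("region = 'US'")
result = facts.join(dims, "event_date")
result.write.mode("overwrite").parquet("s3://example-bucket/output/")
```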
What Experience You'll Have:
Spark 3 Expertise: 3+ years of experience with Apache Spark, including Spark 3.x development and optimization.
Migration Experience: Proven experience transitioning from Cascading, Hadoop, or MapReduce to Spark 3.
Programming Skills: Proficiency in Scala, Python, or Java.
Big Data Ecosystem: Strong knowledge of Hadoop, Hive, and Kafka.
Performance Tuning: Advanced skills in profiling, troubleshooting, and optimizing Spark jobs.
Cloud Platforms: Familiarity with AWS (EMR, Glue, S3) or Azure (Databricks, Data Lake).
The Vernovis Difference:
Vernovis offers Health, Dental, Vision, Voluntary Short- & Long-Term Disability, Voluntary Life Insurance, and 401K.
Vernovis does not accept inquiries from Corp to Corp recruiting companies. Applicants must be currently authorized to work in the United States on a full-time basis and not violate any immigration or discrimination laws.
Vernovis provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Senior Robotics Software Engineer
Data engineer job in Dublin, OH
At Apricity Robotics we strive to harness the power of AI and robotics to bring the warmth of hope and healing, revolutionizing healthcare and transforming challenging situations into opportunities for growth and progress. Our robot is improving sonographers' quality of life and improving the quality of ultrasound scans. Our team is differentiated by its expertise in creativity, engineering, and delivering robots with advanced intelligence, dexterity, and care - specifically designed to work alongside people, in existing environments. Every day, we embrace challenging problems, developing new solutions and practical implementations that make robots intuitive and exceptional co-workers. We approach our work with passion, making every day an adventure.
About The Role
We are looking for a seasoned and visionary engineer to join our team as Senior Robotics Software Engineer. This role will lead the development of the core robot software architectures that enable robust, safe, reliable, and intelligent behavior in our robotic platforms. You will define and implement the core infrastructure of our robot software platform, enabling advanced capabilities in kinematics, vision, sensor fusion, and controls. Your solutions will be essential to delivering responsive, capable, and reliable systems that operate around patients and become an extension of sonographers.
What You'll Do
Lead the architecture, development, and performance of the full robotics software stack, including path planning, state estimation, sensor fusion, whole-body control, manipulation, and safety.
Write clear, well-documented code that is easily understood and maintained by other team members.
Ensure seamless integration of perception, planning, and control components to enable robust robot operation in complex, semi-structured environments.
Drive the development of software frameworks that support extensibility, modularity, and scalability across robot platforms and hardware variants.
Collaborate cross-functionally with hardware, applications, product, and systems teams to define and deliver key robot capabilities, performance benchmarks, and feature milestones.
Champion quality, reliability, and real-time performance throughout the robotics stack, with particular emphasis on safety-critical applications.
Set technical strategy, engineering standards, and development processes that enable rapid iteration while ensuring long-term maintainability.
Foster a high-performance, collaborative culture.
Stay ahead of emerging trends in robotics autonomy, controls, and AI, and identify strategic opportunities to adopt and integrate new technologies.
Mentor talented robotics software engineers, fostering their technical and professional development.
About You
Degree in Robotics, Computer Science, Electrical Engineering, Mechanical Engineering, or a related technical discipline.
Proficiency in C++ and Python, with extensive experience designing, developing, and deploying complex robotic systems using ROS2.
5+ years of experience delivering production robotics systems, ideally with real-world deployment at scale or equivalent advanced degree experience.
Deep expertise in one or more of the following areas: whole-body control, model-predictive control, sensor fusion, path planning, manipulation, or collision models.
Proven ability to work with multidisciplinary engineering teams and deliver highly integrated robotic systems.
Excellent communication, collaboration, and executive presentation skills.
Passion for building impactful technology that improves the lives of real users.
Demonstrates high energy, availability, intrinsic motivation, and focus to drive intensely.
Authorization to work indefinitely in the USA.
Bonus Points
Experience delivering robotics systems in medical, manufacturing, or other semi-structured domains.
Familiarity with Medical compliance and/or medical device product development
Prior leadership in building and maintaining safety-critical or real-time software systems.
Familiarity with functional safety practices and certification.
Benefits
Equity: Company stock options.
Insurance Coverage: medical, dental, and vision insurance available
Time Off:
Flexible, unlimited PTO
10 company holidays, including a winter shutdown
Relocation Assistance: Relocation assistance is available to move you near our Dublin, Ohio office.
Unfortunately at this time we're unable to offer sponsorship for this role.
Apricity Robotics is committed to a work environment in which all individuals are treated with respect and dignity. Each individual has the right to work in a professional atmosphere that promotes equal employment opportunities and prohibits unlawful discriminatory practices, including harassment. Therefore, it is the policy of Apricity Robotics to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, age, disability, marital status, citizenship, national origin, genetic information, or any other characteristic protected by law. Apricity Robotics prohibits any such discrimination or harassment.
Senior Servicenow Developer
Data engineer job in Newark, OH
*This is a direct-hire, full-time role; only US Citizens and Green Card holders are accepted.*
Update: Seeking candidates willing to travel to Newark when needed, which could be once a week or every other week.
Note: There is a strong possibility we'll be moving this team to our Columbus office in 2026
This Senior ServiceNow Developer position is responsible for analyzing, designing, developing, implementing, and maintaining ServiceNow applications tailored to specifications and organizational needs. Designs, develops, deploys, and supports custom applications, integrations, and workflows within the ServiceNow platform. Collaborates with architects, developers, and cross-functional teams to deliver and support business solutions.
Responsibilities:
Create and refine prototypes for user testing and feedback analysis.
Review and maintain technical documentation, including architecture diagrams and user guides.
Conduct quality assurance testing to identify and resolve defects or issues.
Troubleshoot and resolve production issues and defects.
Ensure compliance with company policies, technical and security standards, and recommend ServiceNow platform governance.
Mentor other developers, assist in code reviews, and oversee deployments.
Contribute to the evolution of standards and best practices.
Ensure uptime and stability of the ServiceNow platform.
Maintain awareness of and adherence to client's compliance requirements and risk management concepts, expectations, policies and procedures and apply them to daily tasks.
Deliver a consistent, high level of service within our Serving More standards.
Other duties as assigned.
Requirements:
High School diploma or equivalent required
Bachelor's in computer science, software engineering or related field experience preferred
5+ years development experience with ServiceNow
4+ years with ServiceNow modules such as ITSM, ITOM, HRSD, or CSM
Familiarity with JavaScript, HTML, CSS, and other relevant technologies
ServiceNow Application Developer (CAD) and/or ServiceNow System Administrator (CSA) certifications preferred
Git experience is a plus, as is familiarity with Visual Studio and API integration
Principal Data Scientist : Product to Market (P2M) Optimization
Data engineer job in Groveport, OH
About Gap Inc. Our brands bridge the gaps we see in the world. Old Navy democratizes style to ensure everyone has access to quality fashion at every price point. Athleta unleashes the potential of every woman, regardless of body size, age or ethnicity. Banana Republic believes in sustainable luxury for all. And Gap inspires the world to bring individuality to modern, responsibly made essentials.
This simple idea-that we all deserve to belong, and on our own terms-is core to who we are as a company and how we make decisions. Our team is made up of thousands of people across the globe who take risks, think big, and do good for our customers, communities, and the planet. Ready to learn fast, create with audacity and lead boldly? Join our team.
About the Role
Gap Inc. is seeking a Principal Data Scientist with deep expertise in operations research and machine learning to lead the design and deployment of advanced analytics solutions across the Product-to-Market (P2M) space. This role focuses on driving enterprise-scale impact through optimization and data science initiatives spanning pricing, inventory, and assortment optimization.
The Principal Data Scientist serves as a senior technical and strategic thought partner, defining solution architectures, influencing product and business decisions, and ensuring that analytical solutions are both technically rigorous and operationally viable. The ideal candidate can lead end-to-end solutioning independently, manage ambiguity and complex stakeholder dynamics, and communicate technical and business risk effectively across teams and leadership levels.
What You'll Do
* Lead the framing, design, and delivery of advanced optimization and machine learning solutions for high-impact retail supply chain challenges.
* Partner with product, engineering, and business leaders to define analytics roadmaps, influence strategic priorities, and align technical investments with business goals.
* Provide technical leadership to other data scientists through mentorship, design reviews, and shared best practices in solution design and production deployment.
* Evaluate and communicate solution risks proactively, grounding recommendations in realistic assessments of data, system readiness, and operational feasibility.
* Evaluate, quantify, and communicate the business impact of deployed solutions using statistical and causal inference methods, ensuring benefit realization is measured rigorously and credibly.
* Serve as a trusted advisor by effectively managing stakeholder expectations, influencing decision-making, and translating analytical outcomes into actionable business insights.
* Drive cross-functional collaboration by working closely with engineering, product management, and business partners to ensure model deployment and adoption success.
* Quantify business benefits from deployed solutions using rigorous statistical and causal inference methods, ensuring that model outcomes translate into measurable value.
* Design and implement robust, scalable solutions using Python, SQL, and PySpark on enterprise data platforms such as Databricks and GCP.
* Contribute to the development of enterprise standards for reproducible research, model governance, and analytics quality.
Who You Are
* Master's or Ph.D. in Operations Research, Operations Management, Industrial Engineering, Applied Mathematics, or a closely related quantitative discipline.
* 10+ years of experience developing, deploying, and scaling optimization and data science solutions in retail, supply chain, or similar complex domains.
* Proven track record of delivering production-grade analytical solutions that have influenced business strategy and delivered measurable outcomes.
* Strong expertise in operations research methods, including linear, nonlinear, and mixed-integer programming, stochastic modeling, and simulation.
* Deep technical proficiency in Python, SQL, and PySpark, with experience in optimization and ML libraries such as Pyomo, Gurobi, OR-Tools, scikit-learn, and MLlib (a toy example follows this list).
* Hands-on experience with enterprise platforms such as Databricks and cloud environments.
* Demonstrated ability to assess, communicate, and mitigate risk across analytical, technical, and business dimensions.
* Excellent communication and storytelling skills, with a proven ability to convey complex analytical concepts to technical and non-technical audiences.
* Strong collaboration and influence skills, with experience leading cross-functional teams in matrixed organizations.
* Experience managing code quality, CI/CD pipelines, and GitHub-based workflows.
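For illustration only (not part of this posting): a toy allocation model of the kind the optimization libraries above support, written against OR-Tools' linear solver wrapper; the stores, numbers, and objective are invented and stand in for a real pricing, inventory, or assortment problem.

```python
# Illustrative toy problem only; all data is invented.
from ortools.linear_solver import pywraplp

stores = ["A", "B", "C"]
expected_revenue_per_unit = {"A": 12.0, "B": 9.5, "C": 11.0}
max_units_per_store = {"A": 40, "B": 60, "C": 50}
total_supply = 100

solver = pywraplp.Solver.CreateSolver("CBC")  # MIP solver bundled with OR-Tools

# Decision variables: integer units allocated to each store.
alloc = {s: solver.IntVar(0, max_units_per_store[s], f"alloc_{s}") for s in stores}

# Total allocation cannot exceed available supply.
solver.Add(sum(alloc[s] for s in stores) <= total_supply)

# Objective: maximize expected revenue across stores.
solver.Maximize(sum(expected_revenue_per_unit[s] * alloc[s] for s in stores))

if solver.Solve() == pywraplp.Solver.OPTIMAL:
    for s in stores:
        print(s, alloc[s].solution_value())
```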
Preferred Qualifications
* Experience shaping and executing multi-year analytics strategies in retail or supply chain domains.
* Proven ability to balance long-term innovation with short-term deliverables.
* Background in agile product development and stakeholder alignment for enterprise-scale initiatives.
Benefits at Gap Inc.
* Merchandise discount for our brands: 50% off regular-priced merchandise at Old Navy, Gap, Banana Republic and Athleta, and 30% off at Outlet for all employees.
* One of the most competitive Paid Time Off plans in the industry.*
* Employees can take up to five "on the clock" hours each month to volunteer at a charity of their choice.*
* Extensive 401(k) plan with company matching for contributions up to four percent of an employee's base pay.*
* Employee stock purchase plan.*
* Medical, dental, vision and life insurance.*
* See more of the benefits we offer.
* For eligible employees
Gap Inc. is an equal-opportunity employer and is committed to providing a workplace free from harassment and discrimination. We are committed to recruiting, hiring, training and promoting qualified people of all backgrounds, and make all employment decisions without regard to any protected status. We have received numerous awards for our long-held commitment to equality and will continue to foster a diverse and inclusive environment of belonging. In 2022, we were recognized by Forbes as one of the World's Best Employers and one of the Best Employers for Diversity.
ETL Architect
Data engineer job in Columbus, OH
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Title : ETL Architect
Location : Columbus, OH
Type : Fulltime Permanent
Work Status : US Citizen / GC / EAD (GC)
Required Skills:
• Responsible for Architecture, Design and Implementation of Data Integration/ETL, Data Quality, Metadata Management and Data Migration solutions using Informatica tools
• Execute engagements as Data Integration-ETL Architect and define Solution Strategy, Architecture, Design and Implementation approach
• Expertise in implementing Data Integration-ETL solutions which include components such as ETL, Data Migration, Replication, Consolidation, Data Quality, Metadata Management etc. using Informatica products (e.g. Power Center, Power Exchange, IDQ, Metadata Manager)
• Responsible for Detailed ETL design, Data Mapping, Transformation Rules, Interfaces, Database schema, Scheduling, Performance Tuning, etc
• Lead a team of designers/developers and guide them throughout the implementation life cycle and perform Code review
• Engage client Architects, SMEs and other stakeholders throughout Architecture, Design and implementation lifecycle and recommend effective solutions
• Experience in multiple databases such as Oracle, DB2, SQL Server, Mainframe, etc.
• Experience in industry models such as IIW, IAA, ACORD, HL7, etc. and insurance products (e.g., Guidewire) will be a plus
Additional Information
All your information will be kept confidential according to EEO guidelines.
Data Scientist (Mid - Sr) - TRANSMISSION Systems & Asset Health
Data engineer job in New Albany, OH
Job Posting End Date: 12-23-2025. Please note the job posting will close on the day before the posting end date. Responsible for conducting intermediate analysis on various data types to uncover hidden patterns and unknown correlations to support business and management decisions. This includes data mining, data auditing, aggregation, validation, and reconciliation. Builds reports for management and conducts basic hypothesis tests. Utilizes analytics and statistical software such as SQL, R, Python, Excel, Hadoop, Tableau, SAS, SPSS, and others to perform analysis and interpret data. Creates visual dashboards.
Job Description
More Specific to This Opportunity:
All operating companies (OPCOs), with a focus on:
* EHV transformer, CCVT, and CB asset failure prevention and analysis
* Focus on Asset Performance Management application (APM) data accuracy and mapping for new AEP specialized health scores
Operational Excellence: AEP-Wide:
* Proactive data analysis work focused on CCVT and CB asset failures using NEAC and other methods, and advanced alarm analysis throughout the entire AEP footprint, leveraging AI solutions in the future.
Financial Strength: AEP-Wide:
* Asset Health Monitoring that supports Condition-Based Maintenance and Condition Based Renewal activities and creating solutions around predictive long term Asset Health Scores.
Customer Service / Operational Excellence: AEP-Wide:
* The data scientist will lead the proactive grid monitoring through Power Quality (PQ) data analytics of Large Load data centers, cryptocurrency facilities, AI training facilities, and IBRs for LFOs induced into the system, along with providing subsequent data analysis and monitoring for impacted areas and customers.
* Supporting the development of Distribution focused Wavewin relay oscillography data and Distribution NEAC.
* Support the expansion of the APM application to the OPCOs for use in monitoring the long-term health and performance of Distribution assets.
* Support the expansion of Asset Health Monitoring to Renewable Generation.
* Support background system performance data analytics, highlighting harmonics and frequencies before and after large load connections to the grid; this supports large load customers and AEP in obtaining the insights and information necessary to engineer locations accurately.
WHAT YOU'LL DO
Develop moderately complex analytical tools to solve high-value business problems across AEP. Proficiently utilize analytical tools and data management skills to provide actionable business intelligence.
Continue building and acquiring business knowledge across the organization and utilize knowledge to perform meaningful analyses. Identify and acquire data pertinent to solving business problems.
Develop components of predictive and prescriptive systems to optimize operations and support strategic initiatives.
Apply data visualization tools to large data sets to enhance data understanding. Perform research to keep abreast of industry practices.
Utilize inquiry skills to obtain a clear understanding of business processes and needs.
Work with stakeholders to determine value that additional information derived from proper analysis could provide.
Collaborate with technology partners to develop data acquisition and data management practices.
Develop concise, meaningful written reports and provide oral presentation. Develop reporting dashboards that meet business needs for status, trends, and variances.
Collaborate with technology and business units to operationalize meaningful reports and dashboards of industry practices.
Consult with business unit partners to identify, prioritize, and develop actionable insights, providing analyses across a variety of core business areas.
WHAT WE'RE LOOKING FOR
Based on education, experience, interview and internal equity, this opportunity will be filled commensurately at either of these titles. For this posting, minimum requirements are stated at the lower grade. Increased expectations are at the higher grade.
Data Scientist Sr. (grade 8): base salary = $99K - $128K
Data Scientist (mid-level) (grade 7): base salary = $88K - $109K
Education: Bachelor's degree in mathematics, operations research, statistics, economics, data science, computer science or related technical field is required.
Experience: two to five (2-5) years of relevant work experience is required. An equivalent combination of education and related experience may be considered.
OTHER REQUIREMENTS:
Experience with analytics packages, such as SAS, R, SPSS
WHAT YOU'LL GET
Base Salary: $88K - $128K
In addition to base salary, AEP offers competitive Total Rewards including: discretionary short-/long-term incentives, 401(k), pension, health insurance, vacation, educational assistance, etc.
WHO WE ARE
At AEP, we're more than just an energy company - we're a team of dedicated professionals committed to delivering safe, reliable, and innovative energy solutions. Guided by our mission to put the customer first, we strive to exceed expectations by listening, responding, and continuously improving the way we serve our communities. If you're passionate about making a meaningful impact and being part of a forward-thinking organization, this is the company for you!
AMERICAN ELECTRIC POWER (on-site)
$88K - $128K / Year
#AEPCareers #LI-ONSITE
Compensation Data
Compensation Grade:
SP20-007
Compensation Range:
$85,081.00 - $124,940.00
The Physical Demand Level for this job is: S - Sedentary Work: Exerting up to 10 pounds of force occasionally (Occasionally: activity or condition exists up to 1/3 of the time) and/or a negligible amount of force frequently. (Frequently: activity or condition exists from 1/3 to 2/3 of the time) to lift, carry, push, pull or otherwise move objects, including the human body. Sedentary work involves sitting most of the time but may involve walking or standing for brief periods of time. Jobs are sedentary if walking and standing are required only occasionally, and all other sedentary criteria are met.
It is hereby reaffirmed that it is the policy of American Electric Power (AEP) to provide Equal Employment Opportunity in all respects of the employer-employee relationship including recruiting, hiring, upgrading and promotion, conditions and privileges of employment, company sponsored training programs, educational assistance, social and recreational programs, compensation, benefits, transfers, discipline, layoffs and termination of employment to all employees and applicants without discrimination because of race, color, religion, sex (including pregnancy, gender identity, and sexual orientation), national origin, age, veteran or military status, disability, genetic information, or any other basis prohibited by applicable law. When required by law, we might record certain information or applicants for employment may be invited to voluntarily disclose protected characteristics.
Data Scientist
Data engineer job in Delaware, OH
The Data Scientist will be responsible for creating product reliability models and advanced analytics that drive strategic decisions about product improvements. This role will collaborate closely with cross-functional teams, including Engineering, Product Management, Quality, Services, and IT, to develop and deploy data-driven solutions that address complex customer challenges. You should understand the impacts of environmental and other field conditions as they relate to product reliability. In this role, you should be able to apply mathematical and statistical methods to predict future field performance by building product reliability models using software tools.
PRINCIPAL DUTIES & RESPONSIBILITIES:
* Analyze product reliability requirements
* Create predictive models for field performance and product reliability
* Assist in design of experiments and analysis to understand impacts of different design decisions and test results
* Correlate predictive models with test results and field data
REQUIREMENTS:
* Bachelor's Degree in Math, Statistics, Data Science, Computer Science, Reliability Engineering, or equivalent experience
* 1-5 years of experience
* Basic knowledge in power/electrical engineering; AC and DC power
* Basic knowledge in large- and small-scale cooling systems
* LabVIEW, Python, or other coding and modeling language
* Minitab or other statistical software tools
* Power BI or other data visualization tools
* Travel 15%
Data Scientist - Wealth Management
Data engineer job in Columbus, OH
Leverage your technical expertise to shape innovative solutions and align capabilities to solve real-world challenges! As a Data Science Associate on the Online Investing Data Science Team for US Wealth Management ("USWM"), your ability to blend cross-functional knowledge in Online Investing, financial products, markets, statistics and advanced analytics will be a key to your success. You'll utilize strong communication skills to effectively operate in direct collaboration with key stakeholders across functions and lines-of-business.
Job Responsibilities:
+ Analyze client, macro-behavioral trends as it relates to Online Investing, financial products and solutions that are offered on JPMorgan's USWM platform
+ Provide analytical support for Online Investing strategy and platform engagement
+ Analyze client behavioral data to inform strategic decision making, including forming, investigating and researching hypotheses, as well as deep-dives into client deepening and retention efforts.
+ Use a variety of data tools and technologies, including industry-standard and open-source platforms, to assemble, organize and analyze data, including Python, R and SQL.
+ Apply statistical, causal inference, and machine learning modeling methods and techniques, including clustering algorithms, tree-based models, matching methods, and time-series anomaly detection (a small illustrative sketch follows this list).
+ Present findings to key stakeholders, including senior leaders, across Online Investing, Finance, Legal and Data and Analytics.
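For illustration only (not part of the posting): a minimal scikit-learn sketch of the techniques named above: clustering for segmentation, a tree-based model for a retention-style outcome, and isolation-forest anomaly detection; the data is synthetic and the features are invented.

```python
# Illustrative sketch only; the data is synthetic and the features are invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingClassifier, IsolationForest

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                                    # stand-in behavioral features
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # stand-in retained/churned label

# Clustering: segment clients into behavioral groups.
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Tree-based model: predict a retention-style outcome from behavior.
clf = GradientBoostingClassifier(random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))

# Anomaly detection: flag unusual activity patterns (-1 indicates an anomaly).
outliers = IsolationForest(random_state=0).fit_predict(X)
print("anomalies flagged:", int((outliers == -1).sum()))
```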
Required qualifications, capabilities, and skills:
+ 1+ years of experience working in an analytical, data science, finance or management/economic consulting role
+ BS Degree in a related field, such as Mathematics, Statistics, Engineering, Computer Science, Finance, Economics or other applicable STEM discipline
+ Adept at explaining and converting complex concepts into digestible information to be consumed by audiences at various levels of the business
+ Strong storytelling capabilities
+ Aptitude for learning new theory and new technology
**Preferred qualifications, capabilities, and skills:**
+ Master's Degree or PhD
+ Knowledge or demonstrable interest in financial securities and markets
+ Practical understanding of advanced statistics, econometrics and / or machine learning
+ Practical understanding of financial modeling and economics
Chase is a leading financial services firm, helping nearly half of America's households and small businesses achieve their financial goals through a broad range of financial products. Our mission is to create engaged, lifelong relationships and put our customers at the heart of everything we do. We also help small businesses, nonprofits and cities grow, delivering solutions to solve all their financial needs.
We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location. Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more. Additional details about total compensation and benefits will be provided during the hiring process.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
Equal Opportunity Employer/Disability/Veterans
Senior Data Engineer
Data engineer job in Columbus, OH
Here at Lower, we believe homeownership is the key to building wealth, and we're making it easier and more accessible than ever. As a mission-driven fintech, we simplify the home-buying process through cutting-edge technology and a seamless customer experience.
With tens of billions in funded home loans and top ratings on Trustpilot (4.8), Google (4.9), and Zillow (4.9), we're a leader in the industry. But what truly sets us apart? Our people. Join us and be part of something bigger.
Job Description:
We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights and decision-making. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.
What you'll do:
Data Pipeline Engineering:
Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake.
Build and manage real-time ingestion pipelines leveraging AWS Lambda and CDC systems.
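As a hedged sketch of what a Lambda-based CDC ingestion step can look like (the bucket name, environment variable, event shape, and S3 layout are illustrative assumptions, not Lower's actual design):

```python
# Hypothetical AWS Lambda handler: land CDC records from the trigger payload in
# S3, partitioned by load date, for a downstream Snowflake/dbt layer to pick up.
import json
import os
from datetime import datetime, timezone

import boto3  # available by default in the Lambda Python runtime

s3 = boto3.client("s3")
BUCKET = os.environ.get("RAW_EVENTS_BUCKET", "example-raw-events")  # assumed name

def handler(event, context):
    records = event.get("Records", [])
    load_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"cdc/load_date={load_date}/{context.aws_request_id}.json"
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
        ContentType="application/json",
    )
    return {"written": len(records), "key": key}
```

From there, Snowpipe or dbt models would typically transform the raw landings into the denormalized analytics tables the job description mentions.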
Cloud & Infrastructure:
Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns.
Manage containerized applications using Docker and infrastructure as code via GitHub Actions.
Advanced Data Management:
Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance.
Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability.
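One common pattern for the robust error handling this bullet calls for (the endpoint, token handling, and retry policy shown here are assumptions, not the team's actual integration) is to wrap API calls in a session with bounded retries and backoff:

```python
# Sketch of a resilient REST client: retry transient failures with backoff,
# surface everything else to the caller. URLs and auth are placeholders.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(token: str) -> requests.Session:
    retry = Retry(
        total=5,
        backoff_factor=1.0,  # 1s, 2s, 4s, ... between attempts
        status_forcelist=(429, 500, 502, 503, 504),
        allowed_methods=frozenset({"GET", "POST"}),
    )
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    session.headers.update({"Authorization": f"Bearer {token}"})
    return session

def fetch_page(session: requests.Session, url: str, params: dict) -> dict:
    resp = session.get(url, params=params, timeout=30)
    resp.raise_for_status()  # non-retryable errors propagate for alerting upstream
    return resp.json()
```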
Quality Assurance & Operations:
Implement robust testing frameworks, data lineage tracking, monitoring, and alerting.
Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.
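As a minimal sketch of the kind of automated data-quality checks described above (the table and columns are hypothetical, and pytest/pandas stand in for whatever framework the team actually uses):

```python
# Illustrative data-quality tests over a stubbed frame; in a real pipeline the
# data would come from Snowflake and the checks would run in CI/CD.
import pandas as pd
import pytest

@pytest.fixture
def loans() -> pd.DataFrame:
    return pd.DataFrame(
        {"loan_id": [1, 2, 3], "funded_amount": [250_000, 310_500, 198_000]}
    )

def test_primary_key_is_unique(loans):
    assert loans["loan_id"].is_unique

def test_no_null_amounts(loans):
    assert loans["funded_amount"].notna().all()

def test_amounts_are_positive(loans):
    assert (loans["funded_amount"] > 0).all()
```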
Who you are:
5+ years of data engineering experience, ideally with cloud-native architectures.
Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing.
Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization.
Strong proficiency with dbt, including macro development, testing, and automated deployments.
Production-grade pipeline experience, specifically with AWS Lambda, S3, API Gateway, and IAM.
Proven experience with REST APIs, authentication patterns, and handling complex data integrations.
Preferred Experience
Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance.
Experience with real-time streaming platforms like Kafka or Kinesis.
Familiarity with Infrastructure as Code tools (Terraform, CloudFormation).
Knowledge of BI and data visualization tools (Tableau, Looker, Domo).
Container orchestration experience (ECS, Kubernetes).
Understanding of data lake architectures and Delta Lake.
Technical Skills
Programming: Python (expert), SQL (expert), Bash scripting.
Cloud: AWS (Lambda, S3, API Gateway, CloudWatch, IAM).
Data Warehouse: Snowflake, dimensional modeling, query optimization.
ETL/ELT: dbt, pandas, custom Python workflows.
DevOps: GitHub Actions, Docker, automated testing.
APIs: REST integration, authentication, error handling.
Data Formats: JSON, CSV, Parquet, Avro.
Version Control: Git, GitHub workflows.
What Sets You Apart
Systems Thinking: You see the big picture, designing data flows that scale and adapt with the business.
Problem Solver: You quickly diagnose and resolve complex data issues across diverse systems and APIs.
Quality Advocate: You write comprehensive tests, enforce data quality standards, and proactively prevent data issues.
Collaborative: You thrive working alongside analysts, developers, and product teams, ensuring seamless integration and teamwork.
Continuous Learner: You actively seek emerging data technologies and best practices to drive innovation.
Business Impact: You understand how your data engineering decisions directly influence and drive business outcomes.
Benefits & Perks
Competitive salary and comprehensive benefits (healthcare, dental, vision, 401k match)
Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio)
Professional growth opportunities and internal promotion pathways
Collaborative, mission-driven culture recognized as a local and national "best place to work"
If you don't think you meet all of the criteria above but are still interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.
Lower provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Privacy Policy
Java Software Engineer
Data engineer job in Columbus, OH
Title: Java Software Engineer
Hire Type: 12-month contract to start (potential extensions and full-time hire)
Pay Range: $50/hr - $65/hr (contingent on years of experience, skills, and education)
Required Skills & Experience
Strong programming skills in Java
Jenkins experience for automating builds, CI/CD, and pipeline orchestration
Experience working in an AWS environment, with some exposure to cloud development
Experience with event-driven architecture
Job Description
Insight Global is looking for a Java Software Engineer to sit in Columbus, Ohio. This candidate will be aligned to a platform automation project within their internal ERP system. Automation efforts will be assigned to internal developers, and this resource will be working within the middle tier of their internal system. The current code is written in .NET framework, but the new code being developed will be Java based. Candidates will be working with various teams and specifically aligned to their Billing Portal within the internal system focusing on the code for transitions in the middle tier to the customer/client facing tier and back office functions. Candidates need to have worked in an AWS environment and have some exposure to event driven architecture (General structure).