Cloud Engineer
Cloud engineer job in Columbus, OH
We're looking for very strong candidates, as the interview process for this team is extremely rigorous and highly selective.
GC/USC/GC-EAD only. Must provide a valid LinkedIn profile and references upon request.
5 days on site in Jersey City
We have an opportunity for an Azure Cloud Developer to lead infrastructure provisioning and CI/CD integration efforts. This role is suited for candidates with proven experience in enterprise cloud operations and a strong grasp of automation, Terraform, and Azure services.
Key Responsibilities
Manage and support CI/CD pipelines across Azure environments.
Use Terraform to provision, configure, and maintain Azure infrastructure.
Execute deployment runbooks and adjust infrastructure as needed.
Work with internal tools (e.g., Jet and Jewels) to manage deployments.
Administer core Azure services: compute, networking, storage, and identity.
Support Azure messaging systems such as Event Grid, Event Hubs, and Service Bus.
Collaborate with cross-functional teams to support deployment readiness.
Required Qualifications
5-8 years of experience with Azure cloud infrastructure and Terraform.
Strong knowledge of Azure services, including compute, networking, storage, and IAM.
Hands-on experience managing CI/CD pipelines in enterprise environments.
Ability to interpret and execute operational runbooks independently.
Familiarity with internal DevOps systems such as Jet and Jewels.
Solid scripting skills in Python or similar languages.
Preferred Qualifications
Experience with enterprise-grade Azure environments and large-scale infrastructure.
Proficiency in Git-based workflows and CI/CD platforms such as Azure DevOps or GitHub Actions.
Understanding of security, governance, and compliance in cloud deployments.
Certifications such as AZ-104, AZ-204, or AZ-400 are preferred.
SCADA Ignition Engineer
Remote SCADA Ignition engineer job
The SCADA Engineer will be responsible for providing leadership and technical expertise in the design, development, and delivery of Hanwha Convergence SCADA/PPC solutions for the renewable energy industry. He or she will design, develop work packages, troubleshoot, and continuously improve the SCADA system, including RTUs, RTACs, HMIs, and electrical control systems, on large-scale PV and/or BESS projects. He or she will also conduct applicable tests and commissioning in compliance with local and international codes and standards.
**Attention external recruitment firms, we will not accept any unsolicited resumes at this time. Please do not contact any internal member of our company to discuss the position or to solicit candidates. **
DUTIES:
· Lead and manage assigned projects with available resources to complete them successfully on schedule and within budget.
· Provide project status reports to stakeholders, and support risk mitigation measures as needed to maintain project goals and objectives.
· Lead the development of monitoring and control systems for utility-scale renewable energy projects, including, but not limited to, solar PV and battery energy storage systems.
· Provide team oversight in the development of device points lists, IP address lists, Logic Diagrams, HMI mockups & assets, commissioning test plans and completion checklists, utilizing company defined documentation and standards.
· Work within a team environment to define and implement product design standards and best practices that align with company goals and objectives.
· Program and commission PPC, SCADA servers, data historians, and HMI systems.
· Develop engineering work packages, construction work packages, inspection and test procedures, FAT/SAT, commissioning, and operation and maintenance procedures.
· Identify applicable standards and collateral standards for the diverse applicable sites.
· Lead any design changes required to ensure standards compliance or continuous improvement.
· Deliver technical presentations to clients on topics including, but not limited to, SCADA, PPC (Plant Power Control), and HEIS (Hanwha Energy Integration System).
· Mentor and train less experienced engineers and technicians.
· Conduct/facilitate risk analysis activities as required.
· Perform other duties and/or tasks as required.
SKILLS/EXPERIENCE/EDUCATION
· Bachelor's degree in electrical, electronic, or computer engineering preferred.
· Minimum 2+ years' direct experience with the Ignition SCADA application; other SCADA application engineering experience is considered an asset.
· Schweitzer Engineering RTAC platform experience is considered an asset.
· Strong knowledge of the design, installation, and commissioning of SCADA networks using fiber optics, serial RS-232/RS-485, Ethernet TCP/IP, and MQTT.
· Strong knowledge of industrial automation protocols including, but not limited to, Modbus RTU/TCP, DNP3, and OPC UA/DA.
· Proficiency in reading and developing diagrams and schematics including, but not limited to, power system, networking and control, electrical, mechanical, and civil layouts.
· Ability to solve problems and identify root causes as part of an investigation.
· In-depth understanding of power plant operating procedures and control system interaction with governing bodies such as: Regional Compliance Entities, Independent System Operators (CAISO, ERCOT experience preferred), Transmission Operators, and Generator Operators.
LANGUAGE SKILLS:
· Ability to communicate effectively in English.
· Communication in Korean is considered an asset.
WORK ENVIRONMENT:
· This position may be offered as work from home; however, working from the office in Georgetown, TX is preferred, and hired candidates may be eligible for relocation assistance.
· Fast-paced environment with frequently changing priorities.
· Travel to customer sites is required, including the ability to travel internationally with a valid passport.
· Must be legally entitled to work in the USA and prepared to travel abroad.
Hanwha Convergence is proud to be an at-will Equal Opportunity Employer and prohibits discrimination against race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, pregnancy, citizenship, disability, protected veteran status and any other classification protected by applicable federal, state or local law. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
We are committed to the full inclusion of all qualified individuals. As part of this commitment, Hanwha Convergence will provide reasonable accommodations to all qualified individuals with disabilities to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment. Please contact us to request accommodations.
Nothing in this statement shall imply implicitly or explicitly a guarantee of employment outside our at-will employment opportunity.
You may view your privacy rights by reviewing Hanwha Convergence Privacy Policy here or contacting our HR Team for a copy.
Data Engineer
Remote data engineer job
This is a fully remote 12+ month contract position. No C2C or 3rd party candidates will be considered.
Data Engineer (AI & Automation)
We are seeking a Data Engineer with hands-on experience using AI-driven tools to support automation, system integrations, and continuous process improvement across internal business systems. This role will focus on building and maintaining scalable data pipelines, enabling intelligent workflows, and improving data accessibility and reliability.
Key Responsibilities
Design, build, and maintain automated data pipelines and integrations across internal systems
Leverage AI-enabled tools to streamline workflows and drive process improvements
Develop and orchestrate workflows using Apache Airflow and n8n AI
Model, transform, and optimize data in Snowflake and Azure SQL Data Warehouse
Collaborate with business and technical teams to identify automation opportunities
Ensure data quality, reliability, and performance across platforms
Required Qualifications
Experience as a Data Engineer or similar role
Hands-on experience with Apache Airflow and modern workflow orchestration tools
Strong experience with Snowflake and Azure SQL Data Warehouse
Familiarity with AI-driven automation and integration tools (e.g., n8n AI)
Strong SQL skills and experience building scalable data pipelines
Preferred Qualifications
Experience integrating multiple internal business systems
Background in process improvement or operational automation
Experience working in cloud-based data environments (Azure preferred)
Senior Data Engineer
Senior data engineer job in Columbus, OH
Our direct client has a long-term contract need for a Sr. Data Engineer.
Candidate Requirements:
Candidates must be local to Columbus, Ohio
Candidates must be willing and able to work the following:
Hybrid schedule (3 days in office & 2 days WFH)
The team is responsible for the implementation of the new Contract Management System (FIS Asset Finance) as well as the integration into the overall environment and the migration of data from the legacy contract management system to the new system.
Candidate will be focused on the delivery of data migration topics to ensure that high quality data is migrated from the legacy systems to the new systems. This may involve data mapping, SQL development and other technical activities to support Data Migration objectives.
Must Have Experience:
Strong C# and SQL Server design and development skills, including analysis and design (important must-have).
Strong technical analysis skills
Strong collaboration skills to work effectively with cross-functional teams
Exceptional ability to structure, illustrate, and communicate complex concepts clearly and effectively to diverse audiences, ensuring understanding and actionable insights.
Demonstrated adaptability and problem-solving skills to navigate challenges and uncertainties in a fast-paced environment.
Strong prioritization and time management skills to balance multiple projects and deadlines in a dynamic environment.
In-depth knowledge of Agile methodologies and practices, with the ability to adapt and implement Agile principles in testing and delivery processes.
Nice to have:
ETL design and development; data mapping skills and experience; experience executing/driving technical design and implementation topics
Senior Data Engineer
Senior data engineer job in Columbus, OH
Responsible for understanding, preparing, processing, and analyzing data to make it valuable and useful for operations decision support.
Accountabilities in this role include:
Partnering with Business Analysis and Analytics teams.
Demonstrating problem-solving ability for effective and timely resolution of system issues, including production outages.
Developing and supporting standard processes to harvest data from various sources and perform data blending to develop advanced data sets, analytical cubes, and data exploration.
Utilizing queries, data exploration and transformation, and basic statistical methods.
Creating Python scripts.
Developing Microsoft SQL Server Integration Services Workflows.
Building Microsoft SQL Server Analysis Services Tabular Models.
Focusing on SQL database work with a blend of strong technical and communication skills.
Demonstrating ability to learn and navigate in large complex environments.
Exhibiting Excel acumen to develop complex spreadsheets, formulas, create macros, and understand VBA code within the modules.
Required Skills:
Experience with MS SQL
Proficiency in Python
Desired Skills:
Experience with SharePoint
Advanced Excel Skills (formulas, VBA, Power Pivot, Pivot Table)
Senior Data Engineer
Senior data engineer job in Columbus, OH
Immediate need for a talented Senior Data Engineer. This is a 6+ month contract opportunity with long-term potential and is located in Columbus, OH (Remote). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-95277
Pay Range: $70-$71/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Work with Marketing data partners to build data pipelines that automate data feeds from the partners to internal systems on Snowflake.
Work with Data Analysts to understand their data needs and prepare datasets for analytics.
Work with Data Scientists to build the infrastructure to deploy models, monitor their performance, and build the necessary audit infrastructure.
Key Requirements and Technology Experience:
Key skills: Snowflake, Python, and AWS
Experience with building data pipelines, data pipeline infrastructure, and related tools and environments used in analytics and data science (e.g., Python, Unix)
Experience in developing analytic workloads with AWS services: S3, Simple Queue Service (SQS), Simple Notification Service (SNS), Lambda, EC2, ECR, and Secrets Manager.
Strong proficiency in Python, SQL, Linux/Unix shell scripting, GitHub Actions or Docker, Terraform or CloudFormation, and Snowflake.
Order of Importance: Terraform, Docker, GitHub Actions OR Jenkins
Experience with orchestration tools such as Prefect, DBT, or Airflow.
Experience automating data ingestion, processing, and reporting/monitoring.
Experience with other relevant tools used in data engineering (e.g., SQL, Git)
Ability to set up environments (Dev, QA, and Prod) using GitHub repositories and GitHub rules/methodologies, and to maintain them via SQL coding and proper versioning
Our client is a leader in the insurance industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
By applying to our jobs, you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Engineer - ETL/ELT - Hybrid/Remote
Hybrid/remote data engineer job
Crown Equipment Corporation is a leading innovator in world-class forklift and material handling equipment and technology. As one of the world's largest lift truck manufacturers, we are committed to providing the customer with the safest, most efficient and ergonomic lift truck possible to lower their total cost of ownership.
Indefinite US Work Authorization Required.
Primary Responsibilities
Design, build and optimize scalable data pipelines and stores.
Clean, prepare and optimize data for consumption in applications and analytics platforms.
Participate in peer code reviews to uphold internal standards.
Ensure procedures are thoroughly tested before release.
Write unit tests and record test results.
Detect, define and debug programs whenever problems arise.
Provide training to users and knowledge transfer to support personnel and other staff members as required.
Prepare system and programming documentation in accordance with internal standards.
Interface with users to extract functional needs and determine requirements.
Conduct detailed systems analysis to define scope and objectives and design solutions.
Work with Business Analyst to help develop and write system requirements.
Establish project plans and schedules and monitor progress providing status reports as required.
Qualifications
Bachelor's degree in Computer Science, Software/Computer Engineering, Information Systems, or related field is required.
4+ years' experience in SQL, ETL, ELT, and SAP data is required.
Python, Databricks, and Snowflake experience preferred.
Strong written, verbal, analytical and interpersonal skills are necessary.
Remote Work: Crown offers hybrid remote work for this position. A reasonable commute is necessary as some onsite work is required. Relocation assistance is available.
Work Authorization:
Crown will only employ those who are legally authorized to work in the United States. This is not a position for which sponsorship will be provided. Individuals with temporary visas or who need sponsorship for work authorization now or in the future, are not eligible for hire.
No agency calls please.
Compensation and Benefits:
Crown offers an excellent wage and benefits package for full-time employees including Health/Dental/Vision/Prescription Drug Plan, Flexible Benefits Plan, 401K Retirement Savings Plan, Life and Disability Benefits, Paid Parental Leave, Paid Holidays, Paid Vacation, Tuition Reimbursement, and much more.
EOE Veterans/Disabilities
Junior Data Engineer
Junior data engineer job in Columbus, OH
Contract-to-Hire
Columbus, OH (Hybrid)
Our healthcare services client is looking for an entry-level Data Engineer to join their team. You will play a pivotal role in maintaining and improving inventory and logistics management programs. Your day-to-day work will include leveraging machine learning and open-source technologies to drive improvements in data processes.
Job Responsibilities
Automate key processes and enhance data quality
Improve injection processes and enhance machine learning capabilities
Manage substitutions and allocations to streamline product ordering
Work on logistics-related data engineering tasks
Build and maintain ML models for predictive analytics
Interface with various customer systems
Collaborate on integrating AI models into customer service
Qualifications
Bachelor's degree in related field
0-2 years of relevant experience
Proficiency in SQL and Python
Understanding of GCP/BigQuery (or any cloud experience; basic certifications a plus)
Knowledge of data science concepts
Business acumen and understanding (corporate experience or internship preferred)
Familiarity with Tableau
Strong analytical skills
Aptitude for collaboration and knowledge sharing
Ability to present confidently in front of leaders
Why Should You Apply?
You will be part of custom technical training and professional development through our Elevate Program!
Start your career with a Fortune 15 company!
Access to cutting-edge technologies
Opportunity for career growth
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Senior Data Engineer (W2 only)
Senior data engineer job in Columbus, OH
Bachelor's Degree in Computer Science or related technical field AND 5+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java.
Proficiency with Azure data services, such as Azure Data Lake, Azure Data Factory and Databricks.
Expertise using Cloud Security (i.e., Active Directory, network security groups, and encryption services).
Proficient in Python for developing and maintaining data solutions.
Experience with optimizing or managing technology costs.
Ability to build and maintain a data architecture supporting both real-time and batch processing.
Ability to implement industry standard programming techniques by mastering advanced fundamental concepts, practices, and procedures, and having the ability to analyze and solve problems in existing systems.
Expertise with unit testing, integration testing and performance/stress testing.
Database management skills and understanding of legacy and contemporary data modeling and system architecture.
Demonstrated leadership skills, team spirit, and the ability to work cooperatively and creatively across an organization
Experience on teams leveraging Lean or Agile frameworks.
Data Engineer
Data engineer job in Columbus, OH
We're seeking a skilled Data Engineer based in Columbus, OH, to support a high-impact data initiative. The ideal candidate will have hands-on experience with Python, Databricks, SQL, and version control systems, and be comfortable building and maintaining robust, scalable data solutions.
Key Responsibilities
Design, implement, and optimize data pipelines and workflows within Databricks.
Develop and maintain data models and SQL queries for efficient ETL processes.
Partner with cross-functional teams to define data requirements and deliver business-ready solutions.
Use version control systems to manage code and ensure collaborative development practices.
Validate and maintain data quality, accuracy, and integrity through testing and monitoring.
Required Skills
Proficiency in Python for data engineering and automation.
Strong, practical experience with Databricks and distributed data processing.
Advanced SQL skills for data manipulation and analysis.
Experience with Git or similar version control tools.
Strong analytical mindset and attention to detail.
Preferred Qualifications
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with enterprise data lake architectures and best practices.
Excellent communication skills and the ability to work independently or in team environments.
Data Engineer
Remote data engineer job
We are looking for a Data Engineer in Austin, TX (fully remote - MUST work CST hours).
Job Title: Data Engineer
Contract: 12 Months
Hourly Rate: $75- $82 per hour (only on W2)
Additional Notes:
Fully remote - MUST work CST hours
• SQL, Python, DBT. Utilize geospatial data tools (PostGIS, ArcGIS/ArcPy, QGIS, GeoPandas, etc.) to optimize and normalize spatial data storage, and run spatial queries and processes to power analysis and data products
• Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python, and pipeline and transformation tools like Airflow and dbt
• Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shape files, polygons, and metadata)
• Operationalize data products with detailed documentation, automated data quality checks, and change alerts
• Support data access through various sharing platforms, including dashboard tools
• Troubleshoot failures in data processes, pipelines, and products
• Communicate with and educate consumers on data access and usage, managing transparency in metric and logic definitions
• Collaborate with other data scientists, analysts, and engineers to build full-service data solutions
• Work with cross-functional business partners and vendors to acquire and transform raw data sources
• Provide frequent updates to the team on progress and status of planned work
About us:
Harvey Nash is a national, full-service talent management firm specializing in technology positions. Our company was founded with a mission to serve as the talent partner of choice for the information technology industry.
Our company vision has led us to incredible growth and success in a relatively short period of time and continues to guide us today. We are committed to operating with the highest possible standards of honesty, integrity, and a passionate commitment to our clients, consultants, and employees.
We are part of Nash Squared Group, a global professional services organization with over forty offices worldwide.
For more information, please visit us at ******************************
Harvey Nash provides benefits; please review: 2025 Benefits -- Corporate
Regards,
Dinesh Soma
Recruiting Lead
Data Engineer
Data engineer job in Dublin, OH
The Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently.
The ideal candidate is self-directed, thrives in a fast-paced project environment, is comfortable making technical decisions and architectural recommendations, and has prior experience with modern data platforms, most notably Databricks and the "lakehouse" architecture. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals.
Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.
Responsibilities:
Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
Consume and analyze data from the data pipeline to infer, predict, and recommend actionable insights that inform operational and strategic decision-making and produce better results.
Empower departments and internal consumers with metrics and business intelligence to operate and direct our business, better serving our end customers.
Determine technical and behavioral requirements, identify strategies as solutions, and select solutions based on resource constraints.
Work with the business, process owners, and IT team members to design solutions for data and advanced analytics solutions.
Perform data modeling and prepare data in databases for analysis and reporting through various analytics tools.
Play a technical specialist role in championing data as a corporate asset.
Provide technical expertise in collaborating with project and other IT teams, internal and external to the company.
Contribute to and maintain system data standards.
Research and recommend innovative and, where possible, automated approaches for system data administration tasks. Identify approaches that leverage our resources and provide economies of scale.
Engineer systems that balance and meet performance, scalability, recoverability (including backup design), maintainability, security, and high-availability requirements and objectives.
Skills:
Databricks and related - SQL, Python, PySpark, Delta Live Tables, Data pipelines, AWS S3 object storage, Parquet/Columnar file formats, AWS Glue.
Systems Analysis - The application of systems analysis techniques and procedures, including consulting with users, to determine hardware, software, platform, or system functional specifications.
Time Management - Managing one's own time and the time of others.
Active Listening - Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
Critical Thinking - Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems.
Active Learning - Understanding the implications of new information for both current and future problem-solving and decision-making.
Writing - Communicating effectively in writing as appropriate for the needs of the audience.
Speaking - Talking to others to convey information effectively.
Instructing - Teaching others how to do something.
Service Orientation - Actively looking for ways to help people.
Complex Problem Solving - Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
Troubleshooting - Determining causes of operating errors and deciding what to do about them.
Judgment and Decision Making - Considering the relative costs and benefits of potential actions to choose the most appropriate one.
Experience and Education:
High School Diploma (or GED or High School Equivalence Certificate).
Associate degree or equivalent training and certification.
5+ years of experience in data engineering including SQL, data warehousing, cloud-based data platforms.
Databricks experience.
2+ years Project Lead or Supervisory experience preferred.
Must be legally authorized to work in the United States. We are unable to sponsor or take over sponsorship at this time.
Data Engineer (Databricks)
Data engineer job in Columbus, OH
ComResource is searching for a highly skilled Data Engineer with a background in SQL and Databricks who can design and construct scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition.
Responsibilities:
Design, construct, install, test and maintain data management systems.
Build high-performance algorithms, predictive models, and prototypes.
Ensure that all systems meet the business/company requirements as well as industry practices.
Integrate up-and-coming data management and software engineering technologies into existing data structures.
Develop set processes for data mining, data modeling, and data production.
Create custom software components and analytics applications.
Research new uses for existing data.
Employ an array of technological languages and tools to connect systems together.
Recommend different ways to constantly improve data reliability and quality.
Qualifications:
5+ years data quality engineering
Experience with Cloud-based systems, preferably Azure
Databricks and SQL Server testing
Experience with ML tools and LLMs
Test automation frameworks
Python and SQL for data quality checks
Data profiling and anomaly detection
Documentation and quality metrics
Healthcare data validation experience preferred
Test automation and quality process development
Plus:
Azure Databricks
Azure Cognitive Services integration
Databricks Foundation Model integration
Claude API implementation a plus
Python and NLP frameworks (spaCy, Hugging Face, NLTK)
Java Software Engineer
Java software engineer job in Columbus, OH
Title: Java Software Engineer
Hire Type: 12 month contract to start (potential extensions and full time hire)
Pay Range: $50/hr - $65/hr (contingent on years of experience, skills, and education)
Required Skills & Experience
Strong programming skills within Java
Jenkins experience for automating builds, CI/CD, and pipeline orchestration
Experience working in an AWS environment with some exposure to cloud development
Experience with event-driven architecture
Job Description
Insight Global is looking for a Java Software Engineer to sit in Columbus, Ohio. This candidate will be aligned to a platform automation project within their internal ERP system. Automation efforts will be assigned to internal developers, and this resource will be working within the middle tier of their internal system. The current code is written in .NET framework, but the new code being developed will be Java based. Candidates will be working with various teams and specifically aligned to their Billing Portal within the internal system focusing on the code for transitions in the middle tier to the customer/client facing tier and back office functions. Candidates need to have worked in an AWS environment and have some exposure to event driven architecture (General structure).
Software Engineer
Software engineer job in Columbus, OH
Software Engineer - Internal Product Team
Division: Impower Solutions (Agility Partners)
About Impower
Impower is the technology consulting division of Agility Partners, specializing in automation & AI, data engineering & analytics, software engineering, and digital transformation. We deliver high-impact solutions with a focus on innovation, efficiency, and client satisfaction.
Role Overview
We're building a high-performing internal product team to scale our proprietary tech stack. As a Software Engineer, you'll contribute to the development of internal platforms using modern technologies. You'll collaborate with product and engineering peers to deliver scalable, maintainable solutions that drive Impower's consulting capabilities.
Key Responsibilities
Development & Implementation
Build scalable APIs using TypeScript and Bun for high-performance backend services.
Develop intelligent workflows and AI agents leveraging Temporal, enabling robust orchestration and automation.
Move and transform data using Python and DBT, supporting analytics and operational pipelines.
Contribute to full-stack development of internal websites using Next.js (frontend), Elysia (API layer), and Azure SQL Server (database).
Implement CI/CD pipelines using GitHub Actions, with a focus on automated testing, secure deployments, and environment consistency.
Deploy and manage solutions in Azure, including provisioning and maintaining infrastructure components such as App Services, Azure Functions, Storage Accounts, and SQL databases.
Monitor and troubleshoot production systems using SigNoz, ensuring observability across services with metrics, traces, and logs to maintain performance and reliability.
Write clean, testable code and contribute to unit, integration, and end-to-end test suites.
Collaborate in code reviews, sprint planning, and backlog grooming to ensure alignment and quality across the team.
Innovation & Strategy
Stay current with emerging technologies and frameworks, especially in the areas of agentic AI, orchestration, and scalable infrastructure.
Propose improvements to internal platforms based on performance metrics, developer experience, and business needs.
Contribute to technical discussions around design patterns, tooling, and long-term platform evolution.
Help evaluate open-source tools and third-party services that could accelerate development or improve reliability.
Delivery & Collaboration
Participate in agile ceremonies including sprint planning, standups, and retrospectives.
Collaborate closely with product managers, designers, and other engineers to translate requirements into working solutions.
Communicate progress, blockers, and technical decisions clearly and proactively.
Take ownership of assigned features and enhancements from ideation through deployment and support.
Leadership
Demonstrate ownership and accountability in your work, contributing to a culture of reliability and continuous improvement.
Share knowledge through documentation, pairing, and informal mentoring of junior team members.
Engage in code reviews to uphold quality standards and foster team learning.
Actively participate in team discussions and help shape a collaborative, inclusive engineering culture.
Qualifications
2-4 years of experience in software engineering, ideally in a product-focused or platform engineering environment.
Proficiency in TypeScript and Python, with hands-on experience in full-stack development.
Experience building APIs and backend services using Bun, Elysia, or similar high-performance frameworks (e.g., Fastify, Express, Flask).
Familiarity with Next.js for frontend development and Azure SQL Server for relational data storage.
Experience with workflow orchestration tools such as Temporal, Airflow, or Prefect, especially for building intelligent agents or automation pipelines.
Proficiency in data transformation using DBT, with a solid understanding of analytics engineering principles.
Strong understanding of CI/CD pipelines using GitHub Actions, including automated testing, environment management, and secure deployments.
Exposure to observability platforms such as SigNoz, Grafana, Prometheus, or OpenTelemetry, with a focus on metrics, tracing, and log aggregation.
Solid grasp of software testing practices and version control (Git).
Excellent communication skills, a collaborative mindset, and a willingness to learn and grow within a team.
Why Join Us?
Build impactful internal products that shape the future of Impower's consulting capabilities.
Work with cutting-edge technologies in a collaborative, innovation-driven environment.
Enjoy autonomy, growth opportunities, and a culture that values excellence and people.
Software Engineer
Requirements engineer job in Columbus, OH
hackajob has partnered with a global technology and management consultancy, specializing in driving transformation across the financial services and energy industries, and we're looking for Java & Python Developers!
Role: Software Engineer (Java & Python)
Mission: This role focuses on a large technology implementation with a major transition of a broker/dealer platform. These resources will support ETL development, API development, and conversion planning.
Location: On-site role in Columbus, OH.
Rates:
W2 - $32 per hour
1099 - $42 per hour
Work authorization: This role requires you to be authorized to work in the United States without sponsorship.
Qualifications (4+ years of experience):
Strong experience with Java, Spring Boot, and microservices architecture.
Proficiency in Python for ETL and automation.
Hands-on experience with API development.
Knowledge of data integration, ETL tools, and conversion workflows.
hackajob is a recruitment platform that matches you with relevant roles based on your preferences. To be matched with the roles, you need to create an account with us.
This role requires you to be based in the US.
Software Engineer
Remote requirements engineer job
Front-leaning full-stack Software Engineer role (React, TypeScript, Node.js, AWS, data at scale)
100% Remote
Compensation: $170K-$200K + 10% bonus
Full-time W-2 Employment with medical benefits
Client: Late stage (10 years old) Adtech startup - 300+ employees, 65 Engineers
Core Qualifications
Minimum of 10 years of experience as a Software Engineer
Must have experience with object-oriented design, analysis, and programming in several of the following languages and frameworks: JavaScript, TypeScript, Python, Node.js, AngularJS, React/React Native, and Vue, as well as knowledge of APIs, ORMs, cloud (AWS), SOA, SaaS, messaging, stream processing, and SQL data store technologies.
Must be able to evaluate and modify complex database stored procedures, database structures, and have familiarity with containerization and scaling of SaaS platform services.
Must be able to deep-dive into various applications and data stores to produce meaningful insights, profiling and tracing, operational intelligence, customer experience visualizations, and proactive trend analyses.
Can quickly consume and understand business strategy and operating models; can apply gap analysis techniques to create long-term technical product strategy.
Can ensure technical product and social capabilities match business needs and goals.
Can effectively communicate goals, metrics, and value propositions across the Engineering Organization.
Can facilitate design, development, and support of existing and new products between cross-functional business stakeholders.
Assist team members in solving complex use cases and systems while leading technical change and transformation in parallel.
Must have knowledge around application system services, communication protocols, and standard industry technologies.
Must be passionate about creating solutions, and solving problems - in the right way, at the right time, and for the right reasons.
Must be teachable, give and receive feedback, and demonstrate success in their discipline on a consistent and transparent basis.
Education
Minimum of 10 years of experience in a product, engineering, development, or technical delivery position.
Bachelor of Science Degree in Computer Science or similar
Workday Software Engineer
Remote requirements engineer job
Positions: Software Engineer, Workday
Duration: Full time position
Type: Remote work model.
A day of this role:
This fully remote role works extensively on Workday integration projects. The engineer is responsible for designing, developing, configuring, integrating, and maintaining Workday applications and solutions; collaborates with cross-functional teams to support business needs; and operates independently with minimal supervision.
Must haves:
7+ years of Workday Integration experience.
Understanding of Workday data conversion patterns and tools.
Proficiency in Workday integration tools:
EIB
Connectors
Workday Studio
Familiarity with Workday Business Process Framework.
Experience with Workday modules: HCM, Benefits, Time Tracking, Payroll and Security
Workday certifications.
Working knowledge of:
Workday Extend
Workday Report Writer
Calculated fields
Prism Analytics
RaaS (Reports as a Service)
Strong understanding of:
Web technologies
Mobile platforms
APIs (WSDL, SOAP, REST)
SQL
Responsibilities:
Works with constituent departments to fulfill design, application development, configuration, integration, support, and maintenance requests.
Assists in scope definition and estimation of work effort.
Contributes to the business requirements gathering process.
Works with the architecture team to ensure that design standards are followed.
Adheres to defined processes.
Develops application code to fulfill project requests.
Creates technical documentation as required.
Drives incremental improvements to team technical processes and practices.
Mentors development team members in technical complexities of assigned work.
Stays up to date with Workday releases, updates, and new features, and applies this knowledge to improve the design and performance of integration and Extend solutions.
Qualifications:
Bachelor's degree in computer science, a related field, or four years of related work experience is required.
Three to five years of professional experience is required.
Strong understanding of web, mobile, API, and SQL technologies.
Broad knowledge of software development practices and procedures.
Experience working with Workday modules such as HCM, Benefits, Time Tracking, Payroll and Security.
Good understanding of Workday Business Process Framework.
Good knowledge of Workday integration tools such as EIB, Connectors, Workday Studio.
Working knowledge of Workday Extend.
Working knowledge of Workday Report Writer, calculated fields, Prism.
Working knowledge of Web Services, APIs (WSDL, SOAP, REST) and RaaS.
Knowledge of Workday data conversion patterns and toolset.
Aptitude for continuous learning and improvement.
Strong teamwork skills.
Software Engineer (Remote)
Remote requirements engineer job
Remote (proximity to Chicago, Nashville, or Manhattan would be a big plus)
Regular travel is not required, but the candidate will need to travel to the corporate office twice a year.
Our client is looking to add a Software Developer that will be responsible for designing, developing, and maintaining high-quality software solutions that support the Firm's digital platforms. This role ensures the stability, scalability, and performance of all applications and services, while collaborating with cross-functional teams to drive continuous improvement in development practices and operational efficiency.
Responsibilities
Design and implement stable, scalable, and extensible software solutions.
Ensure adherence to secure software development lifecycle (SDLC) best practices and standards.
Drive the design and development of services and applications to meet defined service level agreements (SLAs).
Work closely with end users and stakeholders to gather requirements and iterate on solutions that deliver business value.
Proactively identify and resolve any obstacles affecting operational efficiency and service continuity.
Provide ongoing support for developed applications and services, ensuring timely issue resolution.
Participate in the Firm's change and incident management processes, adhering to established protocols.
Software Development & Architecture
Develop and maintain features for web-enabled applications using C# .NET Core.
Write clean, scalable code with a focus on maintainability and performance.
Implement robust, efficient SQL-based solutions, preferably using MS SQL.
Develop and maintain user interfaces using modern frameworks, preferably Angular or Blazor.
Ensure solutions are designed with an emphasis on security, efficiency, and optimization.
Contribute to continuous integration and continuous delivery (CI/CD) pipelines, automating processes where possible.
Collaboration & Optimization
Collaborate closely with business analysts, quality assurance, and other developers to ensure solutions meet both functional and non-functional requirements.
Foster a culture of positive, open communication across diverse teams, with a focus on collaboration and shared goals.
Engage in regular reviews and feedback sessions to drive continuous improvement in development processes and practices.
Provide mentorship and guidance to junior developers where appropriate, supporting their professional growth.
Professional Conduct
Demonstrates commitment to the firm's core values, including Accountability, Integrity, Excellence, Grit, and Love.
Ensures all activities align with business objectives and project timelines.
Communicates effectively, openly exchanging ideas and listening with consideration.
Maintains a proactive, solution-oriented mindset when addressing challenges.
Takes ownership of responsibilities and holds others accountable for their contributions.
Continuously seeks opportunities to optimize processes, improve performance, and drive innovation.
Qualifications
1-3+ years of experience in C# .NET Core development
Competence in SQL, preferably MS SQL
Competence in UI work, preferably Angular and/or Blazor
Strong structured problem-solving skills, with a history of using systematic and fact-based processes to improve mission-critical services.
A focus on optimization and efficiency in processes.
Experience working in a financial services firm would be a big plus
Demonstrated expertise in fostering a culture of positive collaboration among cross-functional teams with diverse personalities, skill sets, and levels of experience.
Highly developed communication skills
A sense of urgency and a bias for action.
For all non-bonus, non-commission direct hire positions: The anticipated salary range for this position is ($95,000 - $120,000). Actual salary will be based on a variety of factors including relevant experience, knowledge, skills and other factors permitted by law. A range of medical, dental, vision, retirement, paid time off, and/or other benefits are available.
Sports Trading Systems Engineer
Remote requirements engineer job
What You Will Do
Write and maintain JavaScript / Node.js code for automated trading systems, background jobs, and market data ingestion
Contribute to Go services where concurrency and predictable behavior matter
Rewrite outdated or messy JavaScript services in Go
Work across multiple repositories communicating via WebSockets, Redis, and HTTP
Debug real production issues in live systems
Move fast: build, break, fix, and ship
Gradually take ownership of small but critical parts of the system
What We're Looking For
Comfortable with JavaScript / Node.js
Some exposure to Go, or interest in learning it
Understanding of async code, OOP, and event-driven systems
Not afraid of messy codebases or unfamiliar repos
Able to move quickly, ask questions, and take feedback well
Strong debugging instincts
Startup, side-project, or self-taught engineering background
Flexible availability, including occasional nights or weekends
Nice To Have
Betting, trading, or market-related experience
Experience with real-time systems (WebSockets, Redis, pub/sub)
Some infrastructure or Linux experience
Bonus: scraping or automation experience (Playwright, Puppeteer, Selenium)
What This Role Is
A high learning-curve role with an emphasis on getting systems into production
Direct visibility into how real-world trading systems are built
Working closely with a small, highly involved team
Shipping code that runs live with real money
What This Role Isn't
No formal onboarding or extensive documentation
Not a heavy-process environment (minimal tickets, meetings, or planning cycles)
Base salary: $100,000+ annually, depending on experience and role fit
Structure: Role begins with a 1-2 month paid contract engagement, followed by full-time conversion upon mutual fit
Equity: Available for the right candidate
About 4C Software
4C Software builds the technology powering one of the largest sports prediction markets in the world, with $750M+ traded on the platform this year. We also develop automated trading and market infrastructure software that operates on multiple platforms at a significant scale. We're a small team working in a fast-paced environment, building systems that run live with real money in production. Our team is based in Chicago, but this role is fully remote.