
Data warehouse developer jobs near me

- 1,854 jobs
  • ETL DataStage Lead w/ Python

    Central Point Partners 3.7 company rating

    Data warehouse developer job in Columbus, OH

    *Per the client, no C2C's!* Central Point Partners is currently interviewing candidates in the Columbus, OH area for a large client. GC and USC candidates only. This position is hybrid (4 days onsite)! Only candidates who are local to Columbus, OH will be considered.

    DataStage Lead w/ Python

    Note from the manager: The client is currently using DataStage as their ETL tool, but they will eventually be sunsetting DataStage and moving to Python. This will start off as a primarily DataStage-heavy role but will move toward full Python development over the next 1 to 2 years, so strong experience with both is preferred. The ideal candidate will have 8-10 years of experience with DataStage and 3-5 years of experience with Python.

    Position Summary: Our client's IT Corporate Finance Regulatory Reporting team is seeking a highly skilled and motivated Technical Specialist - DataStage Lead to support our enterprise data integration and regulatory reporting initiatives. This role is essential in ensuring the accuracy, efficiency, and compliance of our financial data pipelines and reporting systems.

    Key Responsibilities:
    • Lead the design, development, and maintenance of ETL processes using IBM DataStage and Python.
    • Collaborate with data architects, business analysts, and compliance teams to create and maintain technical design documents.
    • Develop and optimize SQL queries and scripts for Snowflake and other relational databases.
    • Write and maintain Unix shell scripts to support automation and data processing tasks.
    • Manage and resolve incidents in a timely manner, ensuring minimal impact to business operations.
    • Participate in change management processes, including planning, documentation, and execution of changes.
    • Attend and contribute to project and team meetings, providing technical insights and updates.
    • Ensure compliance with internal standards, security policies, and regulatory requirements.
    • Mentor junior team members and provide technical leadership within the team.

    Required Qualifications:
    • Bachelor's degree in Computer Science, Information Systems, or a related field.
    • 5+ years of experience in ETL development with IBM DataStage and Python.
    • Strong experience with Snowflake or other cloud-based data platforms.
    • Proficiency in Unix/Linux shell scripting.
    • Solid understanding of relational databases and SQL.
    • Experience with incident and change management processes (ITIL framework preferred).
    • Excellent problem-solving, analytical, and communication skills.
    • Ability to work onsite 4 days a week in Columbus, OH.

    Preferred Qualifications:
    • Experience in the banking or financial services industry.
    • Familiarity with Agile methodologies and DevOps practices.
    • Knowledge of data governance and data quality best practices.

    For more information about this opportunity, please contact Bill Hart at ************ AND email your resume to **********************************!
    $69k-84k yearly est. 2d ago
  • Data Engineer

    Harvey Nash

    Remote data warehouse developer job

    We are looking for a Data Engineer in Austin, TX (fully remote - MUST work CST hours).

    Job Title: Data Engineer
    Contract: 12 months
    Hourly Rate: $75-$82 per hour (W2 only)
    Additional Notes: Fully remote - MUST work CST hours. Core skills: SQL, Python, dbt.

    Responsibilities:
    • Utilize geospatial data tools (PostGIS, ArcGIS/ArcPy, QGIS, GeoPandas, etc.) to optimize and normalize spatial data storage, and run spatial queries and processes to power analysis and data products
    • Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python, and pipeline and transformation tools like Airflow and dbt
    • Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shape files, polygons, and metadata)
    • Operationalize data products with detailed documentation, automated data quality checks, and change alerts
    • Support data access through various sharing platforms, including dashboard tools
    • Troubleshoot failures in data processes, pipelines, and products
    • Communicate and educate consumers on data access and usage, managing transparency in metric and logic definitions
    • Collaborate with other data scientists, analysts, and engineers to build full-service data solutions
    • Work with cross-functional business partners and vendors to acquire and transform raw data sources
    • Provide frequent updates to the team on progress and status of planned work

    About us: Harvey Nash is a national, full-service talent management firm specializing in technology positions. Our company was founded with a mission to serve as the talent partner of choice for the information technology industry. Our company vision has led us to incredible growth and success in a relatively short period of time and continues to guide us today. We are committed to operating with the highest possible standards of honesty, integrity, and a passionate commitment to our clients, consultants, and employees. We are part of Nash Squared Group, a global professional services organization with over forty offices worldwide. For more information, please visit us at ****************************** Harvey Nash will provide benefits; please review: 2025 Benefits -- Corporate. Regards, Dinesh Soma, Recruiting Lead
    $75-82 hourly 5d ago
  • Data Analyst

    Auralis India Pvt. Ltd.

    Remote data warehouse developer job

    Grant Thornton is a major Audit, Tax, and Advisory Services company offering a broad range of services in strategy and consulting, operations, technology, and more across various industries.

    Location: 100% remote, collaborating with teams based in both the United States and Ireland. Due to the Ireland component, schedule flexibility is essential to accommodate cross-time-zone coordination.

    This position may be offered to a candidate authorized to work in the US for his/her/their stated employer, without any restrictions which would prevent the candidate from working on the proposed assignment for the duration of the assignment period.

    No OT: Only seeking candidates on a straight hourly rate; no overtime rates will be approved!

    Duration: Contract - 12 months

    Responsibilities:
    · Serve as a technical resource for strategic oversight, planning, and development of data models and database structures to support global needs.
    · Translate logical designs into physical databases and define data flows through successive stages.
    · Plan, design, and document logical and physical enterprise relational data models. Facilitate and participate in design meetings and review sessions with development, architecture, data integration, BI teams, and power users.
    · Implement physical data models on platforms such as Snowflake.
    · Gather data requirements by working with end users.
    · Analyze complex data sources and develop source-to-target mapping documents, including business transformation rules.
    · Perform data quality analysis and profiling to ensure integrity and accuracy.
    · Support QA and end users during testing phases, including QA and User Acceptance Testing.
    · Provide daily production support and ongoing maintenance for the enterprise data warehouse.
    · Identify problematic data areas, research root causes, and determine corrective actions.
    · Support data governance by developing processes and queries to monitor and ensure data quality.
    · Gather, clean, and preprocess data from various sources, ensuring integrity and quality.
    · Identify KPIs and develop metrics to track and measure business performance.
    · Monitor data quality, identify issues, and propose cleansing or enhancement solutions.
    · Stay updated with industry trends and best practices in data analysis, modeling, and reporting.
    · Demonstrate strong individual contribution and teamwork, with excellent communication skills.
    · Adapt quickly to change with a flexible, cooperative work style and ability to reprioritize as needed.

    Qualifications:
    · Bachelor of Science (BS) in computer science or information systems (or equivalent work experience).
    · 7-10+ years of overall IT experience in software development or data-related roles, with evidence of increasing responsibility.
    · 5-7 years of significant data analysis experience, including 2-4 years building complex data models.
    · 2+ years of data profiling experience.
    · 3-5 years of strong Snowflake experience; ability to construct complex SQL queries.
    · Proven experience with programming languages such as SQL and Python for data manipulation and analysis.
    · Experience with data analysis and visualization tools such as SAP BO, Power BI, and Excel.
    · Extensive knowledge of advanced concepts, practices, and procedures in analytic database environments.
    · Proficiency with best practices in data modeling, data analysis, and data warehousing concepts.
    · Ability to understand requirements and create complex relational data models.
    · Ability to create data flow and process flow diagrams.
    · Knowledge of BI methodologies, Data Marts, Data Warehousing, OLAP tools and techniques (a plus).
    · Experience in professional services, accounting industry, or client service/consultative technology roles (a plus).
    · Strong analytical and problem-solving skills to interpret complex data sources and generate meaningful insights.
    · Ability to effectively diagnose, isolate, and resolve complex problems pertaining to data infrastructure.
    · Good business knowledge and confident decision-making skills.
    · Excellent written and oral communication skills, including business writing.
    · Ability to communicate strategies around data modeling and architecture to cross-functional teams and business executives.
    · Attention to detail and ability to maintain data accuracy and integrity.
    · Ability to work with large datasets through data cleaning, preprocessing, and transformation techniques.
    · Team oriented, flexible, and able to work in an ambiguous and/or changing work environment.

    Interview Process:
    · 30-minute technical interview with Manager
    · 30-minute behavioral interview with Director
    $61k-87k yearly est. 2d ago
  • Senior SAP Developer - ETL / REMOTE

    Robinson Group 4.2 company rating

    Remote data warehouse developer job

    Robinson Group has been retained to fill a newly created role on a newly created team: a Senior SAP Developer (ETL), truly REMOTE. Technically strong team that is using innovative approaches, the latest technology, and strong collaboration.

    *This fully remote position will be part of a $17B organization but has the flexibility and mindset of a start-up organization.

    *Growing, smart, and fully supported team that will have you leading the integration of SAP data, primarily from SAP ECC and SAP S/4HANA, into a unified, cloud-based Enterprise Data Platform (EDP). This role requires deep expertise in SAP data structures, combined with strong experience in enterprise ETL development using cloud-native technologies.

    As a Senior SAP Developer (ETL), you will play a key role in designing and implementing scalable data pipelines that extract, transform, and harmonize data from SAP systems into canonical models for analytics, reporting, and machine learning use cases. You will partner closely with data engineers, architects, and SAP subject matter experts to ensure accuracy, performance, and alignment with business requirements. This role will support a variety of high-impact projects focused on enabling cross-ERP visibility, operational efficiency, and data-driven decision-making across finance, manufacturing, and supply chain functions. Your contributions will help standardize critical datasets and accelerate the delivery of insights across the organization.

    Your skillset:
    • Strong experience in SAP ECC and SAP HANA
    • SAP Datasphere (building ETL pipelines)
    • Architect and implement ETL pipelines to extract data from SAP ECC / HANA / Datasphere
    • Design and build robust, scalable ETL/ELT pipelines to ingest data into the Microsoft cloud using tools such as Azure Data Factory or Alteryx
    • Analyze/interpret SAP's internal data models while also working closely with both SAP functional and technical teams
    • Lead the end-to-end data integration process for SAP ECC
    • Leverage knowledge of HANA DW to support reporting and semantic modeling
    • Strong communication capabilities as it relates to interfacing with supply chain and finance business leaders
    • Strong cloud knowledge (Azure preferred; GCP, AWS, Fabric)
    • Data modeling skills
    • Exposure to/experience with Python (building data transformations in SQL and Python)

    Your background:
    • Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field
    • 10 years of IT experience, with 8 years of SAP experience (SAP ECC and SAP S/4HANA)
    • Hands-on experience with Azure cloud data services including Synapse Analytics, Data Lake Storage, and SQL DB
    • Experience building cloud-native applications, for example with Microsoft Azure, AWS, or GCP
    $90k-113k yearly est. 4d ago
  • REMOTE DATA ANALYST

    Alternative Realty Group LLC

    Remote data warehouse developer job

    At [Alternative Realty Group LLC], we're proud to stand at the forefront of the Big Data revolution. Using the latest analytics tools and processes, we're able to maximize our offerings and deliver unparalleled service and support. To help carry us even further, we're searching for an entry-level data analyst to join our team. (We said entry level, not experienced data analyst.) The ideal candidate will be highly skilled in all aspects of data analytics, including mining, generation, and visualization. Additionally, this person should be committed to transforming data into readable, goal-oriented reports that drive innovation and growth.

    Objectives of this role
    • Develop, implement, and maintain leading-edge analytics systems, taking complicated problems and building simple frameworks
    • Identify trends and opportunities for growth through analysis of complex datasets
    • Evaluate organizational methods and provide source-to-target mappings and information-model specification documents for datasets
    • Create best-practice reports based on data mining, analysis, and visualization
    • Evaluate internal systems for efficiency, problems, and inaccuracies, and develop and maintain protocols for handling, processing, and cleaning data
    • Work directly with managers and users to gather requirements, provide status updates, and build relationships

    Required skills and qualifications
    • Entry-level data mining experience as a data analyst
    • Proven analytics skills, including mining, evaluation, and visualization
    • Technical writing experience in relevant areas, including queries, reports, and presentations
    • Strong SQL or Excel skills, with aptitude for learning other analytics tools
    $49k-73k yearly est. 4d ago
  • Data Engineer- ETL/ELT - Hybrid/Remote

    Crown Equipment Corporation 4.8 company rating

    Remote data warehouse developer job

    Crown Equipment Corporation is a leading innovator in world-class forklift and material handling equipment and technology. As one of the world's largest lift truck manufacturers, we are committed to providing the customer with the safest, most efficient, and ergonomic lift truck possible to lower their total cost of ownership.

    Indefinite US work authorization required.

    Primary Responsibilities
    • Design, build, and optimize scalable data pipelines and stores.
    • Clean, prepare, and optimize data for consumption in applications and analytics platforms.
    • Participate in peer code reviews to uphold internal standards.
    • Ensure procedures are thoroughly tested before release; write unit tests and record test results.
    • Detect, define, and debug programs whenever problems arise.
    • Provide training to users and knowledge transfer to support personnel and other staff members as required.
    • Prepare system and programming documentation in accordance with internal standards.
    • Interface with users to extract functional needs and determine requirements.
    • Conduct detailed systems analysis to define scope and objectives and design solutions.
    • Work with Business Analysts to help develop and write system requirements.
    • Establish project plans and schedules and monitor progress, providing status reports as required.

    Qualifications
    • Bachelor's degree in Computer Science, Software/Computer Engineering, Information Systems, or a related field is required.
    • 4+ years' experience in SQL, ETL, ELT, and SAP data is required.
    • Python, Databricks, and Snowflake experience preferred.
    • Strong written, verbal, analytical, and interpersonal skills are necessary.

    Remote Work: Crown offers hybrid remote work for this position. A reasonable commute is necessary as some onsite work is required. Relocation assistance is available.

    Work Authorization: Crown will only employ those who are legally authorized to work in the United States. This is not a position for which sponsorship will be provided. Individuals with temporary visas or who need sponsorship for work authorization now or in the future are not eligible for hire. No agency calls, please.

    Compensation and Benefits: Crown offers an excellent wage and benefits package for full-time employees, including Health/Dental/Vision/Prescription Drug Plan, Flexible Benefits Plan, 401K Retirement Savings Plan, Life and Disability Benefits, Paid Parental Leave, Paid Holidays, Paid Vacation, Tuition Reimbursement, and much more.

    EOE Veterans/Disabilities
    $87k-109k yearly est. 1d ago
  • SharePoint Solution Developer

    MM Management Consultant 3.7 company rating

    Data warehouse developer job in Columbus, OH

    Job Title: Sr. SharePoint Solution Developer
    Client: Aerospace domain
    Visa: USC, GC only
    Experience level: 13+ years
    Pay rate: $80/hr on C2C (depends on experience)
    No. of openings: 2
    Top Skills: SharePoint 2019, .NET

    Primary Duties and Responsibilities
    • Migrate SharePoint server-side solutions from SharePoint 2007 to SharePoint 2016
    • Troubleshoot and fix SharePoint OOB and custom application issues; provide root cause analysis in a timely manner
    • Create and maintain SharePoint sites; work with content including site and site collection features, lists, libraries, permissions, and other SharePoint components
    • Execute product specification, system design, development, and system integration
    • Participate in product and program collaboration
    • Refactor SharePoint server-side applications and services to the latest SharePoint platforms
    • Maintain, configure, and improve SharePoint solutions and artifacts post-migration
    • Complete other tasks as required

    Experience, Education and Skills
    • 5+ years of SharePoint server-side solution development experience using SharePoint 2007 through SharePoint 2016
    • 8+ years in any software development role
    • Extensive knowledge of C#, .NET Framework, and ASP.NET
    • Extensive knowledge of Microsoft Internet Information Services (IIS)
    • Extensive knowledge of site templates, SharePoint custom and OOB master pages, and page layouts
    • Extensive knowledge of SharePoint server artifacts and services
    • Extensive knowledge of Microsoft SQL Server, including SQL queries and other SQL components, performance troubleshooting, and fixing performance issues
    • Strong knowledge of InfoPath forms development with code-behind and migration
    • Strong knowledge of various authentication methods and Kerberos
    • Experience using third-party migration tools such as Sharegate is a plus
    • Strong knowledge of object-oriented programming
    • Strong web development skills: HTML5, CSS 3, and JavaScript libraries
    • Strong knowledge of web service models: SOAP, OData, REST
    • Experience in client-side debugging, ULS log analysis, and network trace analysis
    • Experience developing client-side solutions using SharePoint Framework is a plus
    • Experience with TFS and Git

    General Requirements
    • Exhibit and practice courteous, ethical, and professional behavior while interacting with both internal and external customers
    • Act in a collaborative, team-oriented environment focused on common goals to achieve mutually beneficial results
    • Be accountable and responsible for the accuracy and completeness of assigned work and results
    • Prioritize and manage workload and communicate issues clearly
    • Exhibit effective verbal and written communication skills
    • Comply with all laws, regulations, and company policies
    $80 hourly 2d ago
  • Senior Data Engineer

    Pyramid Consulting, Inc. 4.1 company rating

    Data warehouse developer job in Columbus, OH

    Immediate need for a talented Senior Data Engineer. This is a 6+ month contract opportunity with long-term potential, located in Columbus, OH (Remote). Please review the job description below and contact me ASAP if you are interested.

    Job ID: 25-95277
    Pay Range: $70-$71/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

    Key Responsibilities:
    • Work with Marketing data partners and build data pipelines to automate the data feeds from the partners to internal systems on Snowflake.
    • Work with Data Analysts to understand their data needs and prepare the datasets for analytics.
    • Work with Data Scientists to build the infrastructure to deploy the models, monitor the performance, and build the necessary audit infrastructure.

    Key Requirements and Technology Experience:
    • Key skills: Snowflake, Python, and AWS.
    • Experience with building data pipelines, data pipeline infrastructure, and related tools and environments used in analytics and data science (e.g., Python, Unix).
    • Experience in developing analytic workloads with AWS services: S3, Simple Queue Service (SQS), Simple Notification Service (SNS), Lambda, EC2, ECR, and Secrets Manager.
    • Strong proficiency in Python, SQL, Linux/Unix shell scripting, GitHub Actions or Docker, Terraform or CloudFormation, and Snowflake. Order of importance: Terraform, Docker, GitHub Actions or Jenkins.
    • Experience with orchestration tools such as Prefect, dbt, or Airflow.
    • Experience automating data ingestion, processing, and reporting/monitoring.
    • Experience with other relevant tools used in data engineering (e.g., SQL, Git).
    • Ability to set up environments (Dev, QA, and Prod) using a GitHub repo and GitHub rules/methodologies, and to maintain them (via SQL coding and proper versioning).

    Our client is a leader in the insurance industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.

    Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. By applying to our jobs, you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
    $70-71 hourly 5d ago
  • Junior Data Engineer

    Brooksource 4.1 company rating

    Data warehouse developer job in Columbus, OH

    Contract-to-Hire | Columbus, OH (Hybrid)

    Our healthcare services client is looking for an entry-level Data Engineer to join their team. You will play a pivotal role in maintaining and improving inventory and logistics management programs. Your day-to-day work will include leveraging machine learning and open-source technologies to drive improvements in data processes.

    Job Responsibilities
    • Automate key processes and enhance data quality
    • Improve injection processes and enhance machine learning capabilities
    • Manage substitutions and allocations to streamline product ordering
    • Work on logistics-related data engineering tasks
    • Build and maintain ML models for predictive analytics
    • Interface with various customer systems
    • Collaborate on integrating AI models into customer service

    Qualifications
    • Bachelor's degree in a related field
    • 0-2 years of relevant experience
    • Proficiency in SQL and Python
    • Understanding of GCP/BigQuery (or any cloud experience; basic certifications a plus)
    • Knowledge of data science concepts
    • Business acumen and understanding (corporate experience or internship preferred)
    • Familiarity with Tableau
    • Strong analytical skills
    • Aptitude for collaboration and knowledge sharing
    • Ability to present confidently in front of leaders

    Why Should You Apply?
    • You will be part of custom technical training and professional development through our Elevate Program!
    • Start your career with a Fortune 15 company!
    • Access to cutting-edge technologies
    • Opportunity for career growth

    Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.
    $86k-117k yearly est. 2d ago
  • Senior Data Engineer (only W2)

    CBTS 4.9 company rating

    Data warehouse developer job in Columbus, OH

    • Bachelor's degree in Computer Science or a related technical field AND 5+ years of technical engineering experience coding in languages including, but not limited to, C, C++, C#, or Java.
    • Proficiency with Azure data services, such as Azure Data Lake, Azure Data Factory, and Databricks.
    • Expertise using cloud security (e.g., Active Directory, network security groups, and encryption services).
    • Proficient in Python for developing and maintaining data solutions.
    • Experience with optimizing or managing technology costs.
    • Ability to build and maintain a data architecture supporting both real-time and batch processing.
    • Ability to implement industry-standard programming techniques by mastering advanced fundamental concepts, practices, and procedures, and to analyze and solve problems in existing systems.
    • Expertise with unit testing, integration testing, and performance/stress testing.
    • Database management skills and understanding of legacy and contemporary data modeling and system architecture.
    • Demonstrated leadership skills, team spirit, and the ability to work cooperatively and creatively across an organization.
    • Experience on teams leveraging Lean or Agile frameworks.
    $68k-95k yearly est. 2d ago
  • Senior Data Engineer

    PSI (ProTeam Solutions) 3.9 company rating

    Data warehouse developer job in Columbus, OH

    Responsible for understanding, preparing, processing, and analyzing data to make it valuable and useful for operations decision support. Accountabilities in this role include:
    • Partnering with Business Analysis and Analytics teams.
    • Demonstrating problem-solving ability for effective and timely resolution of system issues, including production outages.
    • Developing and supporting standard processes to harvest data from various sources and perform data blending to develop advanced data sets, analytical cubes, and data exploration.
    • Utilizing queries, data exploration and transformation, and basic statistical methods.
    • Creating Python scripts.
    • Developing Microsoft SQL Server Integration Services workflows.
    • Building Microsoft SQL Server Analysis Services tabular models.
    • Focusing on SQL database work with a blend of strong technical and communication skills.
    • Demonstrating ability to learn and navigate large, complex environments.
    • Exhibiting Excel acumen to develop complex spreadsheets and formulas, create macros, and understand VBA code within modules.

    Required Skills:
    • Experience with MS SQL
    • Proficiency in Python

    Desired Skills:
    • Experience with SharePoint
    • Advanced Excel skills (formulas, VBA, Power Pivot, Pivot Table)
    $84k-117k yearly est. 4d ago
  • Senior Data Analytics Engineer

    Revel IT 4.3 company rating

    Data warehouse developer job in Columbus, OH

    We are seeking a highly skilled Analytics Data Engineer with deep expertise in building scalable data solutions on the AWS platform. The ideal candidate is a 10/10 expert in Python and PySpark, with strong working knowledge of SQL. This engineer will play a critical role in translating business and end-user needs into robust analytics products, spanning ingestion, transformation, curation, and enablement for downstream reporting and visualization. You will work closely with both business stakeholders and IT teams to design, develop, and deploy advanced data pipelines and analytical capabilities that power enterprise decision-making.

    Key Responsibilities

    Data Engineering & Pipeline Development
    • Design, develop, and optimize scalable data ingestion pipelines using Python, PySpark, and AWS native services.
    • Build end-to-end solutions to move large-scale big data from source systems into AWS environments (e.g., S3, Redshift, DynamoDB, RDS).
    • Develop and maintain robust data transformation and curation processes to support analytics, dashboards, and business intelligence tools.
    • Implement best practices for data quality, validation, auditing, and error handling within pipelines.

    Analytics Solution Design
    • Collaborate with business users to understand analytical needs and translate them into technical specifications, data models, and solution architectures.
    • Build curated datasets optimized for reporting, visualization, machine learning, and self-service analytics.
    • Contribute to solution design for analytics products leveraging AWS services such as AWS Glue, Lambda, EMR, Athena, Step Functions, Redshift, Kinesis, Lake Formation, etc.

    Cross-Functional Collaboration
    • Work with IT and business partners to define requirements, architecture, and KPIs for analytical solutions.
    • Participate in daily Scrum meetings, code reviews, and architecture discussions to ensure alignment with enterprise data strategy and coding standards.
    • Provide mentorship and guidance to junior engineers and analysts as needed.

    Engineering (Supporting Skills)
    • Employ strong skills in Python, PySpark, and SQL to support data engineering tasks, broader system integration requirements, and application layer needs.
    • Implement scripts, utilities, and micro-services as needed to support analytics workloads.

    Required Qualifications
    • 5+ years of professional experience in data engineering, analytics engineering, or full-stack data development roles.
    • Expert-level proficiency (10/10) in Python and PySpark.
    • Strong working knowledge of SQL and other programming languages.
    • Demonstrated experience designing and delivering big-data ingestion and transformation solutions on AWS.
    • Hands-on experience with AWS services such as Glue, EMR, Lambda, Redshift, S3, Kinesis, CloudFormation, IAM, etc.
    • Strong understanding of data warehousing, ETL/ELT, distributed computing, and data modeling.
    • Ability to partner effectively with business stakeholders and translate requirements into technical solutions.
    • Strong problem-solving skills and the ability to work independently in a fast-paced environment.

    Preferred Qualifications
    • Experience with BI/visualization tools such as Tableau.
    • Experience building CI/CD pipelines for data products (e.g., Jenkins, GitHub Actions).
    • Familiarity with machine learning workflows or MLOps frameworks.
    • Knowledge of metadata management, data governance, and data lineage tools.
    $88k-120k yearly est. 4d ago
  • Data Engineer

    Agility Partners 4.6company rating

    Data warehouse developer job in Columbus, OH

    We're seeking a skilled Data Engineer based in Columbus, OH, to support a high-impact data initiative. The ideal candidate will have hands-on experience with Python, Databricks, SQL, and version control systems, and be comfortable building and maintaining robust, scalable data solutions.

    Key Responsibilities
    - Design, implement, and optimize data pipelines and workflows within Databricks.
    - Develop and maintain data models and SQL queries for efficient ETL processes.
    - Partner with cross-functional teams to define data requirements and deliver business-ready solutions.
    - Use version control systems to manage code and ensure collaborative development practices.
    - Validate and maintain data quality, accuracy, and integrity through testing and monitoring.

    Required Skills
    - Proficiency in Python for data engineering and automation.
    - Strong, practical experience with Databricks and distributed data processing.
    - Advanced SQL skills for data manipulation and analysis.
    - Experience with Git or similar version control tools.
    - Strong analytical mindset and attention to detail.

    Preferred Qualifications
    - Experience with cloud platforms (AWS, Azure, or GCP).
    - Familiarity with enterprise data lake architectures and best practices.
    - Excellent communication skills and the ability to work independently or in team environments.
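The "SQL queries for efficient ETL processes" this role describes usually include a merge/upsert step that keeps the newest record per key. A small sketch using Python's bundled sqlite3 as a stand-in warehouse (the table and column names are invented; a Databricks equivalent would use `MERGE INTO`):

```python
import sqlite3

# In-memory stand-in for a warehouse dimension table.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE dim_customer ("
    "customer_id INTEGER PRIMARY KEY, email TEXT, updated_at TEXT)"
)

def upsert(rows):
    # ON CONFLICT updates an existing key only when the incoming record is
    # newer; ISO-8601 date strings compare correctly as text.
    con.executemany(
        """INSERT INTO dim_customer (customer_id, email, updated_at)
           VALUES (?, ?, ?)
           ON CONFLICT(customer_id) DO UPDATE SET
             email = excluded.email,
             updated_at = excluded.updated_at
           WHERE excluded.updated_at > dim_customer.updated_at""",
        rows,
    )

upsert([(1, "a@x.com", "2024-01-01"), (2, "b@x.com", "2024-01-01")])
upsert([(1, "a+new@x.com", "2024-02-01")])  # newer record replaces the old one
latest = con.execute(
    "SELECT email FROM dim_customer WHERE customer_id = 1"
).fetchall()
```

The `WHERE` guard on the conflict clause is what makes the merge idempotent: replaying an old batch cannot overwrite fresher data.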
    $95k-127k yearly est. 4d ago
  • REMOTE, SQL Server Support Engineer / SQL Developer / Data Systems Support Engineer, JO 25-14057

    Teambradley, Inc.

    Remote data warehouse developer job

    Are you a goal-oriented professional with an entrepreneurial spirit? This role is designed for someone who loves owning outcomes and growing their technical impact.

    The Short Scoop: Bring data to life for customers who depend on reliable insight. As a key client support person, you'll own the health, accuracy, and performance of complex data pipelines while building trusted relationships with customers. This is not a “ticket-and-pass-off” role. You're the linchpin who diagnoses, solves, and ensures lasting fixes that keep business intelligence flowing.

    Location: Must reside in the United States. Strong preference for candidates located in the greater Chicago, IL area.
    Work Authorization: Applicants must be U.S. citizens or lawful permanent residents (Green Card holders).

    Why This Role Is Worth Your Time
    - Full ownership of analysis, troubleshooting, and long-term solution delivery. No hand-offs, no forgotten or disappearing tasks.
    - Real-world impact supporting organizations that depend on your data accuracy, integrity, and technical judgment.
    - Projects involving SQL, Power BI, data connectors, software upgrades, VMs, VPN configuration, and production environments.
    - Autonomy balanced with collaboration in a company that values initiative, accountability, and continual learning.
    - Career growth is part of the plan. Master this role, and step toward software development, configuration, client development, or support leadership.

    This company knows that the key to the health and well-being of all employees is insurance that takes care of you and your family. Benefits: medical, dental, vision, and a 401k.
    Compensation range: $75,000 to $90,000 annually

    What You'll Be Doing
    - Monitor and maintain customer data integrations to ensure accuracy and timely BI delivery.
    - Diagnose and resolve complex data processing issues using intermediate T-SQL on Microsoft SQL Server.
    - Apply and verify software upgrades, patches, and point releases in production environments.
    - Manage multiple cases with attention to detail, clear prioritization, and full follow-through with proactive communication.
    - Confirm status and resolutions directly with customers before closing cases.
    - Identify and implement process improvements that increase efficiency and reliability.

    About You
    - Able to do the job as described.
    - Degree in Computer Science, Data Science, Mathematics, or a related technical field.
    - 2 to 5 years of professional experience developing in SQL, including building complex queries, stored procedures, functions, and performance tuning.
    - Hands-on experience developing in at least one compiled or low-level language such as C#, C++, Java, or similar.
    - Strong organizational habits and the ability to manage your work independently with consistent follow-through.
    - Confident working across VMs, VPNs, and Windows-based production servers.
    - Energized by mastering complexity and digging into how systems behave; your curiosity and focus drive results and growth.
    - Communication is part of your success plan, with your boss, your teammates, and customers, via email, phone calls, and detailed documentation.

    How To Apply: We'd love to see your resume, but we don't need it to have a conversation. It's as easy as one, two, three:
    1. Send an email directly to me, ***********************************, and tell me why you're interested.
    2. Message me here on LinkedIn.
    3. If you have your resume ready to go, apply now on this site.

    Setting Expectations: We'd love to help every person who applies to this role. Unfortunately, many applicants don't appear capable of doing the job, so we apologize in advance that we will not be able to respond directly to all submissions. Sponsorship is not an option for this role. This client is an Equal Opportunity Employer. This is a REMOTE role!
TBI Id No: JO#25-14057, REMOTE, SQL Server Support Engineer / SQL Developer / Data Systems Support Engineer
    $75k-90k yearly 3d ago
  • Data Engineer

    Alexander Technology Group 4.3company rating

    Remote data warehouse developer job

    This is a fully remote 12+ month contract position. No C2C or 3rd-party candidates will be considered.

    Data Engineer (AI & Automation)

    We are seeking a Data Engineer with hands-on experience using AI-driven tools to support automation, system integrations, and continuous process improvement across internal business systems. This role will focus on building and maintaining scalable data pipelines, enabling intelligent workflows, and improving data accessibility and reliability.

    Key Responsibilities
    - Design, build, and maintain automated data pipelines and integrations across internal systems
    - Leverage AI-enabled tools to streamline workflows and drive process improvements
    - Develop and orchestrate workflows using Apache Airflow and n8n AI
    - Model, transform, and optimize data in Snowflake and Azure SQL Data Warehouse
    - Collaborate with business and technical teams to identify automation opportunities
    - Ensure data quality, reliability, and performance across platforms

    Required Qualifications
    - Experience as a Data Engineer or similar role
    - Hands-on experience with Apache Airflow and modern workflow orchestration tools
    - Strong experience with Snowflake and Azure SQL Data Warehouse
    - Familiarity with AI-driven automation and integration tools (e.g., n8n AI)
    - Strong SQL skills and experience building scalable data pipelines

    Preferred Qualifications
    - Experience integrating multiple internal business systems
    - Background in process improvement or operational automation
    - Experience working in cloud-based data environments (Azure preferred)
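Apache Airflow, named above, expresses a pipeline as a DAG of tasks executed in dependency order. As a toy stdlib illustration of that orchestration idea, without Airflow itself (the task names are invented), `graphlib.TopologicalSorter` gives the same dependency-ordered scheduling:

```python
from graphlib import TopologicalSorter

# Toy extract -> transform -> load graph; task names are illustrative only.
results = []
tasks = {
    "extract": lambda: results.append("extract"),
    "transform": lambda: results.append("transform"),
    "load": lambda: results.append("load"),
}
# Map each task to the set of tasks it depends on (its "upstream" tasks).
deps = {"transform": {"extract"}, "load": {"transform"}}

def run_dag(tasks, deps):
    # static_order() yields tasks only after all their dependencies,
    # which is the core scheduling guarantee an orchestrator provides.
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()

run_dag(tasks, deps)
```

A real Airflow DAG adds scheduling, retries, and observability on top, but the dependency graph is the same mental model.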
    $95k-138k yearly est. 4d ago
  • Data Engineer

    Iqventures

    Data warehouse developer job in Dublin, OH

    The Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently. The ideal candidate is self-directed, thrives in a fast-paced project environment, and is comfortable making technical decisions and architectural recommendations. The ideal candidate has prior experience in modern data platforms, most notably Databricks and the “lakehouse” architecture. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals. Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.

    Responsibilities:
    - Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
    - Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
    - Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
    - Consume and analyze data from the data pipeline to infer, predict, and recommend actionable insights that inform operational and strategic decision-making and produce better results.
    - Empower departments and internal consumers with metrics and business intelligence to operate and direct the business, better serving end customers.
    - Determine technical and behavioral requirements, identify strategies as solutions, and select solutions based on resource constraints.
    - Work with the business, process owners, and IT team members to design solutions for data and advanced analytics.
    - Perform data modeling and prepare data in databases for analysis and reporting through various analytics tools.
    - Play a technical specialist role in championing data as a corporate asset.
    - Provide technical expertise in collaborating with project and other IT teams, internal and external to the company.
    - Contribute to and maintain system data standards.
    - Research and recommend innovative and, where possible, automated approaches to system data administration tasks. Identify approaches that leverage our resources and provide economies of scale.
    - Engineer systems that balance and meet performance, scalability, recoverability (including backup design), maintainability, security, and high-availability requirements and objectives.

    Skills:
    - Databricks and related: SQL, Python, PySpark, Delta Live Tables, data pipelines, AWS S3 object storage, Parquet/columnar file formats, AWS Glue.
    - Systems Analysis: applying systems analysis techniques and procedures, including consulting with users, to determine hardware, software, platform, or system functional specifications.
    - Time Management: managing one's own time and the time of others.
    - Active Listening: giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
    - Critical Thinking: using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems.
    - Active Learning: understanding the implications of new information for both current and future problem-solving and decision-making.
    - Writing: communicating effectively in writing as appropriate for the needs of the audience.
    - Speaking: talking to others to convey information effectively.
    - Instructing: teaching others how to do something.
    - Service Orientation: actively looking for ways to help people.
    - Complex Problem Solving: identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
    - Troubleshooting: determining causes of operating errors and deciding what to do about them.
    - Judgment and Decision Making: considering the relative costs and benefits of potential actions to choose the most appropriate one.

    Experience and Education:
    - High School Diploma (or GED or High School Equivalence Certificate).
    - Associate degree or equivalent training and certification.
    - 5+ years of experience in data engineering, including SQL, data warehousing, and cloud-based data platforms.
    - Databricks experience.
    - 2+ years of project lead or supervisory experience preferred.
    - Must be legally authorized to work in the United States. We are unable to sponsor or take over sponsorship at this time.
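The lakehouse work this posting describes commonly includes deduplicating raw "bronze" records into a curated "silver" table, keeping the latest version of each business key (the job a PySpark `row_number().over(Window.partitionBy(...))` usually does). A stdlib sketch of that logic, with invented field names:

```python
# Keep the newest record per key: the dedup step behind a typical
# bronze -> silver lakehouse transformation. Field names are illustrative.
def latest_per_key(records, key="id", version="updated_at"):
    best = {}
    for rec in records:
        k = rec[key]
        # ISO-8601 date strings compare correctly as plain strings.
        if k not in best or rec[version] > best[k][version]:
            best[k] = rec
    return sorted(best.values(), key=lambda r: r[key])

bronze = [
    {"id": 1, "updated_at": "2024-01-01", "status": "open"},
    {"id": 1, "updated_at": "2024-03-01", "status": "closed"},  # later version
    {"id": 2, "updated_at": "2024-02-01", "status": "open"},
]
silver = latest_per_key(bronze)
```

At Databricks scale the same effect is usually achieved with a window function or a Delta `MERGE`, but the per-key "latest wins" rule is identical.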
    $76k-103k yearly est. 2d ago
  • Lead Dotnet Developer

    Comtech Global, Inc. 4.3company rating

    Data warehouse developer job in Columbus, OH

    Direct client - .NET Developer, Columbus, OH - Onsite 15 minute Teams interviews followed by full on-site interview. This is an onsite role working onsite at the Shipley building at 1970 West Broad Street, Columbus. The selected Consultants will work closely with the designated I.T. supervisor, project managers, business analysts, QA developers and other developers in an agile development environment. The work scope of the consultant will include requirements gathering, application design, application development, application testing, creating presentations, demoing changes, deploying changes and providing technical customer support. The Consultants, in working with ODPS staff, will be responsible for new development and/or rewriting legacy applications to .NET applications utilizing C#, ASP.NET Core MVC, Web API, Windows Forms, SQL Server and other technologies. The new applications will implement an N-tier architecture and comply with client coding standards. During the interview process with the client staff, the resource consultant must demonstrate competence/experience in their specific area(s) of project assignment. The resource's experience must also be documented for review and verification. Offered resources not showing technical or functional competence/experience will be sufficient reason to reject the Offeror's proposal. It is the responsibility of the Offeror to pre-screen their candidates to ensure compliance. Resource will have a background check conducted by client. 
    Skills (Required / Desired, Amount of Experience):
    - Experience as a .NET web developer developing ASP.NET Core MVC applications (Required, 10 years)
    - ASP.NET experience developing with the C-Sharp (C#) language (Required, 10 years)
    - Experience with SQL Server database design and development, including optimization of queries and creating tables, views, stored procedures, and functions (Required, 10 years)
    - Experience developing web applications utilizing the .NET Framework 4.0 or higher (Required, 10 years)
    - Experience developing applications using Entity Framework (EF) 4.0 or later (Required, 10 years)
    - Verifiable Service-Oriented Architecture (SOA) experience developing and securing Windows Communication Foundation (WCF) services (Required, 10 years)
    - Experience or a demonstrable understanding of N-tier environments as they relate to development and deployment (Required, 10 years)
    - Experience or a demonstrable understanding of code repository strategies, code promotion strategies, and recovery using Azure DevOps (TFS) (Required, 10 years)
    - Experience developing a project within the Agile methodology using TFS/DevOps (Required, 5 years)
    - Experience implementing Asynchronous JavaScript and XML (AJAX)-enabled controls within ASP.NET web applications (Required, 5 years)
    - Experience or a demonstrable understanding of developing applications that are mobile compliant (Required, 5 years)
    - Experience developing reports in SQL Server Reporting Services (SSRS) and making them accessible within an MVC web application (Required, 5 years)
    - Experience developing with Visual Studio, including 2 years with Visual Studio 2019 (Required, 5 years)
    - Experience with TFS/Azure DevOps, including Git, boards, and CI/CD pipelines (Required, 5 years)
    - Experience developing RESTful APIs and web services using .NET Core and .NET 5 (Required, 2 years)
    - Experience in Test-Driven Development (TDD), or verifiable experience implementing a testing strategy for applications developed (Required, 2 years)

    The selected Consultants must possess strong:
    1. Communication and leadership skills.
    2. Facilitation skills to foster collaborative and innovative spaces.
    3. Collaboration and negotiation skills to support resources across business and functional lines.
    4. Confidence and influential communication skills to present ideas and solutions.
    5. Oral and written English language skills to articulate collaborative ideas clearly.
    6. Independent coding skills.
    7. A complementary team-player mindset.
    8. Time and resource management and prioritization skills to meet assigned deadlines.
    9. Excellent and proven organizational, analytical, planning, problem-solving, and decision-making skills.
    10. Leadership skills to provide technical guidance and mentoring to technical staff.
    $85k-130k yearly est. 3d ago
  • Data Engineer (Databricks)

    Comresource 3.6company rating

    Data warehouse developer job in Columbus, OH

    ComResource is searching for a highly skilled Data Engineer with a background in SQL and Databricks who can handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition.

    Responsibilities:
    - Design, construct, install, test, and maintain data management systems.
    - Build high-performance algorithms, predictive models, and prototypes.
    - Ensure that all systems meet business/company requirements as well as industry practices.
    - Integrate up-and-coming data management and software engineering technologies into existing data structures.
    - Develop set processes for data mining, data modeling, and data production.
    - Create custom software components and analytics applications.
    - Research new uses for existing data.
    - Employ an array of technological languages and tools to connect systems together.
    - Recommend different ways to constantly improve data reliability and quality.

    Qualifications:
    - 5+ years of data quality engineering
    - Experience with cloud-based systems, preferably Azure
    - Databricks and SQL Server testing
    - Experience with ML tools and LLMs
    - Test automation frameworks
    - Python and SQL for data quality checks
    - Data profiling and anomaly detection
    - Documentation and quality metrics
    - Healthcare data validation experience preferred
    - Test automation and quality process development

    Plus:
    - Azure Databricks
    - Azure Cognitive Services integration
    - Databricks foundational model integration
    - Claude API implementation a plus
    - Python and NLP frameworks (spaCy, Hugging Face, NLTK)
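The "data profiling and anomaly detection" qualification above can start as simply as flagging values that sit unusually far from the mean. A small illustrative sketch using the stdlib statistics module (the sample values and threshold are invented for the example):

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Return the values whose z-score exceeds the threshold: a basic
    profiling check for spotting anomalous records in a numeric column."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant column: nothing can be an outlier
    return [v for v in values if abs(v - mean) / stdev > threshold]

readings = [10.0, 11.0, 9.5, 10.2, 10.8, 95.0]  # 95.0 is an injected anomaly
outliers = zscore_outliers(readings, threshold=2.0)
```

In a real quality pipeline the same check would run per column with thresholds tuned to each metric, and flagged rows would be routed to review rather than silently dropped.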
    $79k-102k yearly est. 5d ago
  • Senior Data Architect

    Intelliswift-An LTTS Company

    Data warehouse developer job in Marysville, OH

    4 days onsite - Marysville, OH

    Skillset:
    - Bachelor's degree in computer science, data science, engineering, or a related field
    - 10 years minimum relevant experience in the design and implementation of data models (Erwin) for enterprise data warehouse initiatives
    - Experience leading projects involving cloud data lakes, data warehousing, data modeling, and data analysis
    - Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS), real-time data distribution (Kinesis, Kafka, Dataflow), and modern data warehouse tools (Redshift, Snowflake, Databricks)
    - Experience with various database platforms, including DB2, MS SQL Server, PostgreSQL, Couchbase, MongoDB, etc.
    - Understanding of entity-relationship modeling, metadata systems, and data security and quality tools and techniques
    - Ability to design traditional/relational and modern big-data architectures based on business needs
    - Experience with business intelligence tools and technologies such as Informatica, Power BI, and Tableau
    - Exceptional communication and presentation skills
    - Strong analytical and problem-solving skills
    - Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
    - Ability to guide solution design and architecture to meet business needs
    $93k-125k yearly est. 5d ago
  • Application Developer

    Manifest Solutions 4.6company rating

    Data warehouse developer job in Newark, OH

    Manifest Solutions is currently seeking an Application Developer for a hybrid position in Newark, OH. MUST be within commuting distance of Newark, OH. NO C2C; NO 3rd parties - applicants ONLY.

    Responsibilities:
    - Decommission and move apps to ServiceNow; re-write apps in .NET.
    - Design, code, test, document, release, and support custom software applications to meet specific technology needs.
    - Gather and analyze functional business/user requirements.
    - Define the technical requirements for the custom software by analyzing systems and processes, including their dependencies and interactions, in the context of the business's technology needs.
    - Prepare Scopes of Work that describe in detail the components that will be developed and the methods that will be used, including written requirements and time estimates; entity-relationship, user-flow, and data-flow diagrams; permissions and roles matrices; and other applicable design artifacts.
    - Create prototypes that enable business users to verify that functionality will meet the specific business technology needs.
    - Develop software solutions in accordance with the business and technical requirements by writing and implementing source code.
    - Test and debug source code.
    - Develop and maintain technical documentation that represents the current-state design and code of custom software applications.
    - Develop and maintain user guides that describe the features and functionality of custom software applications.
    - Periodically evaluate existing custom software applications to assess code quality and functional integrity over time.
    - Update existing custom software applications as necessary to fix errors, adapt to new hardware, improve performance, refactor code, enhance interfaces, and/or implement new features/functionality.

    Qualifications
    - 2-4 years of applied, specialized experience with the following: Angular 4+, .NET, C# 7+, SQL, VS Code/Visual Studio, GitLab, Postman, API design and development
    - Formal education in software development, web development, or a similar focus, or equivalent professional experience
    - Familiar with and able to adhere to a formal SDLC program
    - Understands and can implement secure coding practices (e.g., OWASP Top 10)
    - Experience doing containerized development
    - Cloud development experience (Azure, AWS)
    - Familiarity with the following types of testing: user testing, integration testing, isolation testing, load testing, end-to-end testing, vulnerability testing
    $74k-98k yearly est. 1d ago

Learn more about data warehouse developer jobs

Browse computer and mathematical jobs