
Data engineer jobs in Cincinnati, OH - 498 jobs

  • Program Integrity Data Scientist II

CareSource (4.9 company rating)

    Data engineer job in Dayton, OH

The Program Integrity Data Scientist II is responsible for developing, implementing, managing, and deploying in-depth analyses that meet the information needs associated with payment accuracy, anomaly detection, and Fraud, Waste, and Abuse (FWA). Essential Functions: Build concepts as algorithms that identify claims for pre- or post-pay intervention based on probability of fraud, waste, and abuse. Algorithms are implemented into production workflows for action: medical record request and audit, downcode adjustment, denial and remittance communication, etc. Analyze and quantify claim payment issues and provide recommendations to mitigate identified program integrity risks. Identify trends and patterns using standard corporate processes, tools, reports, and databases, as well as leveraging other processes and data sources. Conduct outcome analyses to determine impact and effectiveness of corporate program and payment integrity initiatives. Collaborate on the examination and explanation of complex data relationships to answer questions identified either within the department or by other departments as they relate to payment accuracy, anomaly detection, and FWA. Monitor and explain anomalies related to trends associated with the potential for Fraud, Waste, and Abuse across the corporate enterprise. Collaborate with the Legal Department, generating data and analyses to support legal proceedings. Develop hypothesis tests and extrapolations on statistically valid samples to establish outlier behavior patterns and potential recoupment. Create, maintain, and communicate an analytical plan for each project. Mine and analyze large structured and unstructured datasets. Employ a wide range of data sources to develop algorithms for predicting risk and understanding drivers, detecting outliers, etc. Develop visualizations that demonstrate the efficacy of developed algorithms.
Provide statistical validation and analysis of outcomes associated with clinical programs and interventions. Collaborate with other teams to integrate with existing solutions. Communicate results and ideas to key stakeholders. Prepare code for operationalization of the end-to-end model pipeline and deliverables for business consumption. Perform any other job-related duties as requested. Education and Experience: Bachelor's degree in Data Science, Mathematics, Statistics, Engineering, Computer Science, or a related field required. Equivalent years of relevant work experience may be accepted in lieu of required education. Three (3) years of data analysis and/or analytic programming required. Experience with cloud services (such as Azure, AWS, or GCP) and the modern data stack (such as Databricks or Snowflake) preferred. Healthcare experience required. Competencies, Knowledge and Skills: Proficient in SQL and at least one of the following programming languages: Python, R, or RAT-STATS. Familiarity with SAS is preferred. Beginner-level knowledge of developing reports or dashboards in Power BI or other business intelligence applications preferred. Ability to perform advanced statistical analyses and techniques including t-tests, ANOVAs, z-tests, statistical extrapolations, non-parametric significance testing, and sampling methodologies. Working knowledge of predictive modeling and machine learning algorithms such as generalized linear models, non-linear supervised learning models, clustering, decision trees, dimensionality reduction, and natural language processing. Proficient in feature engineering techniques and exploratory data analysis. Familiarity with optimization techniques and artificial intelligence methods. Ability to analyze large quantities of information and identify patterns, irregularities, and deficiencies. Knowledge of healthcare coding and billing processes, including CPT4, HCPCS, ICD-9, DRG, and Revenue Codes preferred. Proficient with MS Office (Excel, PowerPoint, Word,
Access). Demonstrated critical thinking, verbal communication, presentation, and written communication skills. Ability to work independently and within a cross-functional team environment. Licensure and Certification: Working Conditions: General office environment; may be required to sit or stand for extended periods of time. Up to 15% (occasional) travel to attend meetings, trainings, and conferences may be required. Compensation Range: $83,000.00 - $132,800.00. CareSource takes into consideration a combination of a candidate's education, training, and experience, as well as the position's scope and complexity, the discretion and latitude required for the role, and other external and internal data when establishing a salary level. In addition to base compensation, you may qualify for a bonus tied to company and individual performance. We are highly invested in every employee's total well-being and offer a substantial and comprehensive total rewards package. Compensation Type: Salary Competencies: - Fostering a Collaborative Workplace Culture - Cultivate Partnerships - Develop Self and Others - Drive Execution - Influence Others - Pursue Personal Excellence - Understand the Business This list is not all-inclusive. CareSource reserves the right to amend this job description at any time. CareSource is an Equal Opportunity Employer. We are dedicated to fostering an environment of belonging that welcomes and supports individuals of all backgrounds. #LI-GB1
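The statistical extrapolation work this role describes (estimating total recoupment from a statistically valid claim sample) can be sketched roughly as follows. This is a hypothetical illustration with made-up numbers and function names, not CareSource's actual methodology:

```python
import math
import statistics

def extrapolate_overpayment(sample_overpayments, population_size, z=1.645):
    """Mean-per-unit extrapolation from a simple random sample of claims.

    Returns the point estimate of total overpayment in the claim
    population and a one-sided lower confidence bound (often the
    conservative figure used when demanding recoupment).
    """
    n = len(sample_overpayments)
    mean = statistics.mean(sample_overpayments)
    sd = statistics.stdev(sample_overpayments)
    point_estimate = mean * population_size
    # Standard error of the estimated total, with finite population correction.
    fpc = math.sqrt((population_size - n) / (population_size - 1))
    se_total = population_size * (sd / math.sqrt(n)) * fpc
    lower_bound = point_estimate - z * se_total
    return point_estimate, lower_bound

# Hypothetical audit: 100 sampled claims out of 10,000, each with a
# measured overpayment amount (many claims have none).
sample = [12.50, 0.0, 8.75, 0.0, 45.00] * 20
point, lower = extrapolate_overpayment(sample, population_size=10_000)
```

In practice this kind of sampling and extrapolation is typically done with a dedicated tool such as RAT-STATS rather than hand-rolled code; the sketch only shows the shape of the calculation.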
    $83k-132.8k yearly 5d ago

  • Data Scientist

Procter & Gamble (4.8 company rating)

    Data engineer job in Cincinnati, OH

    Do you enjoy solving billion-dollar data science problems across trillions of data points? Are you passionate about working at the cutting edge of interdisciplinary boundaries, where computer science meets hard science? If you like turning untidy data into nonobvious insights and surprising business leaders with the transformative power of Artificial Intelligence (AI), including Generative and Agentic AI, we want you on our team at P&G. As a Data Scientist in our organization, you will play a crucial role in disrupting current business practices by designing and implementing innovative models that enhance our processes. You will be expected to constructively research, design, and customize algorithms tailored to various problems and data types. Utilizing your expertise in Operations Research (including optimization and simulation) and machine learning models (such as tree models, deep learning, and reinforcement learning), you will directly contribute to the development of scalable Data Science algorithms. Your work will also integrate advanced techniques from Generative and Agentic AI to create more dynamic and responsive models, enhancing our analytical capabilities. You will collaborate with Data and AI Engineering teams to productionize these solutions, applying exploratory data analysis, feature engineering, and model building within cloud environments on massive datasets to deliver accurate and impactful insights. Additionally, you will mentor others as a technical coach and become a recognized expert in one or more Data Science techniques, quantifying the improvements in business outcomes resulting from your work. Key Responsibilities: + Algorithm Design & Development: Directly contribute to the design and development of scalable Data Science algorithms. + Collaboration: Work closely with Data and Software Engineering teams to effectively productionize algorithms. 
+ Data Analysis: Apply thorough technical knowledge to large datasets, conducting exploratory data analysis, feature engineering, and model building. + Coaching & Mentorship: Develop others as a technical coach, sharing your expertise and insights. + Expertise Development: Become a known expert in one or multiple Data Science techniques and methodologies. Job Qualifications Required Qualifications: + Education: Currently pursuing or holding a Master's degree in a quantitative field (Operations Research, Computer Science, Engineering, Applied Mathematics, Statistics, Physics, Analytics, etc.), or equivalent work experience. + Technical Skills: Proficient in programming languages such as Python and familiar with data science/machine learning libraries like OpenCV, scikit-learn, PyTorch, TensorFlow/Keras, and Pandas. Demonstrated ability to develop and test code within cloud environments. + Communication: Strong written and verbal communication skills, with the ability to influence others to take action. Preferred Qualifications: + Analytic Methodologies: Experience applying analytic methodologies such as Machine Learning, Optimization, Simulation, and Generative and Agentic AI to real-world problems. + Continuous Learning: A commitment to lifelong learning, keeping up to date with the latest technology trends, and a willingness to teach others while learning new techniques. + Data Handling & Cloud: Experience with large datasets and developing in cloud computing platforms such as GCP or Azure. + DevOps Familiarity: Familiarity with DevOps environments, including tools like Git and CI/CD practices. Immigration sponsorship is not available for this role. For more information regarding who is eligible for hire at P&G, along with other work authorization FAQs, please click HERE (******************************************************* . Procter & Gamble participates in E-Verify as required by law.
Qualified individuals will not be disadvantaged based on being unemployed. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Job Schedule Full time Job Number R000135859 Job Segmentation Entry Level Starting Pay / Salary Range $85,000.00 - $115,000.00 / year
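For illustration, the Operations Research side of this role (optimization plus simulation) might look like the classic newsvendor problem: simulate uncertain demand, then grid-search the order quantity that maximizes expected profit. Everything below (numbers, function names) is invented for the sketch and is not P&G's code:

```python
import random

def simulate_profit(order_qty, demand_mean, price, cost, n_trials=10_000, seed=0):
    """Monte Carlo estimate of expected profit for one order quantity.

    Demand is drawn from a normal distribution (std = 25% of the mean);
    unsold units are a sunk cost, unmet demand is lost revenue.
    """
    rng = random.Random(seed)  # fixed seed: common random numbers across candidates
    total = 0.0
    for _ in range(n_trials):
        demand = max(rng.gauss(demand_mean, demand_mean * 0.25), 0.0)
        sold = min(order_qty, demand)
        total += sold * price - order_qty * cost
    return total / n_trials

def best_order_qty(candidates, **kwargs):
    """Simulation-based optimization: grid search over order quantities."""
    return max(candidates, key=lambda q: simulate_profit(q, **kwargs))

best = best_order_qty(range(50, 151, 10), demand_mean=100, price=10.0, cost=6.0)
```

Reusing the same seed for every candidate is a deliberate variance-reduction choice (common random numbers), so the grid comparison is not dominated by sampling noise.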
    $85k-115k yearly 60d+ ago
  • ETL Architect

    Scadea Solutions

    Data engineer job in Cincinnati, OH

Job title: ETL Architect DURATION 18 months YEARS OF EXPERIENCE 7-10 INTERVIEW TYPE Phone Screen to Hire REQUIRED SKILLS • Experience with DataStage and ETL design • Technical requirement gathering, converting business requirements to technical specs to profile • Hands-on work on a minimum of 2 projects with DataStage • Understand the process of developing an ETL design that supports multiple DataStage developers • Be able to create an ETL design framework and related specifications for use by ETL developers • Define standards and best practices for DataStage ETL to be followed by all DataStage developers • Understanding of Data Warehouse and Data Mart concepts and implementation experience • Be able to review code produced to ensure conformance with the developed ETL framework and design for reuse • Preferably experienced user-level competency in IBM's metadata products, DataStage and the InfoSphere product line • Be able to design ETL for Oracle, SQL Server, or any database • Good analytical skills and process design • Ensuring compliance with quality standards and delivery timelines. Qualifications Bachelors Additional Information Required Skills: Job Description: Performs highly complex application programming/systems development and support Performs highly complex configuration of business rules and technical parameters of software products Reviews business requirements and develops application design documentation Builds technical components (Maximo objects, TRM rules, Java extensions, etc.) based on detailed design. Performs unit testing of components along with completing necessary documentation. Supports product test, user acceptance test, etc., as a member of the fix-it team. Employs consistent measurement techniques Includes testing in project plans and establishes controls to require adherence to test plans Manages the interrelationships among various projects or work objectives
    $86k-113k yearly est. 2d ago
  • AI Data Scientist

Medpace (4.5 company rating)

    Data engineer job in Cincinnati, OH

We are currently seeking an experienced data scientist to join our AI team; this person will support and lead data flow, advanced analytical needs, and AI tools across Medpace. The AI team utilizes analytical principles and techniques to identify, collate, and analyze many data sources, and works with teams across Medpace to support efficiency and business gains for pharmaceutical development. The AI Data Scientist will support various projects across the company to bring data sources together in a consistent manner, work with the business to identify the value of AI, identify appropriate solutions, and work with IT to ensure they are developed and built into the relevant systems. The team is seeking an experienced candidate to contribute new skills to our team, support team growth, and foster AI development. The AI Team is a highly collaborative team with members in both the Cincinnati and London offices. This team supports many teams across the business, including clinical operations, medical, labs, business development, and business operations. The AI Team also works side-by-side with data engineering, business analytics, and software engineering to architect innovative data storage and access solutions for optimal data utilization strategies. If you are an individual with experience in informatics, data science, or computer science, please review the following career opportunity.
Responsibilities * Explore and work with different data sources to collate into knowledge; * Work with different business teams across the company with a variety of different business needs to identify potential areas that AI can support; * Manage the process of working through AI potentials from discovery research to PoC to production with the business teams and supporting tasks for IT developers; * Try out different AI tools to substantiate the potential of their use with the business team; * Translate results into compelling visualizations which illustrate the overall benefits of the use of AI, and identify with the business team the overall value of its use; * Develop and map database architecture of methodological and clinical data systems; * Convert business tasks into meaningful developer Jira tasks for sprints; * Support departmental process improvement initiatives that can include AI; and * Participate in training and development of more junior team members. Qualifications * Master's degree or higher in informatics, computer science/engineering, health information, statistics, or a related field required; * 2 or more years of experience as a Data Scientist or in a closely related role; * Experience applying machine learning to pharmaceutical or clinical data (or translatable artificial intelligence [AI] techniques from other industries); * Advanced computer programming skills (preferred language: Python); * Analytical thinker with great attention to detail; * Ability to prioritize multiple projects and tasks within tight timelines; and * Excellent written and verbal communication skills. Medpace Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach.
We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Medpace is headquartered in Cincinnati, Ohio, and employs more than 5,000 people across 40+ countries. Why Medpace? People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future. Cincinnati Perks * Cincinnati Campus Overview * Flexible work environment * Competitive PTO packages, starting at 20+ days * Competitive compensation and benefits package * Company-sponsored employee appreciation events * Employee health and wellness initiatives * Community involvement with local nonprofit organizations * Discounts on local sports games, fitness gyms and attractions * Modern, ecofriendly campus with an on-site fitness center * Structured career paths with opportunities for professional growth * Discounted tuition for UC online programs Awards * Named a Top Workplace in 2024 by The Cincinnati Enquirer * Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 * Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
    $69k-98k yearly est. Auto-Apply 9d ago
  • Data Scientist - Clinical and Operational Analytics

    Venesco LLC

    Data engineer job in Dayton, OH

    Requirements Mandatory Qualifications: • Bachelor's degree in a quantitative field (e.g., Computer Science, Applied Math). • 3+ years of experience in predictive analytics. • Proficiency in Python, NumPy, Pandas, Matplotlib, and Scikit-learn. • Ability to explain and implement ML algorithms from scratch. • Signed NDA and HIPAA training required upon start. Desired Qualifications: • Experience with dashboard development and pretrained language models. • Experience with dimensionality reduction and deep learning libraries (TensorFlow, PyTorch). • Familiarity with human biology and performance. Key Tasks and Responsibilities: • Develop and tune unsupervised tree-based clustering models. • Implement decision trees, k-NN, and optimized list sorting algorithms. • Generate and minimize distance matrices using vectorized code. • Collaborate with software engineers and maintain HIPAA compliance.
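The "generate and minimize distance matrices using vectorized code" task above can be illustrated with NumPy. This is a generic sketch (the nearest-neighbor line at the end is one building block of a from-scratch k-NN), not Venesco's actual code:

```python
import numpy as np

def distance_matrix(X):
    """Pairwise Euclidean distances without Python loops, using the
    vectorized identity ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b."""
    sq = np.sum(X * X, axis=1)                       # squared norms, shape (n,)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)  # broadcast to (n, n)
    np.maximum(d2, 0.0, out=d2)  # clip tiny negatives from floating-point rounding
    return np.sqrt(d2)

X = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])
D = distance_matrix(X)          # D[0, 1] == 5.0, D[0, 2] == 10.0

# "Minimizing" the matrix row-wise: index of each point's nearest neighbor
# (column 0 of the argsort is the point itself, at distance zero).
nearest = np.argsort(D, axis=1)[:, 1]
```

The broadcasting trick computes all n^2 distances in a few array operations, which is the usual way to keep this workload out of interpreted Python loops.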
    $69k-95k yearly est. 60d+ ago
  • Data Scientist

    Core4Ce Careers

    Data engineer job in Dayton, OH

We are seeking a highly skilled Data Scientist / Machine Learning Engineer to develop advanced analytics and machine learning solutions that drive meaningful insights for our customers. In this role, you will design and test algorithms, build data-driven experiments, and collaborate closely with SMEs and developers to transform data into actionable intelligence. This position is ideal for someone who excels at both innovative research and practical implementation. Key Responsibilities: Algorithm Development: Develop machine learning, data mining, statistical, and graph-based algorithms to analyze complex data sets and uncover meaningful patterns. Model Evaluation: Test, validate, and down-select algorithms to determine the best-performing models for customer requirements. Experimental Design & Data Generation: Design experiments and create synthetic or simulated data when training/example data sets are limited or unavailable. Data Visualization & Reporting: Produce clear reports, dashboards, and visualizations that communicate data insights to customers and stakeholders in an intuitive manner. Automation & SME Collaboration: Work with subject matter experts to convert manual analytic workflows into efficient, automated analytics solutions. Cross-Functional Development: Collaborate with software developers to ensure algorithms are properly implemented, optimized, and integrated into production systems. *This position is designed to be flexible, with responsibilities evolving to meet business needs and enable individual growth. Required Qualifications: Active TS-SCI security clearance with the ability to obtain a CI poly. OPIR experience. Modeling and simulation experience. Experience designing, training, and validating machine learning models and statistical algorithms. Proficiency with Python, R, or similar languages used for analytics and model development. Hands-on experience with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn).
Strong understanding of experimental design and data generation strategies. Ability to communicate complex analytic concepts to both technical and non-technical audiences. Demonstrated ability to work collaboratively across multidisciplinary teams. Degree in Mathematics/Statistics, Computer Science, or a relevant domain field. MA/MS degree with 13+ years of relevant experience, OR BA/BS degree with 15+ years of relevant experience in a discipline aligned with the position's responsibilities. Why Work for Us? Core4ce is a team of innovators, self-starters, and critical thinkers, driven by a shared mission to strengthen national security and advance warfighting outcomes. We offer: 401(k) with 100% company match on the first 6% deferred, with immediate vesting Comprehensive medical, dental, and vision coverage, with the employee portion paid 100% by Core4ce Unlimited access to training and certifications, with no pre-set cap on eligible professional development Tuition assistance for job-related degrees and courses Paid parental leave, PTO that grows with tenure, and generous holiday schedules Got a big idea? At Core4ce, The Forge gives every employee the chance to propose bold innovations and help bring them to life with internal backing. Join us to build a career that matters, supported by a company that invests in you. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy), national origin, disability, veteran status, age, genetic information, or other legally protected status.
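The synthetic-data responsibility this posting describes (creating simulated training data when real examples are limited) might look like the minimal sketch below. The two Gaussian clusters, their centers, and the function name are all arbitrary choices for illustration:

```python
import random

def make_synthetic_dataset(n, seed=42):
    """Generate a labeled two-class dataset by sampling from two
    Gaussian clusters: a stand-in when real training data is scarce.

    Returns a list of (x, y, label) tuples; the seed makes the
    "experiment" reproducible, which matters for model comparisons.
    """
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        label = rng.random() < 0.5           # roughly balanced classes
        cx, cy = (2.0, 2.0) if label else (-2.0, -2.0)
        data.append((rng.gauss(cx, 1.0), rng.gauss(cy, 1.0), int(label)))
    return data

dataset = make_synthetic_dataset(1000)
```

A model trained on data like this has a known ground truth, so it can be used to validate an algorithm end to end before any scarce real data is touched.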
    $69k-95k yearly est. 7d ago
  • Data Engineer

Total Quality Logistics, Inc. (4.0 company rating)

    Data engineer job in Cincinnati, OH

Country USA State Ohio City Cincinnati Descriptions & requirements About the role: As a Data Engineer with TQL, you will support the FP&A department by developing scalable reporting solutions in Microsoft Fabric. This role will focus on migrating data from on-premises systems to the cloud, building and optimizing SQL views and pipelines, and creating governed Power BI datasets and semantic models. What's in it for you: * $85,000-$125,000 base salary + performance bonuses * Advancement opportunities with aggressive and structured career paths * A culture of continuous education and technical training with reimbursements available * Comprehensive benefits package * Health, dental and vision coverage * 401(k) with company match * Perks including employee discounts, financial wellness planning, tuition reimbursement and more What you'll be doing: * Migrate FP&A datasets from on-premises to Microsoft Fabric/Lakehouse * Build and maintain SQL pipelines, transformations, and views that support reporting needs * Ensure performance, scalability, and reliability through automation, monitoring, and CI/CD best practices * Design, publish, and manage Power BI certified datasets, semantic models, and reports/dashboards * Apply best practices in DAX, modeling, and governance to enable accurate, self-service reporting * Partner with Finance stakeholders to translate reporting requirements into technical deliverables * Implement processes to ensure accuracy, consistency, and reconciliation across financial and operational systems * Maintain documentation of data models, business logic, and reporting standards * Troubleshoot and resolve issues impacting reporting accuracy or performance * Collaborate with Data Governance and Quality teams to align with enterprise standards and metadata frameworks What you need: * Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field * 3+ years of experience in BI/data engineering or analytics
engineering * Advanced SQL skills with proven experience in building and optimizing large-scale datasets * Strong Power BI expertise (datasets, DAX, performance tuning, semantic models) * Hands-on experience with Microsoft Fabric and Lakehouse/cloud data platforms preferred * Knowledge of financial reporting concepts and ability to work with FP&A stakeholders * Strong problem-solving skills and ability to bridge Finance and IT needs Where you'll be: 4289 Ivy Pointe Boulevard, Cincinnati, Ohio 45245 Employment visa sponsorship is unavailable for this position. Applicants requiring employment visa sponsorship now or in the future (e.g., F-1 STEM OPT, H-1B, TN, J1, etc.) will not be considered. About Us Total Quality Logistics (TQL) is one of the largest freight brokerage firms in the nation. TQL connects customers with truckload freight that needs to be moved with quality carriers who have the capacity to move it. As a company that operates 24/7/365, TQL manages work-life balance with sales support teams that assist with accounting, after-hours calls, and specific needs. At TQL, the opportunities are endless, which means that there is room for career advancement and the ability to write your own paycheck. What's your worth? Our open and transparent communication from management creates a successful work environment and custom career path for our employees. TQL is an industry leader in the logistics industry with unlimited potential. Be a part of something big. Total Quality Logistics is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, genetic information, disability or protected veteran status. If you are unable to apply online due to a disability, contact recruiting at ******************
    $85k-125k yearly 36d ago
  • Data Engineer

Tata Consultancy Services (4.3 company rating)

    Data engineer job in Blue Ash, OH

* Proven experience as a Software Developer, with a strong focus on building scalable and efficient Python applications. * Experience in developing Spark Structured Streaming applications is highly desirable. * Minimum of 7+ years of professional software development experience. * Strong analytical and problem-solving skills, with the ability to debug and optimize Spark jobs running on Databricks. * Ability to work closely with cross-functional teams to deliver high-quality streaming solutions. Technical Skills: * Strong expertise in Python, PySpark, and Spark Structured Streaming. * Experience with Databricks and Azure. * Familiarity with Delta Lake and Terraform scripting. * Proficiency in working with varied data file formats (Avro, JSON, CSV) for ingestion and transformation. Software Development: * Proficiency in Object-Oriented Programming (OOP) concepts and software design principles. * Ability to write clean, maintainable, and scalable Python code. GitHub Actions: * Experience in setting up and managing CI/CD pipelines using GitHub Actions to ensure smooth and automated deployment processes. Agile Methodology: * Experience working in an Agile/Scrum environment, with a focus on iterative development, continuous feedback, and delivery. Nice to Haves: * Python Unit Testing. * Unity Catalog. * Databricks Asset Bundles. * Unit Testing/Mocking TCS Employee Benefits Summary: * Discretionary Annual Incentive. * Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. * Family Support: Maternal & Parental Leaves. * Insurance Options: Auto & Home Insurance, Identity Theft Protection. * Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. * Time Off: Vacation, Time Off, Sick Leave & Holidays. * Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing. #LI-RJ2 Salary Range-$100,000-$120,000 a year
    $100k-120k yearly 6d ago
  • Data Analysis Engineer

    Absolics Inc.

    Data engineer job in Covington, KY

DUTIES/RESPONSIBILITIES • Execute Quality Data Analysis • Administer Root Cause Analysis • Data Collection and Management • Data Analysis and Interpretation REQUIREMENTS • Excellent analytical and problem-solving skills • Solid understanding of quality management principles, root cause analysis, and corrective action processes QUALIFICATIONS • Must be legally permitted to work in the United States • Proficiency in using quality management software and tools, as well as Microsoft Office applications • Problem-solving mindset and the ability to work well under pressure to meet deadlines • Strong analytical skills and attention to detail, with the ability to interpret data and trends to drive informed decisions EDUCATION • Bachelor's degree related to Quality or a similar industry EXPERIENCE • 5+ years of Quality experience • Experience with Machine Learning • Experience with Big Data Analysis & Automation • Experience with a Yield Management System from SK Hynix/Samsung Semiconductor
    $71k-96k yearly est. 60d+ ago
  • Data Engineer

    Insight Global

    Data engineer job in Cincinnati, OH

A large financial organization is seeking a Data Engineer who will sit onsite in Cincinnati, Ohio, for a contract role. This individual will join a project supporting a recent merger, assisting with the migration and analysis of the data. The Snowflake and ETL environment is already set up, and you will be responsible for analyzing the data coming from the acquired company into this environment. To do so, we are looking for someone with strong SQL and Power BI experience who can pull this data from Snowflake, and possibly DB2, to analyze. We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: **************************************************** Skills and Requirements 5+ years of experience with data analysis and engineering 5+ years of SQL experience 2+ years of Snowflake experience 2+ years of Power BI experience 2+ years of DB2 experience Finance/banking industry knowledge Merger/migration experience
    $75k-101k yearly est. 5d ago
  • Data Engineer I

    Stratacuity

    Data engineer job in Cincinnati, OH

    Squad: MLOps Data Enablement The MLOps Data Enablement squad supports model development by collaborating with data scientists to deliver high‑quality, production‑ready data. The team is responsible for preparing and refining datasets, integrating new data sources, and engineering features using modern tools such as dbt and Snowflake. This work enhances the modeling pipeline and enables efficient experimentation and deployment across the organization. Required Technical Skills * Strong proficiency in SQL * Foundational understanding of relational database concepts * Experience with data cleaning and data transformation Preferred Technical Skills * Snowflake * dbt * Python * Basic understanding of data science workflows Soft Skills * Strong problem‑solving abilities * Effective communication * Self‑motivated and proactive * Curiosity and willingness to learn * Positive and collaborative mindset Work Location Onsite four days per week (Monday-Thursday) at Fifth Third Center, Downtown Cincinnati, OH. Job Summary This role is responsible for designing and building scalable data management systems, ensuring data platforms meet organizational requirements, and identifying opportunities to enhance data acquisition and utilization. The position requires familiarity with industry practices related to data mining, algorithms, and data usage. 
Primary Responsibilities * Design, build, install, test, and maintain data management systems * Develop high‑performance algorithms, predictive models, and prototypes * Ensure systems align with business requirements and industry standards * Integrate emerging data management and software engineering technologies into existing environments * Establish processes for data mining, data modeling, and data production * Develop custom software components and analytics applications * Identify new opportunities for leveraging existing data * Utilize a variety of programming languages and tools to integrate systems * Collaborate with cross‑functional partners such as data architects, IT teams, and data scientists * Implement and maintain disaster recovery procedures * Recommend improvements to enhance data reliability and quality Qualifications * Technical degree or equivalent professional experience * Experience with relational and non‑relational databases (e.g., SQL, MySQL, NoSQL, Hadoop, MongoDB) * Experience programming or architecting backend systems (e.g., Java, J2EE) EEO Employer Apex Systems is an equal opportunity employer. We do not discriminate or allow discrimination on the basis of race, color, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related medical conditions), age, sexual orientation, gender identity, national origin, ancestry, citizenship, genetic information, registered domestic partner status, marital status, disability, status as a crime victim, protected veteran status, political affiliation, union membership, or any other characteristic protected by law. Apex will consider qualified applicants with criminal histories in a manner consistent with the requirements of applicable law. 
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation in using our website for a search or application, please contact our Employee Services Department at [email protected] or ************. Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. Our commitment to excellence is reflected in many awards, including ClearlyRated's Best of Staffing in Talent Satisfaction in the United States and Great Place to Work in the United Kingdom and Mexico. Apex uses a virtual recruiter as part of the application process. Apex Benefits Overview: Apex offers a range of supplemental benefits, including medical, dental, vision, life, disability, and other insurance plans that offer an optional layer of financial protection. We offer an ESPP (employee stock purchase program) and a 401K program which allows you to contribute typically within 30 days of starting, with a company match after 12 months of tenure. Apex also offers a HSA (Health Savings Account on the HDHP plan), a SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions, a corporate discount savings program and other discounts. In terms of professional development, Apex hosts an on-demand training program, provides access to certification prep and a library of technical and leadership courses/books/seminars once you have 6+ months of tenure, and certification discounts and other perks to associations that include CompTIA and IIBA. Apex has a dedicated customer service team for our Consultants that can address questions around benefits and other resources, as well as a certified Career Coach.
You can access a full list of our benefits, programs, support teams and resources within our 'Welcome Packet' as well, which an Apex team member can provide. Employee Type: Contract Location: Cincinnati, OH, US Date Posted: January 13, 2026 Pay Range: $32 - $37 per hour Similar Jobs * Data Center Technician I * Data Engineer
    $32-37 hourly 3d ago
  • Cloud Data Engineer

    Radiancetech

    Data engineer job in Beavercreek, OH

    Radiance Technologies, a rapidly growing employee-owned company supporting the Department of Defense, is searching for a Cloud Data Engineer to join our team. We are looking for a self-starter with excellent people skills to work with our customers. Employee ownership, generous 401K and profit sharing, competitive salaries, pleasant work environments, and stimulating assignments combine to make Radiance Technologies a great place to work and succeed. We are seeking a Cloud Data Engineer to design, implement, and manage seamless data and object storage solutions across on-premises, cloud, inter-region, and inter-cloud platforms. The Cloud Data Engineer will also be responsible for designing, implementing, and managing databases in our cloud environment, ensuring high levels of data availability and assisting in the development of data models. This role focuses on enabling secure, scalable, and resilient architectures that support structured and unstructured data, high-velocity pipelines, and multi-cloud ecosystems. The ideal candidate will have expertise in cloud-native data engineering, object storage platforms, and data transformation pipelines, with a strong understanding of managing data and storage across hybrid environments. This role requires close collaboration with platform, software, and analytics teams to ensure data and storage solutions align with mission-critical needs. Candidates should have an in-depth understanding of database structure principles and experience with cloud platforms such as AWS, Google Cloud, or Microsoft Azure. Database experience should include relational (Oracle, PostgreSQL, etc.), graph databases (Neo4j, ArangoDB, etc.), search and analytics engines like ElasticSearch, and other NoSQL databases (MongoDB, Cassandra, etc.).
Key Responsibilities: Seamless Data and Object Storage Management: Design, implement, and manage databases and object storage solutions across on-premises, cloud, inter-region, and inter-cloud platforms. Enable seamless data and object storage movement and synchronization between environments, ensuring high availability and minimal latency. Engineer and optimize object storage platforms (e.g., AWS S3, MinIO, Ceph) for durability, performance, lifecycle management, and secure access. Data and Storage Architecture Optimization: Build and maintain scalable data pipelines for structured and unstructured data, supporting ingestion, curation, metadata enrichment, and analytics workflows. Modernize storage architectures, including migration from legacy systems (e.g., NFS/file shares) to object-based architectures. Implement data tiering, record/object-level authorization, and secure access controls for both databases and object storage. Collaboration and Integration: Work closely with development teams to optimize database and object storage usage, integrating data services with containerized platforms (e.g., Kubernetes, OpenShift) and CI/CD pipelines. Partner with platform engineers, DevSecOps teams, and mission users to align data and storage solutions with operational needs. Governance and Security: Implement and enforce data governance policies, tagging, metadata schemas, and access controls for both databases and object storage. Develop and manage backup and restore procedures for databases and object storage, ensuring disaster recovery readiness. Monitor system performance and resolve database and object storage performance and capacity issues. Documentation and Troubleshooting: Document architectures, data flows, object storage configurations, and operational procedures to reduce tribal knowledge. Troubleshoot complex data, object storage, performance, and access issues across environments (DEV/INT/PROD). 
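The data-tiering and lifecycle-management work described above often comes down to a lifecycle policy on the object store. A minimal sketch that builds an S3-style lifecycle configuration as a plain dict; no AWS call is made, and the prefix and day thresholds are invented:

```python
# Hypothetical tiering policy: objects under a prefix move to cheaper
# storage classes as they age, then expire. Built locally as a dict.
def tiering_policy(prefix, ia_days=30, glacier_days=90, expire_days=365):
    return {
        "Rules": [{
            "ID": f"tier-{prefix}",
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [
                {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                {"Days": glacier_days, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": expire_days},
        }]
    }

policy = tiering_policy("raw/ingest/")
```

A dict of roughly this shape is what boto3's `put_bucket_lifecycle_configuration` accepts; here it is only constructed and inspected locally.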
Required Qualifications: Bachelor's degree in Computer Science, Engineering, Data Science, or a related STEM field (Master's preferred). 5+ years of relevant work experience in data engineering, database administration, or object storage management. U.S. Citizenship with the ability to obtain and maintain a DoD TS/SCI Security Clearance. Proven experience managing databases and object storage solutions across on-premises, cloud, inter-region, and inter-cloud platforms. Expertise in cloud platforms such as AWS, Google Cloud, or Microsoft Azure. Hands-on experience with object storage technologies (e.g., AWS S3, MinIO, Ceph) and data transformation pipelines (e.g., Apache Airflow, Argo Workflows, Python). Strong proficiency in Python, SQL, and modern data engineering frameworks. Familiarity with both relational (e.g., Oracle, PostgreSQL) and non-relational databases (e.g., MongoDB, Cassandra). Experience with data tiering, record/object-level authorization, and secure data handling. Ability to handle multiple projects and deadlines in a fast-paced environment. Desired Qualifications: Active TS/SCI clearance. Security+CE certification. Experience supporting Intelligence Community (IC) mission programs. Knowledge of data catalogs, search/indexing, and discovery tools. Experience with streaming data (e.g., Kafka) and event-driven architectures. Familiarity with DevSecOps, security scanning, and accreditation processes. Understanding of analytics, machine learning (ML), or exploitation workflows consuming large datasets. Prior experience modernizing or migrating legacy data systems. EOE/Minorities/Females/Vet/Disabled
    $75k-102k yearly est. Auto-Apply 9d ago
  • Senior Data Engineer

    Apidel Technologies 4.1company rating

    Data engineer job in Blue Ash, OH

    Job Description The Engineer is responsible for staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks. The Engineer has overall responsibility in the technical design process. Leads and participates in the application technical design process and completes estimates and work plans for design, development, implementation, and rollout tasks. The Engineer also communicates with the appropriate teams to ensure that assignments are delivered with the highest quality and in accordance with standards. The Engineer strives to continuously improve the software delivery processes and practices. Role model and demonstrate the company's core values of respect, honesty, integrity, diversity, inclusion and safety of others. Current tools and technologies include: Databricks and Netezza Key Responsibilities Lead and participate in the design and implementation of large and/or architecturally significant applications. Champion company standards and best practices. Work to continuously improve software delivery processes and practices. Build partnerships across the application, business and infrastructure teams. Set up the new customer data platform, migrating from Netezza to Databricks. Complete estimates and work plans independently as appropriate for design, development, implementation and rollout tasks. Communicate with the appropriate teams to ensure that assignments are managed appropriately and that completed assignments are of the highest quality. Support and maintain applications utilizing required tools and technologies. May direct the day-to-day work activities of other team members. Must be able to perform the essential functions of this position with or without reasonable accommodation. Work quickly with the team to implement the new platform. Be onsite with the development team when necessary.
Behaviors/Skills: Puts the Customer First - Anticipates customer needs, champions the customer, acts with customers in mind, exceeds customers' expectations, gains customers' trust and respect. Communicates effectively and candidly - Communicates clearly and directly, approachable, relates well to others, engages people and helps them understand change, provides and seeks feedback, articulates clearly, actively listens. Achieves results through teamwork - Is open to diverse ideas, works inclusively and collaboratively, holds self and others accountable, involves others to accomplish individual and team goals. Note to Vendors: Length of contract: 9 months. Top skills: Databricks, Netezza. Soft skills needed: collaborating well with others, working in a team dynamic. Project this person will be supporting: staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks that will replace Netezza. Team details (i.e., size, dynamics, locations): most of the team is located in Cincinnati, working onsite at the BTD. Work location (in office, hybrid, remote): onsite at the BTD when necessary, approximately 2-3 days a week. Is travel required: No. Max rate if applicable: best market rate. Required working hours: 8-5 EST. Interview process and when it will start: starting with one interview; the process may change. Prescreening details: standard questions; scores will carry over. When do you want this person to start: looking to hire quickly; the team is looking to move fast.
    $79k-114k yearly est. 6d ago
  • Senior Data Engineer

    General Electric Credit Union 4.8company rating

    Data engineer job in Cincinnati, OH

    General Electric Credit Union is a not-for-profit, member-owned full service financial institution headquartered in Cincinnati with branches in Ohio and Kentucky. At GECU, we pride ourselves on maintaining quality service, being an employee-friendly workplace, and developing our team members while teaching you the skills to lead you to career advancement opportunities. Overview: The Senior Data Engineer will play a key role in developing and optimizing GECU's data infrastructure to support the organization's data-driven initiatives. The Senior Data Engineer will design, build, and maintain scalable data pipelines and systems, working with the data and development teams. Essential Responsibilities: Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes to collect, process, and store large volumes of structured and unstructured data. Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions. Develop and maintain data warehouse and data lake solutions, ensuring data quality, integrity, and reliability. Optimize data pipelines and ETL processes for performance, efficiency, and cost-effectiveness, utilizing best practices and technologies. Implement data governance and security measures to ensure compliance with regulatory requirements and data privacy standards. Troubleshoot and resolve issues related to data processing, data quality, and system performance in a timely manner. Evaluate and recommend new technologies, tools, and frameworks to enhance the organization's data infrastructure and capabilities. Document technical specifications, data lineage, and system architecture to facilitate knowledge sharing and collaboration. Collaborate with other key data employees to maintain and publish data definitions and the data catalog. Stay up to date with industry trends and emerging technologies in data engineering and analytics.
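The pipeline responsibilities above can be compressed into a tiny extract-transform-load example. A sketch using sqlite3 as a stand-in warehouse; the table and field names are invented for illustration:

```python
import sqlite3

# Minimal ETL sketch: filter incomplete source rows, normalize names,
# and load the result into a stand-in warehouse table.
def run_etl(source_rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS members (id INTEGER, name TEXT)")
    # Transform: drop rows missing required fields, tidy the name field.
    clean = [(r["id"], r["name"].strip().title())
             for r in source_rows if r.get("id") and r.get("name")]
    conn.executemany("INSERT INTO members VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
loaded = run_etl([{"id": 1, "name": " ada lovelace "},
                  {"id": None, "name": "incomplete"}], conn)
```

Production pipelines add the concerns the posting lists (lineage, governance, monitoring), but the extract-transform-load skeleton is the same.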
Education and Experience: High school diploma or GED required; Bachelor's degree in Computer Science, Engineering, or related field; Master's degree is a plus. Minimum 6 years' experience in Data Engineering, working with data warehousing concepts, database technologies (e.g., SQL, NoSQL), and distributed computing architectures. Experience with Snowflake Data Warehouse preferred. Knowledge, Skills, and Abilities: Strong programming skills in languages such as Python, Java, Scala, or SQL, with experience in data manipulation, transformation, and analysis. Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform. Extensive knowledge of data modeling, schema design, and optimization techniques for relational and non-relational databases. Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues. Strong communication and collaboration skills, with the ability to work effectively in a team environment and interact with stakeholders at all levels. Ability to perform independently and competently to accomplish necessary deliverables accurately and on time. Ability to assist and mentor junior Data Engineers. At GECU, we want to support your wellbeing by offering a wide range of benefits: Health, Dental and Vision insurance Life and Disability insurance options Paid Time Off starts accruing once hired, and take your birthday off (paid) 401k Retirement plan with up to a 10% match of your base gross compensation Tuition reimbursement opportunities & professional development Volunteer opportunities (and earn additional PTO hours!) On-site clinics for Vaccines and Mammograms And many more! Come join GECU, where we cultivate a culture of respect, understanding, and mutual recognition. We believe forming bonds and connecting with each other only stands to strengthen the service we provide to our members in our mission of improving the quality of financial lives!
General Electric Credit Union is an Equal Opportunity Employer
    $77k-101k yearly est. 60d+ ago
  • AI Engineer II

    Caresource 4.9company rating

    Data engineer job in Dayton, OH

    The AI Engineer II is responsible for software development related to generative AI solutions. The role will work closely with cross-functional teams to ensure the successful delivery of solutions within the defined scope, timeline, and quality standards. Essential Functions: Define and oversee the architecture for Generative AI platforms, including LLMs, vector databases, and inference pipelines. Design and develop complex software systems, ensuring scalability, reliability, and maintainability. Develop and maintain a strong understanding of modern generative AI tools and concepts including OpenAI, Llama, Python, LangChain, vectorization, embeddings, semantic search, RAG, IaC, and Streamlit. Rapidly develop proofs of concept to evaluate new technologies or value-adding ideas. Drive innovation and teamwork in a fast-paced, dynamic environment through a hands-on, imaginative approach and a self-motivated, curious mindset that explores the art of the possible. Embrace AI-assisted software development, including the use of GitHub Copilot and internally developed tools. Collaborate with leadership to systematically evaluate currently deployed services; develop and manage plans to optimize delivery and support mechanisms. Apply creative thinking in problem solving and identifying opportunities for improvement. Identify technical risks and propose effective mitigation strategies to ensure project success. Collaborate with product managers to prioritize and schedule project deliverables based on business objectives and resource availability. Provide accurate and timely progress updates to project stakeholders, highlighting achievements, challenges, and proposed solutions. Stay up to date with the latest industry trends, technologies, and frameworks, and evaluate their potential application in the organization. Perform any other job-related duties as requested.
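Of the concepts listed above, semantic search is the easiest to show in miniature: rank documents by cosine similarity between their vectors and a query vector. Real systems use learned embeddings and a vector database; the 3-dimensional vectors below are made up for the sketch:

```python
import math

# Toy retrieval step of a RAG pipeline: score each document vector
# against the query vector and pick the closest. Vectors are invented.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {"claims": [1.0, 0.1, 0.0],
        "pharmacy": [0.0, 1.0, 0.2],
        "dental": [0.1, 0.0, 1.0]}
query = [0.9, 0.2, 0.1]
best = max(docs, key=lambda name: cosine(query, docs[name]))
```

In a production system the retrieved documents would then be passed to the LLM as context; this sketch covers only the similarity ranking.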
Education and Experience: Bachelor's degree in Computer Science, Information Technology, or a related field required Equivalent years of relevant work experience may be accepted in lieu of required education Three (3) years of experience working in a medium to large operating environment required Experience with Agile methodologies required Experience with cloud technologies including containers and serverless preferred Competencies, Knowledge and Skills: Strong analytical, evaluative and problem-solving abilities Knowledge of healthcare and managed care is preferred Critical listening and thinking skills Strong knowledge of best practices relative to application development or infrastructure standards Licensure and Certification: Cloud certification preferred Working Conditions: General office environment; may be required to sit or stand for extended periods of time Travel is not typically required Compensation Range: $83,000.00 - $132,800.00 CareSource takes into consideration a combination of a candidate's education, training, and experience as well as the position's scope and complexity, the discretion and latitude required for the role, and other external and internal data when establishing a salary level. In addition to base compensation, you may qualify for a bonus tied to company and individual performance. We are highly invested in every employee's total well-being and offer a substantial and comprehensive total rewards package. Compensation Type: Salary Competencies: - Fostering a Collaborative Workplace Culture - Cultivate Partnerships - Develop Self and Others - Drive Execution - Influence Others - Pursue Personal Excellence - Understand the Business This list is not all-inclusive. CareSource reserves the right to amend this job description at any time. CareSource is an Equal Opportunity Employer. We are dedicated to fostering an environment of belonging that welcomes and supports individuals of all backgrounds. #LI-GM1
    $83k-132.8k yearly 1d ago
  • ETL Architect

    Scadea Solutions

    Data engineer job in Cincinnati, OH

    Job title: ETL Architect DURATION 18 months YEARS OF EXPERIENCE 7-10 INTERVIEW TYPE Phone Screen to Hire REQUIRED SKILLS • Experience with DataStage and ETL design Technical • Requirement gathering, converting business requirements to technical specs to profile • Hands-on work on a minimum of 2 projects with DataStage • Understand the process of developing an ETL design that supports multiple DataStage developers • Be able to create an ETL design framework and related specifications for use by ETL developers • Define standards and best practices for DataStage ETL to be followed by all DataStage developers • Understanding of Data Warehouse and Data Mart concepts and implementation experience • Be able to review code produced to ensure conformance with the developed ETL framework and design for reuse • Preferably, experienced user-level competency in IBM's metadata product, DataStage, and the InfoSphere product line • Be able to design ETL for Oracle, SQL Server, or any database • Good analytical skills and process design • Ensuring compliance with quality standards and delivery timelines. Qualifications Bachelor's degree Additional Information Required Skills: Job Description: Performs highly complex application programming/systems development and support Performs highly complex configuration of business rules and technical parameters of software products Review business requirements and develop application design documentation Build technical components (Maximo objects, TRM Rules, Java extensions, etc.) based on detailed design. Performs unit testing of components along with completing necessary documentation. Supports product test, user acceptance test, etc., as a member of the fix-it team. Employs consistent measurement techniques Include testing in project plans and establish controls to require adherence to test plans Manages the interrelationships among various projects or work objectives
    $86k-113k yearly est. 60d+ ago
  • Data Scientist - Clinical and Operational Analytics

    Venesco LLC

    Data engineer job in Dayton, OH

    Job Description: Develop and deploy machine learning models to support clinical and operational decision-making. Work with large datasets to extract insights and support predictive analytics for human performance and health. Requirements: Mandatory Qualifications: • Bachelor's degree in a quantitative field (e.g., Computer Science, Applied Math). • 3+ years of experience in predictive analytics. • Proficiency in Python, NumPy, Pandas, Matplotlib, and Scikit-learn. • Ability to explain and implement ML algorithms from scratch. • Signed NDA and HIPAA training required upon start. Desired Qualifications: • Experience with dashboard development and pretrained language models. • Experience with dimensionality reduction and deep learning libraries (TensorFlow, PyTorch). • Familiarity with human biology and performance. Key Tasks and Responsibilities: • Develop and tune unsupervised tree-based clustering models. • Implement decision trees, k-NN, and optimized list sorting algorithms. • Generate and minimize distance matrices using vectorized code. • Collaborate with software engineers and maintain HIPAA compliance.
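The "generate distance matrices using vectorized code" task maps directly onto NumPy broadcasting: compute all pairwise Euclidean distances with no Python loops. The sample points below are arbitrary:

```python
import numpy as np

# Vectorized pairwise Euclidean distance matrix via broadcasting.
def distance_matrix(X):
    diff = X[:, None, :] - X[None, :, :]      # shape (n, n, d)
    return np.sqrt((diff ** 2).sum(axis=-1))  # shape (n, n)

X = np.array([[0.0, 0.0],
              [3.0, 4.0],
              [0.0, 1.0]])
D = distance_matrix(X)
```

This is the building block for the k-NN and clustering work the posting mentions: nearest neighbors of point i are simply `np.argsort(D[i])[1:k+1]`.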
    $69k-95k yearly est. 9d ago
  • Data Engineer

    Insight Global

    Data engineer job in Cincinnati, OH

    Insight Global is looking for a data engineer contractor for one of their top financial clients. The following would be their roles and responsibilities: - Bachelor's degree in Computer Science/Information Systems or equivalent combination of education and experience. - Must be able to communicate ideas both verbally and in writing to management, business and IT sponsors, and technical resources in language that is appropriate for each group. - Four+ years of relevant IT experience in data engineering or related disciplines. - Significant experience with at least one major relational database management system (RDBMS). - Experience working with and supporting Unix/Linux and Windows systems. - Proficiency in relational database modeling concepts and techniques. - Solid conceptual understanding of distributed computing principles and scalable data architectures. - Working knowledge of application and data security concepts, best practices, and common vulnerabilities. - Experience in one or more of the following disciplines preferred: scalable data platforms and modern data architectures technologies and distributions, metadata management products, commercial ETL tools, data reporting and visualization tools, messaging systems, data warehousing, major version control systems, continuous integration/delivery tools, infrastructure automation and virtualization tools, major cloud platforms (AWS, Azure, GCP), or REST API design and development. - Previous experience working with offshore teams desired. - Financial industry experience, especially Regulatory Reporting, is a plus. We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters.
Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: **************************************************** Skills and Requirements 4+ years in Business Intelligence - Data Engineering Experience working within DataStage Experience working in the dbt transformation framework Experience working in Cloud Native data platforms, specifically Snowflake Experience with SQL for interacting with relational databases Regulatory Reporting experience
    $75k-101k yearly est. 60d+ ago
  • Data Engineer

    Medpace 4.5company rating

    Data engineer job in Cincinnati, OH

    Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you. Responsibilities * Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); * Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools, (dbt, Azure Data Factory, SSIS); * Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI; * Collect, analyze and document user requirements; * Participate in software validation process through development, review, and/or execution of test plan/cases/scripts; * Create software applications by following software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; * Communicate with team members regarding projects, development, tools, and procedures; and * Provide end-user support including setup, installation, and maintenance for applications Qualifications * Bachelor's Degree in Computer Science, Data Science, or a related field; * 3+ years of experience in Data Engineering; * Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs; * Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures; * Knowledge of Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred; * Knowledge of Python is preferred; * Knowledge of REST APIs; * Basic knowledge of SQL Server databases
is required; * Knowledge of C#, Azure development is a bonus; and * Excellent analytical, written and oral communication skills. Medpace Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries. Why Medpace? People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Cincinnati Perks * Cincinnati Campus Overview * Flexible work environment * Competitive PTO packages, starting at 20+ days * Competitive compensation and benefits package * Company-sponsored employee appreciation events * Employee health and wellness initiatives * Community involvement with local nonprofit organizations * Discounts on local sports games, fitness gyms and attractions * Modern, ecofriendly campus with an on-site fitness center * Structured career paths with opportunities for professional growth * Discounted tuition for UC online programs Awards * Named a Top Workplace in 2024 by The Cincinnati Enquirer * Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 * Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
    $80k-111k yearly est. 7d ago
  • Go Anywhere SFTP Data Engineer

    Tata Consulting Services 4.3company rating

    Data engineer job in Cincinnati, OH

    * Maintain robust data pipelines for ingesting and processing data from various sources, including SFTP servers.
    * Implement and manage automated SFTP data transfers, ensuring data security, integrity, and timely delivery.
    * Configure and troubleshoot SFTP connections, including handling authentication, key management, and directory structures.
    * Develop and maintain scripts or tools for automating SFTP-related tasks, such as file monitoring, error handling, and data validation.
    * Collaborate with external teams and vendors to establish and maintain secure SFTP connections for data exchange.
    * Ensure compliance with data security and governance policies related to SFTP transfers.
    * Monitor and optimize SFTP performance, addressing any bottlenecks or issues.
    * Document SFTP integration processes, configurations, and best practices.
    * Responsible for providing monthly SOC controls.
    * Experience with Solimar software.
    * Responsible for periodic software updates and patching.
    * Manage open incidents.
    * Responsible for after-hours and weekend on-call duties.
    * Minimum 3-5 years of related work experience.
    * Experience with Microsoft software and associated server tools.
    * Experience with the GoAnywhere managed file transfer (MFT) solution.
    * Experience with WinSCP.
    * Experience with Azure Cloud.
    * Proven experience in data engineering, with a strong emphasis on data ingestion and pipeline development.
    * Demonstrated expertise in working with SFTP for data transfer and integration.
    * Proficiency in scripting languages (e.g., Python, Shell) for automating SFTP tasks.
    * Familiarity with various SFTP clients, servers, and related security protocols.
    * Understanding of data security best practices, including encryption and access control for SFTP.
    * Experience with cloud platforms (e.g., AWS, Azure, GCP) and their SFTP integration capabilities is a plus.
    * Strong problem-solving and troubleshooting skills related to data transfer and integration issues.
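    The scripting duties above (file monitoring, error handling, data validation) often boil down to verifying that files pulled from an SFTP server arrived intact. Below is a minimal, stdlib-only Python sketch of post-transfer checksum validation; the function and manifest format are illustrative assumptions, and in a real pipeline the transfer itself would be handled by an SFTP client library (e.g., paramiko) or a managed tool such as GoAnywhere MFT before this check runs.

    ```python
    import hashlib
    from pathlib import Path


    def sha256_of(path: Path, chunk_size: int = 65536) -> str:
        """Hash a file in chunks so large transfers don't exhaust memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()


    def validate_transfer(local_dir: Path, manifest: dict) -> list:
        """Compare downloaded files against a checksum manifest.

        `manifest` maps file names to expected SHA-256 hex digests (e.g.,
        supplied by the sending vendor). Returns the names of files that are
        missing or corrupted, so the caller can re-request them or alert.
        """
        failures = []
        for name, expected in manifest.items():
            target = local_dir / name
            if not target.is_file() or sha256_of(target) != expected:
                failures.append(name)
        return failures
    ```

    An empty return list means every manifest entry was present and matched; a typical wrapper would retry the SFTP download for any names returned before escalating to on-call.
    
    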
Salary Range: $80,000-$85,000 a year
    $80k-85k yearly 15d ago

Learn more about data engineer jobs

How much does a data engineer earn in Cincinnati, OH?

The average data engineer in Cincinnati, OH earns between $66,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Cincinnati, OH

$87,000

What are the biggest employers of Data Engineers in Cincinnati, OH?

The biggest employers of Data Engineers in Cincinnati, OH are:
  1. Ernst & Young
  2. 84.51
  3. ETEK International
  4. ComResource
  5. Medpace
  6. Insight Global
  7. Stratacuity
  8. Tata Group
  9. Matlen Silver
  10. Accenture