Do you enjoy solving billion-dollar data science problems across trillions of data points? Are you passionate about working at the cutting edge of interdisciplinary boundaries, where computer science meets hard science? If you like turning untidy data into nonobvious insights and surprising business leaders with the transformative power of Artificial Intelligence (AI), including Generative and Agentic AI, we want you on our team at P&G.
As a Data Scientist in our organization, you will play a crucial role in disrupting current business practices by designing and implementing innovative models that enhance our processes. You will be expected to constructively research, design, and customize algorithms tailored to various problems and data types. Utilizing your expertise in Operations Research (including optimization and simulation) and machine learning models (such as tree models, deep learning, and reinforcement learning), you will directly contribute to the development of scalable Data Science algorithms. Your work will also integrate advanced techniques from Generative and Agentic AI to create more dynamic and responsive models, enhancing our analytical capabilities. You will collaborate with Data and AI Engineering teams to productionize these solutions, applying exploratory data analysis, feature engineering, and model building within cloud environments on massive datasets to deliver accurate and impactful insights. Additionally, you will mentor others as a technical coach and become a recognized expert in one or more Data Science techniques, quantifying the improvements in business outcomes resulting from your work.
Key Responsibilities:
+ Algorithm Design & Development: Directly contribute to the design and development of scalable Data Science algorithms.
+ Collaboration: Work closely with Data and Software Engineering teams to effectively productionize algorithms.
+ Data Analysis: Apply thorough technical knowledge to large datasets, conducting exploratory data analysis, feature engineering, and model building.
+ Coaching & Mentorship: Develop others as a technical coach, sharing your expertise and insights.
+ Expertise Development: Become a known expert in one or multiple Data Science techniques and methodologies.
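As a rough illustration of the exploratory data analysis, feature engineering, and model-building work described above, here is a minimal Python sketch using Pandas and scikit-learn. The dataset, column names, and model choice are invented for illustration only; nothing in this sketch is prescribed by the role.

import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical shipment-level data; the columns are invented for illustration.
df = pd.DataFrame({
    "units_ordered": [120, 80, 200, 150, 90, 300],
    "promo_flag": [1, 0, 1, 0, 0, 1],
    "lead_time_days": [3, 5, 2, 4, 6, 2],
    "units_sold": [110, 60, 190, 140, 70, 280],
})

# Simple feature engineering: interaction between promotion and order size.
df["promo_x_units"] = df["promo_flag"] * df["units_ordered"]

X = df.drop(columns="units_sold")
y = df["units_sold"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

# A tree-based model of the kind mentioned in the role description.
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))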
Job Qualifications
Required Qualifications:
+ Education: Currently pursuing or holding a Master's degree in a quantitative field (Operations Research, Computer Science, Engineering, Applied Mathematics, Statistics, Physics, Analytics, etc.), or equivalent work experience.
+ Technical Skills: Proficient in programming languages such as Python and familiar with data science/machine learning libraries like OpenCV, scikit-learn, PyTorch, TensorFlow/Keras, and Pandas. Demonstrated ability to develop and test code within cloud environments.
+ Communication: Strong written and verbal communication skills, with the ability to influence others to take action.
Preferred Qualifications:
+ Analytic Methodologies: Experience applying analytic methodologies such as Machine Learning, Optimization, Simulation, and Generative and Agentic AI to real-world problems.
+ Continuous Learning: A commitment to lifelong learning, keeping up to date with the latest technology trends, and a willingness to teach others while learning new techniques.
+ Data Handling & Cloud: Experience with large datasets and developing in cloud computing platforms such as GCP or Azure.
+ DevOps Familiarity: Familiarity with DevOps environments, including tools like Git and CI/CD practices.
Immigration sponsorship is not available for this role. For more information regarding who is eligible for hire at P&G, see the company's work authorization FAQs.
Procter & Gamble participates in E-Verify as required by law.
Qualified individuals will not be disadvantaged based on being unemployed.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Job Schedule
Full time
Job Number
R000135859
Job Segmentation
Entry Level
Starting Pay / Salary Range
$85,000.00 - $115,000.00 / year
DATA SCIENTIST
Department of The Air Force
Data engineer job in Wright-Patterson Air Force Base, OH
Summary
The PALACE Acquire Program offers you a permanent position upon completion of your formal training plan. As a Palace Acquire Intern you will experience both personal and professional growth while dealing effectively and ethically with change, complexity, and problem solving. The program offers a 3-year formal training plan with yearly salary increases. Promotions and salary increases are based upon your successful performance and supervisory approval.
Overview
Accepting applications
Open & closing dates
09/29/2025 to 09/28/2026
Salary: $49,960 to $99,314 per year
Total salary varies depending on location of position
Pay scale & grade: GS 7 - 9
Locations
Gunter AFB, AL
Few vacancies
Maxwell AFB, AL
Few vacancies
Davis Monthan AFB, AZ
Few vacancies
Edwards AFB, CA
Few vacancies
Los Angeles, CA
Few vacancies
Travis AFB, CA
Few vacancies
Vandenberg AFB, CA
Few vacancies
Air Force Academy, CO
Few vacancies
Buckley AFB, CO
Few vacancies
Cheyenne Mountain AFB, CO
Few vacancies
Peterson AFB, CO
Few vacancies
Schriever AFB, CO
Few vacancies
Joint Base Anacostia-Bolling, DC
Few vacancies
Cape Canaveral AFS, FL
Few vacancies
Eglin AFB, FL
Few vacancies
Hurlburt Field, FL
Few vacancies
MacDill AFB, FL
Few vacancies
Patrick AFB, FL
Few vacancies
Tyndall AFB, FL
Few vacancies
Robins AFB, GA
Few vacancies
Hickam AFB, HI
Few vacancies
Barksdale AFB, LA
Few vacancies
Hanscom AFB, MA
Few vacancies
Natick, MA
Few vacancies
Aberdeen Proving Ground, MD
Few vacancies
Andrews AFB, MD
Few vacancies
White Oak, MD
Few vacancies
Offutt AFB, NE
Few vacancies
Holloman AFB, NM
Few vacancies
Kirtland AFB, NM
Few vacancies
Nellis AFB, NV
Few vacancies
Rome, NY
Few vacancies
Heath, OH
Few vacancies
Wright-Patterson AFB, OH
Few vacancies
Tinker AFB, OK
Few vacancies
Arnold AFB, TN
Few vacancies
Dyess AFB, TX
Few vacancies
Fort Sam Houston, TX
Few vacancies
Goodfellow AFB, TX
Few vacancies
Lackland AFB, TX
Few vacancies
Randolph AFB, TX
Few vacancies
Hill AFB, UT
Few vacancies
Arlington, VA
Few vacancies
Dahlgren, VA
Few vacancies
Langley AFB, VA
Few vacancies
Pentagon, Arlington, VA
Few vacancies
Fairchild AFB, WA
Few vacancies
Warren AFB, WY
Few vacancies
Remote job: No
Telework eligible: No
Travel required: Occasional travel - You may be expected to travel for this position.
Relocation expenses reimbursed: No
Appointment type: Internships
Work schedule: Full-time
Service: Competitive
Promotion potential
13
Job family (Series)
* 1560 Data Science Series
Supervisory status: No
Security clearance: Secret
Drug test: No
Position sensitivity and risk: Noncritical-Sensitive (NCS)/Moderate Risk
Trust determination process
* Suitability/Fitness
Financial disclosure: No
Bargaining unit status: No
Announcement number: K-26-DHA-12804858-AKK
Control number: 846709300
This job is open to
The public
U.S. Citizens, Nationals or those who owe allegiance to the U.S.
Students
Current students enrolled in an accredited high school, college or graduate institution.
Recent graduates
Individuals who have graduated from an accredited educational institute or certificate program within the last 2 years or 6 years for Veterans.
Clarification from the agency
This public notice is to gather applications that may or may not result in a referral or selection.
Duties
1. Performs developmental assignments in support of projects assigned to higher-level analysts. Performs minor phases of a larger assignment or work of moderate difficulty where procedures are established and a number of specific guidelines exist. Applies the various steps of accepted data science procedures to search for information and perform well-precedented work.
2. Performs general operations and assignments for portions of a project or study consisting of a series of interrelated tasks or problems. The employee applies judgment in the independent application of methods and techniques previously learned. The employee locates and selects the most appropriate guidelines and modifies to address unusual situations.
3. Participates in special initiatives, studies, and projects. Performs special research tasks designed to utilize and enhance knowledge of work processes and techniques. Works with higher graded specialists in planning and conducting special initiatives, studies, and projects. Assists in preparing reports and briefings outlining study findings and recommendations.
4. Prepares correspondence and other documentation. Drafts or prepares a variety of documents to include newsletter items, responses to routine inquiries, reports, letters, and other related documents.
Requirements
Conditions of employment
* Employee must maintain current certifications
* Successful completion of all training and regulatory requirements as identified in the applicable training plan
* Must meet suitability for federal employment
* Direct Deposit: All federal employees are required to have direct deposit
* Please read this Public Notice in its entirety prior to submitting your application for consideration.
* Males must be registered for Selective Service
* A security clearance may be required. This position may require a secret, top-secret or special sensitive clearance.
* If authorized, PCS will be paid IAW JTR and AF Regulations. If receiving an authorized PCS, you may be subject to completing/signing a CONUS agreement.
* Position may be subject to random drug testing
* U.S. Citizenship Required
* Disclosure of Political Appointments
* Student Loan Repayment may be authorized
* Recruitment Incentive may be authorized for this position
* Total salary varies depending on location of position
* You will be required to serve a one year probationary period
* Grade Point Average - 2.95 or higher out of a possible 4.0
* Mobility - you may be required to relocate during or after completion of your training
* Work may occasionally require travel away from the normal duty station on military or commercial aircraft
Qualifications
BASIC REQUIREMENT OR INDIVIDUAL OCCUPATIONAL REQUIREMENT:
Degree: Mathematics, statistics, computer science, data science or field directly related to the position. The degree must be in a major field of study (at least at the baccalaureate level) that is appropriate for the position.
You may qualify if you meet one of the following:
1. GS-7: You must have completed or will complete a 4-year course of study leading to a bachelor's from an accredited institution AND must have documented Superior Academic Achievement (SAA) at the undergraduate level in the following:
a) Grade Point Average 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum.
2. GS-9: You must have completed 2 years of progressively higher-level graduate education leading to a master's degree or equivalent graduate degree:
a) Grade Point Average - 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum. If more than 10 percent of total undergraduate credit hours are non-graded, i.e. pass/fail, CLEP, CCAF, DANTES, military credit, etc. you cannot qualify based on GPA.
KNOWLEDGE, SKILLS AND ABILITIES (KSAs): Your qualifications will be evaluated on the basis of your level of knowledge, skills, abilities and/or competencies in the following areas:
1. Professional knowledge of basic principles, concepts, and practices of data science to apply scientific methods and techniques to analyze systems, processes, and/or operational problems and procedures.
2. Knowledge of mathematics and analysis to perform minor phases of a larger assignment and prepare reports, documentation, and correspondence to communicate factual and procedural information clearly.
3. Skill in applying basic principles, concepts, and practices of the occupation sufficient to perform routine to difficult but well-precedented assignments in data science analysis.
4. Ability to analyze, interpret, and apply data science rules and procedures in a variety of situations and recommend solutions to senior analysts.
5. Ability to analyze problems to identify significant factors, gather pertinent data, and recognize solutions.
6. Ability to plan and organize work and confer with co-workers effectively.
PART-TIME OR UNPAID EXPERIENCE: Credit will be given for appropriate unpaid and/or part-time work. You must clearly identify the duties and responsibilities in each position held and the total number of hours per week.
VOLUNTEER WORK EXPERIENCE: Refers to paid and unpaid experience, including volunteer work done through National Service Programs (i.e., Peace Corps, AmeriCorps) and other organizations (e.g., professional; philanthropic; religious; spiritual; community; student and social). Volunteer work helps build critical competencies, knowledge and skills that can provide valuable training and experience that translates directly to paid employment. You will receive credit for all qualifying experience, including volunteer experience.
Education
IF USING EDUCATION TO QUALIFY: If the position has a positive degree requirement or education forms the basis for qualifications, you MUST submit transcripts with the application. Official transcripts are not required at the time of application; however, if the position has a positive degree requirement, or you are qualifying based on education alone or in combination with experience, transcripts must be verified prior to appointment. The education must be from an accrediting institution recognized by the U.S. Department of Education.
FOREIGN EDUCATION: Education completed in foreign colleges or universities may be used to meet the requirements. You must show proof that the education credentials have been deemed at least equivalent to that gained in a conventional U.S. education program. It is your responsibility to provide such evidence when applying.
Additional information
For DHA Positions:
These positions are being filled under Direct-Hire Authority for the Department of Defense for Post-Secondary Students and Recent Graduates. The Secretary of the Air Force has delegated authority by the Office of the Secretary of Defense to directly appoint qualified post-secondary students and recent graduates directly into competitive service positions; these positions may be professional or administrative occupations and are located Air Force-Wide. Positions may be filled as permanent or term with a full-time or part-time work schedule. Pay will vary by geographic location.
* The term "Current post-secondary student" means a person who is currently enrolled in, and in good academic standing at a full-time program at an institution of higher education; and is making satisfactory progress toward receipt of a baccalaureate or graduate degree; and has completed at least one year of the program.
* The term "recent graduate" means a person who was awarded a degree by an institution of higher education not more than two years before the date of the appointment of such person, except in the case of a person who has completed a period of obligated service in a uniform service of more than four years.
Selective Service: Males born after 12-31-59 must be registered or exempt from Selective Service. For additional information, click here.
Direct Deposit: All federal employees are required to have direct deposit.
If you are unable to apply online, view the following link for information regarding Alternate Application. The Vacancy ID is
If you have questions regarding this announcement and have hearing or speech difficulties click here.
Tax Law Impact for PCS: On 22-Dec-2017, Public Law 115-97 - the "Tax Cuts and Jobs Act of 2017" suspended qualified moving expense deductions along with the exclusion for employer reimbursements and payments of moving expenses effective 01-Jan-2018 for tax years 2018 through 2025. The law made taxable certain reimbursements and other payments, including driving mileage, airfare and lodging expenses, en-route travel to the new duty station, and temporary storage of those items. The Federal Travel Regulation Bulletin (FTR) 18-05 issued by General Services Administration (GSA) has authorized agencies to use the Withholding Tax Allowance (WTA) and Relocation Income Tax Allowance (RITA) to pay for "substantially all" of the increased tax liability resulting from the "2018 Tax Cuts and Jobs Act" for certain eligible individuals. For additional information on WTA/RITA allowances and eligibilities please click here. Subsequently, FTR Bulletin 20-04 issued by GSA, provides further information regarding NDAA FY2020, Public Law 116-92, and the expansion of eligibility beyond "transferred" for WTA/RITA allowances. For additional information, please click here.
Candidates should be committed to improving the efficiency of the Federal government, passionate about the ideals of our American republic, and committed to upholding the rule of law and the United States Constitution.
Benefits
A career with the U.S. government provides employees with a comprehensive benefits package. As a federal employee, you and your family will have access to a range of benefits that are designed to make your federal career very rewarding. Learn more about federal benefits.
Review our benefits
Eligibility for benefits depends on the type of position you hold and whether your position is full-time, part-time or intermittent. Contact the hiring agency for more information on the specific benefits offered.
How you will be evaluated
You will be evaluated for this job based on how well you meet the qualifications above.
For DHA Positions:
These positions are being filled under Direct-Hire Authority for the DoD for Post-Secondary Students and Recent Graduates. The Secretary of the Air Force has delegated authority by the Office of the Secretary of Defense to directly appoint qualified students and recent graduates directly into competitive service positions; positions may be professional or administrative occupations and located Air Force-Wide. Positions may be filled as permanent/term with a full-time/part-time work schedule. Pay will vary by geographic location.
* The term "Current post-secondary student" means a person who is currently enrolled and in good academic standing at a full-time program at an institution of higher education; and is progressing toward a baccalaureate or graduate degree; and has completed at least 1 year of the program.
* The term "recent graduate" means a person awarded a degree by an institution of higher education not more than 2 years before the date of the appointment of such person, except in the case of a person who has completed a period of obligated service in a uniform service of more than 4 years.
Your latest resume will be used to determine your qualifications.
Your application package (resume, supporting documents, and responses to the questionnaire) will be used to determine your eligibility, qualifications, and quality ranking for this position. Please follow all instructions carefully. Errors or omissions may affect your rating or consideration for employment.
Your responses to the questionnaire may be compared to the documents you submit. The documents you submit must support your responses to the online questionnaire. If your application contradicts or does not support your questionnaire responses, you will receive a rating of "not qualified" or "insufficient information" and you will not receive further consideration for this job.
Applicants who disqualify themselves will not be evaluated further.
Required documents
Required Documents
The following documents are required and must be provided with your application for this Public Notice. Applicants who do not submit required documentation to determine eligibility and qualifications will be eliminated from consideration. Other documents may be required based on the eligibility/eligibilities you are claiming. Click here to view the AF Civilian Employment Eligibility Guide and the required documents you must submit to substantiate the eligibilities you are claiming.
* Online Application - Questionnaire
* Resume: Your resume may NOT exceed two pages, and the font size should not be smaller than 10 pts. You will not be considered for this vacancy if your resume is illegible/unreadable. Additional information on resume requirements can be located under "
Data Scientist - Clinical and Operational Analytics
Venesco LLC
Data engineer job in Dayton, OH
Requirements
Mandatory Qualifications:
• Bachelor's degree in a quantitative field (e.g., Computer Science, Applied Math).
• 3+ years of experience in predictive analytics.
• Proficiency in Python, NumPy, Pandas, Matplotlib, and Scikit-learn.
• Ability to explain and implement ML algorithms from scratch.
• Signed NDA and HIPAA training required upon start.
Desired Qualifications:
• Experience with dashboard development and pretrained language models.
• Experience with dimensionality reduction and deep learning libraries (TensorFlow, PyTorch).
• Familiarity with human biology and performance.
Key Tasks and Responsibilities:
• Develop and tune unsupervised tree-based clustering models.
• Implement decision trees, k-NN, and optimized list sorting algorithms.
• Generate and minimize distance matrices using vectorized code.
• Collaborate with software engineers and maintain HIPAA compliance.
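To illustrate the "generate and minimize distance matrices using vectorized code" task above, here is a minimal NumPy sketch. The feature matrix is randomly generated and purely hypothetical; a real implementation would depend on the project's actual data and distance definition.

import numpy as np

# Hypothetical feature matrix: 5 observations with 3 features each.
X = np.random.default_rng(0).normal(size=(5, 3))

# Pairwise Euclidean distance matrix with no explicit Python loops:
# broadcasting (5,1,3) against (1,5,3) yields all pairwise differences at once.
diff = X[:, None, :] - X[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))

# Find the closest pair, ignoring the zero diagonal.
np.fill_diagonal(dist, np.inf)
i, j = np.unravel_index(np.argmin(dist), dist.shape)
print(f"closest pair: {i} and {j}, distance {dist[i, j]:.3f}")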
Estimated salary: $69k-$95k per year
Data Scientist
Core4Ce Careers
Data engineer job in Dayton, OH
We are seeking a highly skilled Data Scientist / Machine Learning Engineer to develop advanced analytics and machine learning solutions that drive meaningful insights for our customers. In this role, you will design and test algorithms, build data-driven experiments, and collaborate closely with SMEs and developers to transform data into actionable intelligence. This position is ideal for someone who excels at both innovative research and practical implementation.
Key Responsibilities:
Algorithm Development: Develop machine learning, data mining, statistical, and graph-based algorithms to analyze complex data sets and uncover meaningful patterns.
Model Evaluation: Test, validate, and down-select algorithms to determine the best-performing models for customer requirements.
Experimental Design & Data Generation: Design experiments and create synthetic or simulated data when training/example data sets are limited or unavailable.
Data Visualization & Reporting: Produce clear reports, dashboards, and visualizations that communicate data insights to customers and stakeholders in an intuitive manner.
Automation & SME Collaboration: Work with subject matter experts to convert manual analytic workflows into efficient, automated analytics solutions.
Cross-Functional Development: Collaborate with software developers to ensure algorithms are properly implemented, optimized, and integrated into production systems.
*This position is designed to be flexible, with responsibilities evolving to meet business needs and enable individual growth.
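As a loose illustration of the experimental design and data generation responsibility above (creating synthetic data when real training sets are limited, then validating candidate models), here is a small scikit-learn sketch. The synthetic data and baseline model are assumptions for demonstration, not the team's actual method.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical synthetic data standing in for a scarce real-world data set.
X, y = make_classification(n_samples=500, n_features=10, n_informative=4, random_state=42)

# Evaluate a baseline model with cross-validation; in practice several
# candidate algorithms would be compared and down-selected this way.
baseline = LogisticRegression(max_iter=1000)
scores = cross_val_score(baseline, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")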
Required Qualifications:
Active TS-SCI security clearance with the ability to obtain a CI poly.
OPIR Experience
Modeling and Simulation Experience
Experience designing, training, and validating machine learning models and statistical algorithms.
Proficiency with Python, R, or similar languages used for analytics and model development.
Hands-on experience with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn).
Strong understanding of experimental design and data generation strategies.
Ability to communicate complex analytic concepts to both technical and non-technical audiences.
Demonstrated ability to work collaboratively across multidisciplinary teams.
Degree in Mathematics/Statistics, Computer Science, or a relevant domain field.
MA/MS degree with 13+ years of relevant experience, OR
BA/BS degree with 15+ years of relevant experience in a discipline aligned with the position's responsibilities.
Why Work for Us?
Core4ce is a team of innovators, self-starters, and critical thinkers, driven by a shared mission to strengthen national security and advance warfighting outcomes.
We offer:
401(k) with 100% company match on the first 6% deferred, with immediate vesting
Comprehensive medical, dental, and vision coverage; employee portion paid 100% by Core4ce
Unlimited access to training and certifications, with no pre-set cap on eligible professional development
Tuition assistance for job-related degrees and courses
Paid parental leave, PTO that grows with tenure, and generous holiday schedules
Got a big idea? At Core4ce, The Forge gives every employee the chance to propose bold innovations and help bring them to life with internal backing.
Join us to build a career that matters, supported by a company that invests in you.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy), national origin, disability, veteran status, age, genetic information, or other legally protected status.
Estimated salary: $69k-$95k per year
Data Scientist (P3860)
84.51° 4.3
Data engineer job in Cincinnati, OH
84.51° is a retail data science, insights and media company. We help The Kroger Co., consumer packaged goods companies, agencies, publishers and affiliates create more personalized and valuable experiences for shoppers across the path to purchase.
Powered by cutting-edge science, we utilize first-party retail data from more than 62 million U.S. households sourced through the Kroger Plus loyalty card program to fuel a more customer-centric journey using 84.51° Insights, 84.51° Loyalty Marketing and our retail media advertising solution, Kroger Precision Marketing.
Join us at 84.51°!
Kroger Precision Marketing (KPM), powered by 84.51°, is the retail media arm of Kroger, helping brands build stronger connections with customers through data-driven insights and media activation.
By combining Kroger's loyalty data with the power of 84.51° data science and strategic media expertise, we provide brands with meaningful insights that fuel personalization and lead to impactful activations within the advertising ecosystem. Our comprehensive analytics bring customer stories and journeys to life, delivering measurable business impact for our brand partners while enhancing the customer experience.
As a Data Scientist, you will support the AI Platform within KPM. In this role, you'll be focused on foundational data science development to address strategic use cases across KPM's commercial portfolio. You will collaborate with fellow data scientists, product managers, engineers, and AI enablement teams to prototype new methodologies, explore emerging technologies, and build scalable solutions that deliver measurable business value.
To succeed in this role, you should be eager to learn and approach challenges with curiosity. You'll need experience or familiarity with tools like Python, PySpark, SQL, Power BI or other similar statistical software to develop analytical solutions. Additionally, you'll need familiarity with or a willingness to learn aspects of machine learning, statistical analysis, and modern AI frameworks. You should enjoy working in a team environment and be open to collaborating with data scientists, product managers, and business partners to support shared goals.
Responsibilities
Assist with prototyping solutions and contributing to rapid decision-making in uncertain contexts.
Collaborate closely with cross-functional teams, including product, engineering, and business stakeholders, throughout discovery, prototyping, and scaling.
Have strong technical acumen and a passion for bringing new methodologies and capabilities to life to solve business problems.
Support project execution by managing tasks effectively to ensure timely, high-quality delivery.
Interpret results and contribute to analysis and reporting for stakeholders.
Contribute to building scalable, high-quality code and pipelines with mentorship from senior teammates.
Identify gaps and opportunities in existing science, measurement, and data solutions, bringing forward ideas for team discussion.
Continuously challenge and improve 84.51°'s analytical capabilities and products.
Share knowledge, contribute to best practices, and build efficiencies as part of a collaborative data science community.
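As a small, hypothetical example of the kind of scalable pipeline code referenced in the responsibilities above, the following PySpark sketch aggregates invented household-level transaction rows; the schema and values are placeholders, not actual 84.51° data or pipelines.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("household_spend_demo").getOrCreate()

# Hypothetical transaction rows; real schemas and data are not public here.
tx = spark.createDataFrame(
    [("hh1", "produce", 12.50), ("hh1", "dairy", 4.25), ("hh2", "produce", 7.80)],
    ["household_id", "category", "spend"],
)

# Household-by-category spend summary, the kind of aggregate that could feed
# downstream personalization or measurement models.
summary = tx.groupBy("household_id", "category").agg(F.sum("spend").alias("total_spend"))
summary.show()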
Qualifications, Skills & Experience
A bachelor's degree or higher in mathematics, statistics, computer science, data science, economics, or related discipline.
Demonstrated proficiency with components of our technology stack, such as Azure, Python, Spark, GitHub, Power BI, and Snowflake.
Experience working with databases, analyzing data, and presenting findings through academic projects, internships, or entry-level roles.
Strong analytical, problem-solving, and communication skills
Package building and code optimization experience or a strong desire to learn.
Ability to approach unstructured problems and develop clear, practical solutions with support from senior team members.
Ability to balance technical learning with business priorities to drive value.
Good organizational and time management skills, with the ability to handle multiple assignments.
Ability to collaborate effectively with cross-functional teams, including business stakeholders, product managers, and engineers
Natural curiosity and willingness to experiment and learn from mistakes.
#LI-AB1
Pay Transparency and Benefits
The stated salary range represents the entire span applicable across all geographic markets from lowest to highest. Actual salary offers will be determined by multiple factors including but not limited to geographic location, relevant experience, knowledge, skills, other job-related qualifications, and alignment with market data and cost of labor. In addition to salary, this position is also eligible for variable compensation.
Below is a list of some of the benefits we offer our associates:
Health: Medical: with competitive plan designs and support for self-care, wellness and mental health. Dental: with in-network and out-of-network benefit. Vision: with in-network and out-of-network benefit.
Wealth: 401(k) with Roth option and matching contribution. Health Savings Account with matching contribution (requires participation in qualifying medical plan). AD&D and supplemental insurance options to help ensure additional protection for you.
Happiness: Paid time off with flexibility to meet your life needs, including 5 weeks of vacation time, 7 health and wellness days, 3 floating holidays, as well as 6 company-paid holidays per year. Paid leave for maternity, paternity and family care instances.
Pay Range: $73,000-$125,350 USD
ETL Architect
Scadea Solutions
Data engineer job in Cincinnati, OH
Job title: ETL Architect
Duration: 18 months
Years of experience: 7-10
Interview type: Phone Screen to Hire
Required skills: Experience with DataStage and ETL design
Technical:
• Requirement gathering; converting business requirements to technical specs to profile
• Hands-on work on a minimum of 2 projects with DataStage
• Understands the process of developing an ETL design that supports multiple DataStage developers
• Able to create an ETL design framework and related specifications for use by ETL developers
• Defines standards and best practices for DataStage ETL to be followed by all DataStage developers
• Understanding of data warehouse and data mart concepts, and implementation experience
• Able to review code produced to ensure conformance with the developed ETL framework and design for reuse
• Preferably experienced, at a user level, with IBM's metadata product, DataStage, and the InfoSphere product line
• Able to design ETL for Oracle, SQL Server, or any database
• Good analytical and process design skills
• Ensures compliance with quality standards and delivery timelines.
Qualifications
Bachelors
Additional Information
Required Skills:
Job Description:
Performs highly complex application programming/systems development and support. Performs highly complex configuration of business rules and technical parameters of software products. Reviews business requirements and develops application design documentation. Builds technical components (Maximo objects, TRM Rules, Java extensions, etc.) based on detailed design.
Performs unit testing of components along with completing necessary documentation. Supports product test, user acceptance test, etc. as a member of the fix-it team. Employs consistent measurement techniques. Includes testing in project plans and establishes controls to require adherence to test plans. Manages the interrelationships among various projects or work objectives.
Estimated salary: $86k-$113k per year
AI Data Scientist
Medpace 4.5
Data engineer job in Cincinnati, OH
We are currently seeking an experienced data scientist to join our AI team who will support and lead data flow, advanced analytical needs and AI tools across Medpace. The AI team utilizes analytical principles and techniques to identify, collate and analyze many data sources and works with teams across Medpace to support efficiency and business gains for pharmaceutical development. The AI Data Scientist will support various projects across the company to bring data sources together in a consistent manner, work with the business to identify the value of AI, identify appropriate solutions and work with IT to ensure they are developed and built into the relevant systems. The team is seeking an experienced candidate to contribute new skills to our team, support team growth and foster AI development.
The AI Team is a highly collaborative team with members in both the Cincinnati and London offices. This team supports many teams across the business, including clinical operations, medical, labs, business development, and business operations. The AI Team also works side-by-side with data engineering, business analytics, and software engineering to architect innovative data storage and access solutions for optimal data utilization strategies. If you are an individual with experience in informatics, data science, or computer science, please review the following career opportunity.
Responsibilities
* Explore and work with different data sources to collate into knowledge;
* Work with different business teams across the company with a variety of different business needs to identify potential areas that AI can support;
* Manage the process of working through AI potentials from discovery research to PoC to production with the business teams and supporting tasks for IT developers;
* Try out different AI tools to substantiate the potential of its use with the business team;
* Translate results into compelling visualizations which illustrate the overall benefits of the use of AI and identify with the business team the overall value of its use;
* Develop and map database architecture of methodological and clinical data systems;
* Convert business tasks into meaningful developer Jira tasks for sprints;
* Support departmental process improvement initiatives that can include AI; and
* Participate in training and development of more junior team members.
Qualifications
* Master's degree or higher in informatics, computer science/engineering, health information, statistics, or related field required;
* 2 or more years of experience as a Data Scientist or closely related;
* Experience applying machine learning to pharmaceutical or clinical data (or translatable artificial intelligence [AI] techniques from other industries);
* Advanced computer programming skills (preferred language: Python);
* Analytical thinker with great attention to detail;
* Ability to prioritize multiple projects and tasks within tight timelines; and
* Excellent written and verbal communication skills.
Medpace Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas, including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Why Medpace?
People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Cincinnati Perks
* Cincinnati Campus Overview
* Flexible work environment
* Competitive PTO packages, starting at 20+ days
* Competitive compensation and benefits package
* Company-sponsored employee appreciation events
* Employee health and wellness initiatives
* Community involvement with local nonprofit organizations
* Discounts on local sports games, fitness gyms and attractions
* Modern, ecofriendly campus with an on-site fitness center
* Structured career paths with opportunities for professional growth
* Discounted tuition for UC online programs
Awards
* Named a Top Workplace in 2024 by The Cincinnati Enquirer
* Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
* Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility
What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
Estimated salary: $69k-$98k per year
Data Engineer
Total Quality Logistics, Inc. 4.0
Data engineer job in Cincinnati, OH
Country: USA
State: Ohio
City: Cincinnati
Descriptions & requirements
About the role: As a Data Engineer with TQL, you will support the FP&A department by developing scalable reporting solutions in Microsoft Fabric. This role will focus on migrating data from on-premises systems to the cloud, building and optimizing SQL views and pipelines, and creating governed Power BI datasets and semantic models.
What's in it for you:
* $85,000-$125,000 base salary + performance bonuses
* Advancement opportunities with aggressive and structured career paths
* A culture of continuous education and technical training with reimbursements available
* Comprehensive benefits package
* Health, dental and vision coverage
* 401(k) with company match
* Perks including employee discounts, financial wellness planning, tuition reimbursement and more
What you'll be doing:
* Migrate FP&A datasets from on-premises to Microsoft Fabric/Lakehouse
* Build and maintain SQL pipelines, transformations, and views that support reporting needs
* Ensure performance, scalability, and reliability through automation, monitoring, and CI/CD best practices
* Design, publish, and manage Power BI certified datasets, semantic models, and reports/dashboards
* Apply best practices in DAX, modeling, and governance to enable accurate, self-service reporting
* Partner with Finance stakeholders to translate reporting requirements into technical deliverables
* Implement processes to ensure accuracy, consistency, and reconciliation across financial and operational systems
* Maintain documentation of data models, business logic, and reporting standards
* Troubleshoot and resolve issues impacting reporting accuracy or performance
* Collaborate with Data Governance and Quality teams to align with enterprise standards and metadata frameworks
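For a flavor of the migration and modeling work listed above, here is a hedged PySpark sketch of the sort of roll-up that might land in a Fabric Lakehouse as a Delta table for Power BI. The file path, column names, and table name are invented; the actual implementation would depend on TQL's environment.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical extract from an on-premises finance system; path and columns are invented.
gl = spark.read.option("header", True).csv("Files/raw/general_ledger.csv")

# Light cleanup and a monthly roll-up suitable for a governed Power BI dataset.
monthly = (
    gl.withColumn("amount", F.col("amount").cast("double"))
      .groupBy("account", F.trunc(F.to_date("posting_date"), "month").alias("month"))
      .agg(F.sum("amount").alias("amount"))
)

# In a Fabric Lakehouse this would typically be written as a managed Delta table
# (assumes a runtime where Delta is the default managed table format).
monthly.write.mode("overwrite").format("delta").saveAsTable("fpa_monthly_gl")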
What you need:
* Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field
* 3+ years of experience in BI/data engineering or analytics engineering
* Advanced SQL skills with proven experience in building and optimizing large-scale datasets
* Strong Power BI expertise (datasets, DAX, performance tuning, semantic models)
* Hands-on experience with Microsoft Fabric and Lakehouse/cloud data platforms preferred
* Knowledge of financial reporting concepts and ability to work with FP&A stakeholders
* Strong problem-solving skills and ability to bridge Finance and IT needs
Where you'll be: 4289 Ivy Pointe Boulevard, Cincinnati, Ohio 45245
Employment visa sponsorship is unavailable for this position. Applicants requiring employment visa sponsorship now or in the future (e.g., F-1 STEM OPT, H-1B, TN, J1 etc.) will not be considered.
About Us
Total Quality Logistics (TQL) is one of the largest freight brokerage firms in the nation. TQL connects customers with truckload freight that needs to be moved with quality carriers who have the capacity to move it.
As a company that operates 24/7/365, TQL manages work-life balance with sales support teams that assist with accounting, after-hours calls, and specific needs. At TQL, the opportunities are endless, which means there is room for career advancement and the ability to write your own paycheck.
What's your worth? Our open and transparent communication from management creates a successful work environment and custom career path for our employees. TQL is an industry leader in logistics with unlimited potential. Be a part of something big.
Total Quality Logistics is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, genetic information, disability or protected veteran status.
If you are unable to apply online due to a disability, contact recruiting for assistance.
$85k-125k yearly 35d ago
Data Engineer
Tata Consulting Services 4.3
Data engineer job in Blue Ash, OH
* Proven experience as a Software Developer, with a strong focus on building scalable and efficient Python applications.
* Experience in developing Spark Structured Streaming applications is highly desirable.
* Minimum of 7+ years of professional software development experience.
* Strong analytical and problem-solving skills, with the ability to debug and optimize Spark jobs running on Databricks.
* Ability to work closely with cross-functional teams to deliver high-quality streaming solutions.
Technical Skills:
* Strong expertise in Python, PySpark, and Spark Structured Streaming.
* Experience with Databricks and Azure.
* Familiarity with Delta Lake and Terraform scripting.
* Proficiency in working with varied data file formats (Avro, JSON, CSV) for ingestion and transformation.
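As a minimal sketch of a Spark Structured Streaming application of the kind described above, the following PySpark example reads hypothetical JSON events and appends them to a Delta table. Paths, schema, and table names are assumptions; the Delta output presumes a Databricks-style runtime.

from pyspark.sql import SparkSession
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.getOrCreate()

# Hypothetical event schema and paths; real sources, schemas, and storage will differ.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

events = spark.readStream.schema(schema).json("/mnt/landing/orders/")

# Continuously append parsed events to a Delta table with checkpointing
# (assumes a Databricks-style runtime with Delta Lake available).
query = (
    events.writeStream
          .format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/orders/")
          .outputMode("append")
          .toTable("bronze_orders")
)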
Software Development:
* Proficiency in Object-Oriented Programming (OOP) concepts and software design principles.
* Ability to write clean, maintainable, and scalable Python code.
GitHub Actions:
* Experience in setting up and managing CI/CD pipelines using GitHub Actions to ensure smooth and automated deployment processes.
Agile Methodology:
* Experience working in an Agile/Scrum environment, with a focus on iterative development, continuous feedback, and delivery.
Nice to Haves:
* Python Unit Testing.
* Unity Catalog.
* Databricks Asset Bundles.
* Unit Testing/Mocking
TCS Employee Benefits Summary:
* Discretionary Annual Incentive.
* Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
* Family Support: Maternal & Parental Leaves.
* Insurance Options: Auto & Home Insurance, Identity Theft Protection.
* Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
* Time Off: Vacation, Time Off, Sick Leave & Holidays.
* Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
#LI-RJ2
Salary Range: $100,000-$120,000 a year
Data Engineer
Amend Consulting 4.0
Data engineer job in Cincinnati, OH
About AMEND: AMEND is a management consulting firm based in Cincinnati, OH with areas of focus in operations, analytics, and technology. We are focused on strengthening the people, processes, and systems in organizations to generate a holistic transformation. Our three-tiered approach provides a distinct competitive edge and allows us to build strong relationships and create customized solutions for every client. This is an incredible time to step into a growing team where everyone is aligned to a common goal to change lives, transform businesses, and make a positive impact on anything we touch.
Overview:
The Data Engineer consultant role is an incredibly exciting position in the fastest growing segment of AMEND. You will be working to solve real-world problems by designing cutting-edge analytic solutions while surrounded by a team of world-class talent. You will be entering an environment of explosive growth with ample opportunity for development. We are looking for individuals who can go into a client and optimize (or re-design) the company's data architecture, who combine the qualities of a change agent and technical leader, and who are passionate about transforming companies for the better. We need someone who is a problem solver, a critical thinker, and is always wanting to go after new things; you'll never be doing the same thing twice!
Job Tasks:
Create and maintain optimal data pipeline architecture
Assemble large, complex data sets that meet functional / non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
Define project requirements by identifying project milestones, phases, and deliverables
Execute project plan, report progress, identify and resolve problems, and recommend further actions
Delegate tasks to appropriate resources as project requirements dictate
Design, develop, and deliver audience training and adoption methods and materials
Qualifications:
Advanced working SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases. Databricks and DBT experience is a plus
Experience building and optimizing data pipelines, architectures, and data sets
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong analytic skills related to working with structured and unstructured datasets
Build processes supporting data transformation, data structures, metadata, dependency, and workload management
A successful history of manipulating, processing, and extracting value from large, disconnected datasets
Ability to interface with multiple other business functions (internally and externally)
Desire to build analytical competencies in others within the business
Curiosity to ask questions and challenge the status quo
Creativity to devise out-of-the-box solutions
Ability to travel as needed to meet client requirements
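As a small, assumed example of a pipeline step that transforms raw data and surfaces quality issues for root-cause analysis (both mentioned in the qualifications above), here is a short pandas sketch. The file names and columns are invented for illustration.

import pandas as pd

# Hypothetical raw extract; the file name and columns are invented for illustration.
orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

# Typical pipeline step: standardize keys, derive fields, and flag bad records.
orders["customer_id"] = orders["customer_id"].str.strip().str.upper()
orders["order_month"] = orders["order_date"].dt.to_period("M")
bad = orders[orders["order_total"] < 0]

if not bad.empty:
    # Surfacing anomalies like this is the starting point for root-cause analysis.
    bad.to_csv("rejected_orders.csv", index=False)

monthly = orders.groupby(["customer_id", "order_month"])["order_total"].sum()
monthly.to_csv("monthly_totals.csv")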
What's in it for you?
Competitive pay and bonus
Travel incentive bonus structure
Flexible time off
Investment in your growth and development
Full health, vision, dental, and life benefits
Paid parental leave
3:1 charity match
All this to say - we are looking for talented people who are excited to make an impact on our clients. If this job description isn't a perfect match for your skillset, but you are talented, eager to learn, and passionate about our work, please apply! Our recruiting process is centered around you as an individual and finding the best place for you to thrive at AMEND, whether it be with the specific title on this posting or something different. One recruiting conversation with us has the potential to open you up to our entire network of opportunities, so why not give it a shot? We're looking forward to connecting with you.
*Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of employment Visa at this time.*
Estimated salary: $70k-$91k per year
Senior Data Engineer
Apidel Technologies 4.1
Data engineer job in Blue Ash, OH
Job Description
The Engineer is responsible for staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks. The Engineer has overall responsibility in the technical design process, leading and participating in the application technical design process and completing estimates and work plans for design, development, implementation, and rollout tasks. The Engineer also communicates with the appropriate teams to ensure that assignments are delivered with the highest quality and in accordance with standards. The Engineer strives to continuously improve the software delivery processes and practices. Role model and demonstrate the company's core values of respect, honesty, integrity, diversity, inclusion and safety of others.
Current tools and technologies include:
Databricks and Netezza
Key Responsibilities
Lead and participate in the design and implementation of large and/or architecturally significant applications.
Champion company standards and best practices. Work to continuously improve software delivery processes and practices.
Build partnerships across the application, business and infrastructure teams.
Set up new customer data platforms, migrating from Netezza to Databricks.
Complete estimates and work plans independently as appropriate for design, development, implementation and rollout tasks.
Communicate with the appropriate teams to ensure that assignments are managed appropriately and that completed assignments are of the highest quality.
Support and maintain applications utilizing required tools and technologies.
May direct the day-to-day work activities of other team members.
Must be able to perform the essential functions of this position with or without reasonable accommodation.
Work quickly with the team to implement new platform.
Be onsite with development team when necessary.
Behaviors/Skills:
Puts the Customer First - Anticipates customer needs, champions for the customer, acts with customers in mind, exceeds customers' expectations, gains customers' trust and respect.
Communicates effectively and candidly - Communicates clearly and directly, approachable, relates well to others, engages people and helps them understand change, provides and seeks feedback, articulates clearly, actively listens.
Achieves results through teamwork - Is open to diverse ideas, works inclusively and collaboratively, holds self and others accountable, involves others to accomplish individual and team goals.
Note to Vendors
Length of contract: 9 months
Top skills: Databricks, Netezza
Soft skills needed: collaborating well with others, working in a team dynamic
Project the person will be supporting: staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks that will replace Netezza
Team details (i.e., size, dynamics, locations): most of the team is located in Cincinnati, working onsite at the BTD
Work location (in office, hybrid, remote): onsite at the BTD when necessary, approximately 2-3 days a week
Is travel required: No
Max rate (if applicable): best market rate
Required working hours: 8-5 EST
Interview process and when will it start: starting with one interview; process may change
Prescreening details: standard questions. Scores will carry over.
When do you want this person to start: looking to hire quickly; the team is looking to move fast.
Estimated salary: $79k-$114k per year
Data Engineer (Technology Solutions Analyst IV) (Vacancy)
City of Columbus, OH 4.0
Data engineer job in Franklin, OH
Definition
* Data Integration & ETL Development: Design, build, and maintain ETL/ELT pipelines using tools like Talend, Informatica, ADF, SSIS, or similar.
* SQL & Data Querying: Strong SQL skills for data validation, profiling, troubleshooting, and optimizing large datasets.
* Database Systems: Experience with relational databases (SQL Server, PostgreSQL, Oracle), including schemas, normalization, indexing, and performance.
* API & File-Based Integration: Knowledge of REST/SOAP APIs, JSON, XML, flat files, and batch transfers; able to integrate with external systems securely.
* Data Mapping & Transformation Logic: Develop source-to-target mappings, define transformation rules, and implement business logic in pipelines.
* Data Modeling: Familiarity with dimensional, relational, and operational models to support analytics and operations.
* Data Quality & Validation: Implement validation rules, quality checks, error handling, auditing, and reconciliation in data pipelines.
* Programming & Scripting: Proficiency in Python, Java, or Scala to write, optimize, and debug code for data processing and ETL workflows.
* Automation & Scheduling: Use job scheduling and orchestration tools like Airflow, Control-M, Cron, or platform-native schedulers.
* Cloud & Storage Concepts: Understanding of cloud storage, compute, and integration on Azure, AWS, or GCP (optional but valuable).
* Version Control & SDLC: Familiarity with Git and structured development processes, including code reviews, deployments, and documentation.
* Technical Requirement Gathering: Translate business requirements into technical specifications for data pipelines and integrations.
* Testing & Troubleshooting: Validate ETL processes, analyze outputs, resolve issues, and support repeatable UAT procedures.
* Documentation: Produce design documents, data flows, dictionaries, integration specs, and operational procedures.
* Analytical Problem-Solving: Analyze data flows and system behaviors to identify root causes and recommend fixes.
* Performance Optimization: Optimize queries, transformations, pipeline logic, and system configurations for throughput and reliability.
* Collaboration & Communication: Work with BAs, analysts, SMEs, application leads, and engineers to deliver accurate integrations.
* Tooling & Productivity: Use Smartsheet, Jira, SharePoint, or similar tools for task tracking and documentation.
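For the data-quality and scheduling items above, here is a minimal, hypothetical sketch of a scheduled validation step, assuming Airflow 2.x (2.4+) and pandas; the DAG id, file path, and column name are illustrative placeholders, not requirements from this posting.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCE_CSV = "/data/incoming/orders.csv"  # hypothetical landing file


def validate_orders() -> None:
    """Basic data-quality checks: non-empty load and no null business keys."""
    df = pd.read_csv(SOURCE_CSV)
    if df.empty:
        raise ValueError("validation failed: no rows loaded")
    null_keys = df["order_id"].isna().sum()  # hypothetical key column
    if null_keys:
        raise ValueError(f"validation failed: {null_keys} rows missing order_id")


with DAG(
    dag_id="orders_quality_check",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="validate_orders", python_callable=validate_orders)
```

Failing the task this way surfaces the error in the scheduler's monitoring, which is one common way validation rules and error handling are wired into a pipeline.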
Under direction, is responsible for researching, designing, developing, and improving enterprise data and/or application solutions; performs related duties as required.
Examples of Work
(Any one position may not include all of the duties listed, nor do the examples cover all of the duties that may be performed.)
Develops data and/or applications solutions, which may include utilization of supportive software, data management platforms, database management systems, server applications, and/or web-based development systems;
Confers with departmental or divisional personnel to analyze current operational procedures, document potential problems, and identify specific data input and output requirements for software system development;
Analyzes requirements of data and/or application solutions, such as licensing requirements, data requirements, peripheral equipment, maintenance support, server requirements, access on mobile devices, or other system interfaces and requirements to determine feasibility of system designs within time and budget constraints;
Reviews the outline of the new or revised business system process(es) and makes recommendations for improvements by defining the hardware and software requirements of the proposed system(s) or process(es); evaluates integration requirements for data and applications and supportive databases; develops solutions for integration and/or extraction of data for cross-platform dependencies among application systems; prepares cost estimates and project timelines for implementation;
Consults with technical staff to evaluate business processes, needs, expectations, and functional requirements and translates into technical requirements; designs and proposes solutions to streamline or enhance business functions/processes;
Develops solutions for citywide data integration and management; creates synthesized data models, transformations and visualizations for deployment of analytics to meet the business or operational needs of the City;
Researches third-party software systems for feasibility of design and compatibility and adaptability with existing architecture and business processes; reviews proposed hardware and software solutions and recommends selection to management for approval;
Formulates, designs, configures, and/or modifies software systems using scientific analysis to predict and measure outcome and consequences of design;
Develops and directs software system testing procedures, programming, and documentation;
Serves as senior consultant for database design, implementation, and administration, including security, backup, recovery, and maintenance; utilizes administrative rights within software applications to mass import updates to an application system;
Advises departmental or divisional personnel in their technology needs with regard to data corruption, security issues, computer viruses, and hardware and software redundancy;
Consults with staff members to evaluate interface between hardware and software and operational and performance requirements of software system; analyzes system growth and evaluates processing efficiency;
Plans and prepares technical reports, memoranda, and instructional manuals as documentation of system development;
Makes recommendations to management regarding the planning, development, and coordination of software system development projects;
Performs engineering cost/benefit analysis to verify potential effectiveness of new products; conducts technical research for building new designs, developing business cases, selling ideas to management, and gaining commitment for new system enhancements;
Mentors business systems analysts and senior programmer analysts in their work and individual projects as requested by management;
Participates in appropriate professional activities to stay abreast of existing and emerging technologies.
Minimum Qualifications
Possession of a bachelor's degree and four (4) years of experience in systems analysis, database management, applications development, or software design. Substitution(s): Valid possession of one (1) of the following certifications may be substituted for the required education: Microsoft Certified Solutions Developer (all tracks), Microsoft Certified Solutions Expert (Data Platform or Business Intelligence), or Geographic Information Systems Professional (GISP). Additional experience as specified above may substitute for the educational requirement on a year-for-year basis. Possession of a master's degree may be substituted for one (1) year of the required experience.
Test/Job Contact Information
Recruitment #: 25-0585-V7
Employment Type: Full-Time (Regular)
Should you have questions regarding this vacancy, please contact:
Kimberly Hetterscheidt
Department of Technology
Division of Information Services
1111 E. Broad St.
Columbus, Ohio 43205
P: **************
E:***************************
The City of Columbus is an Equal Opportunity Employer
$58k-72k yearly est. 21d ago
Senior Data Engineer
General Electric Credit Union 4.8
Data engineer job in Cincinnati, OH
General Electric Credit Union is a not-for-profit, member-owned full service financial institution headquartered in Cincinnati with branches in Ohio and Kentucky. At GECU, we pride ourselves on maintaining quality service, being an employee-friendly workplace, and developing our team members while teaching you the skills to lead you to career advancement opportunities.
Overview: The Senior Data Engineer will play a key role in developing and optimizing GECU's data infrastructure to support the organization's data-driven initiatives. The Senior Data Engineer will design, build, and maintain scalable data pipelines and systems, working with the data and development teams.
Essential Responsibilities:
Design, implement, and maintain robust, scalable, and high-performance data pipelines and ETL processes to collect, process, and store large volumes of structured and unstructured data.
Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
Develop and maintain data warehouse and data lake solutions, ensuring data quality, integrity, and reliability.
Optimize data pipelines and ETL processes for performance, efficiency, and cost-effectiveness, utilizing best practices and technologies.
Implement data governance and security measures to ensure compliance with regulatory requirements and data privacy standards.
Troubleshoot and resolve issues related to data processing, data quality, and system performance in a timely manner.
Evaluate and recommend new technologies, tools, and frameworks to enhance the organization's data infrastructure and capabilities.
Document technical specifications, data lineage, and system architecture to facilitate knowledge sharing and collaboration.
Collaborate with other key data employees to maintain and publish data definitions and data catalogue.
Stay up to date with industry trends and emerging technologies in data engineering and analytics.
Education and Experience:
High school diploma or GED required; Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree is a plus.
Minimum of 6 years' experience in data engineering, including work with data warehousing concepts, database technologies (e.g., SQL, NoSQL), and distributed computing architectures.
Experience with Snowflake Data Warehouse preferred
Knowledge, Skills, and Abilities:
Strong programming skills in languages such as Python, Java, Scala, or SQL, with experience in data manipulation, transformation, and analysis.
Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform.
Extensive knowledge of data modeling, schema design, and optimization techniques for relational and non-relational databases.
Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues.
Strong communication and collaboration skills, with the ability to work effectively in a team environment and interact with stakeholders at all levels.
Ability to perform independently and competently to accomplish necessary deliverables accurately and on time.
Ability to assist and mentor junior data engineers.
At GECU, we want to support your wellbeing by offering a wide range of benefits:
Health, Dental and Vision insurance
Life and Disability insurance options
Paid Time Off starts accruing once hired, and take your birthday off - paid
401k Retirement plan with up to a 10% match of your base gross compensation
Tuition reimbursement opportunities & professional development
Volunteer opportunities - and earn additional PTO hours!
On-site clinics for Vaccines and Mammograms
And many more!
Come join GECU and be part of a culture of respect, understanding, and mutual recognition. We believe forming bonds and connecting with each other only strengthens the service we provide to our members in our mission of improving the quality of their financial lives!
General Electric Credit Union is an Equal Opportunity Employer
$77k-101k yearly est. 60d+ ago
Data Scientist - Clinical and Operational Analytics
Venesco LLC
Data engineer job in Dayton, OH
Job Description:
Develop and deploy machine learning models to support clinical and operational decision-making. Work with large datasets to extract insights and support predictive analytics for human performance and health.
Requirements:
Mandatory Qualifications:
• Bachelor's degree in a quantitative field (e.g., Computer Science, Applied Math).
• 3+ years of experience in predictive analytics.
• Proficiency in Python, NumPy, Pandas, Matplotlib, and Scikit-learn.
• Ability to explain and implement ML algorithms from scratch.
• Signed NDA and HIPAA training required upon start.
Desired Qualifications:
• Experience with dashboard development and pretrained language models.
• Experience with dimensionality reduction and deep learning libraries (TensorFlow, PyTorch).
• Familiarity with human biology and performance.
Key Tasks and Responsibilities:
• Develop and tune unsupervised tree-based clustering models.
• Implement decision trees, k-NN, and optimized list sorting algorithms.
• Generate and minimize distance matrices using vectorized code (see the sketch after this list).
• Collaborate with software engineers and maintain HIPAA compliance.
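As a rough illustration of the distance-matrix and k-NN tasks listed above, the sketch below builds a pairwise Euclidean distance matrix with vectorized NumPy and cross-checks nearest neighbours with scikit-learn; the data is randomly generated and purely hypothetical, not clinical data.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))  # 200 hypothetical samples, 8 features

# Vectorized squared Euclidean distances: ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
sq = (X ** 2).sum(axis=1)
d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
np.clip(d2, 0.0, None, out=d2)   # guard against tiny negatives from round-off
dist = np.sqrt(d2)               # full 200 x 200 distance matrix

# Cross-check with scikit-learn's k-NN (each point's closest neighbour is itself)
nn = NearestNeighbors(n_neighbors=5).fit(X)
distances, indices = nn.kneighbors(X)
assert np.allclose(distances[:, 0], 0.0)
```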
$69k-95k yearly est. 7d ago
ETL Architect
Scadea Solutions
Data engineer job in Cincinnati, OH
Job title: ETL Architect
DURATION: 18 months
YEARS OF EXPERIENCE: 7-10
INTERVIEW TYPE: Phone Screen to Hire
REQUIRED SKILLS
• Experience with Data Stage and ETL design
Technical
• Requirement gathering; converting business requirements to technical specs for profiling
• Has worked hands-on on at least two projects with DataStage
• Understands the process of developing an ETL design that supports multiple DataStage developers
• Able to create an ETL design framework and related specifications for use by ETL developers
• Defines standards and best practices for DataStage ETL to be followed by all DataStage developers
• Understanding of data warehouse and data mart concepts, with implementation experience
• Able to review produced code to ensure conformance with the developed ETL framework and design for reuse
• Preferably, experienced user-level competency with IBM's metadata products, DataStage and the InfoSphere product line
• Able to design ETL for Oracle, SQL Server, or any database
• Good analytical skills and process design
• Ensures compliance with quality standards and delivery timelines.
Qualifications
Bachelor's degree
Additional Information
Required Skills:
Job Description:
Performs highly complex application programming/systems development and support. Performs highly complex configuration of business rules and technical parameters of software products. Reviews business requirements and develops application design documentation. Builds technical components (Maximo objects, TRM Rules, Java extensions, etc.) based on detailed design.
Performs unit testing of components along with completing necessary documentation. Supports product test, user acceptance test, etc., as a member of the fix-it team. Employs consistent measurement techniques. Includes testing in project plans and establishes controls to require adherence to test plans. Manages the interrelationships among various projects or work objectives.
$86k-113k yearly est. 60d+ ago
Junior Data Scientist
Medpace 4.5
Data engineer job in Cincinnati, OH
The Medpace Analytics and Business Intelligence team is growing rapidly and is focused on building a data driven culture across the enterprise. The BI team uses data and insights to drive increased strategic and operational efficiencies across the organization. As a Junior Data Scientist, you will hold a highly visible analytical role that requires interaction and partnership with leadership across the Medpace organization.
What's in this for you?
* Work in a collaborative, fast paced, entrepreneurial, and innovative workplace;
* Gain experience and exposure to advanced BI concepts from visualization to data warehousing;
* Grow business knowledge by working with leadership across all aspects of Medpace's business.
Responsibilities
* Data Collection & Cleaning: Gather, clean, and preprocess large, raw datasets;
* Analysis & Modeling: Perform statistical analysis, build & validate machine learning models, and test hypotheses (e.g., A/B testing; see the sketch after this list);
* Algorithm Development: Create algorithms to manage and interpret information, often automating processes;
* Insight Generation: Discover trends, patterns, and insights to inform business strategy;
* Visualization & Communication: Present complex findings visually (dashboards, charts) and verbally to technical and non-technical teams;
* Collaboration: Work with engineering, product, and business teams to implement solutions;
* Model Monitoring: Deploy and maintain models, iterating for continuous improvement.
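To make the A/B-testing item above concrete, here is a minimal sketch of a two-proportion z-test using statsmodels; the conversion counts and sample sizes are invented for illustration and are not Medpace data.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical conversion counts for variants A and B
conversions = np.array([342, 391])
exposures = np.array([5000, 5000])

# Two-sided test of H0: the two conversion rates are equal
stat, p_value = proportions_ztest(conversions, exposures)
print(f"z = {stat:.3f}, p = {p_value:.4f}")
```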
Qualifications
* Bachelor's Degree in Business, Life Science, Computer Science, or Related Degree;
* 0-3 years of experience in business intelligence or analytics; Python, R, and SQL heavily preferred;
* Strong analytical and communication skills;
* Excellent organization skills and the ability to multitask while efficiently completing high quality work.
Medpace Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Why Medpace?
People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Cincinnati Perks
* Cincinnati Campus Overview
* Flexible work environment
* Competitive PTO packages, starting at 20+ days
* Competitive compensation and benefits package
* Company-sponsored employee appreciation events
* Employee health and wellness initiatives
* Community involvement with local nonprofit organizations
* Discounts on local sports games, fitness gyms and attractions
* Modern, ecofriendly campus with an on-site fitness center
* Structured career paths with opportunities for professional growth
* Discounted tuition for UC online programs
Awards
* Named a Top Workplace in 2024 by The Cincinnati Enquirer
* Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
* Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility
What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
$69k-98k yearly est. Auto-Apply 11d ago
Senior Data Engineer (P358)
84.51 4.3
Data engineer job in Cincinnati, OH
84.51° is a retail data science, insights and media company. We help The Kroger Co., consumer packaged goods companies, agencies, publishers and affiliates create more personalized and valuable experiences for shoppers across the path to purchase.
Powered by cutting-edge science, we utilize first-party retail data from more than 62 million U.S. households sourced through the Kroger Plus loyalty card program to fuel a more customer-centric journey using 84.51° Insights, 84.51° Loyalty Marketing and our retail media advertising solution, Kroger Precision Marketing.
Join us at 84.51°!
__________________________________________________________
PLEASE NOTE:
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with the Kroger Family of Companies (e.g., H-1B visa, F-1 visa (OPT), TN visa, or any other non-immigrant status).
Senior Data Engineer, AI Enablement (P358) Summary
As a Senior Data Engineer on our AI Enablement team, you will cultivate strategies and solutions to ingest, store and distribute our big data. This role is on our enablement team that builds solutions for monitoring, registering, and tracking our machine learning and AI solutions across 84.51° and develops monitoring and observability pipelines for our internal AI tooling. Our engineers use PySpark, Python, SQL, GitHub Actions, and Databricks/Azure to develop scalable data solutions.
Responsibilities
Take ownership of features and drive them to completion through all phases of the entire 84.51° SDLC. This includes external facing and internal applications as well as process improvement activities such as:
Lead design of Python- and PySpark-based solutions (see the sketch after this list)
Perform development of cloud based (Azure) ETL solutions
Build and configure cloud infrastructure for all stages of the data development lifecycle
Execute unit and integration testing
Develop robust data QA processes
Collaborate with senior resources to ensure consistent development practices
Provide mentoring to junior resources
Build visualizations in Databricks Apps, Databricks Dashboards and PowerBI
Bring new perspectives to problems and be driven to improve yourself and the way things are done
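For the Python/PySpark design responsibilities above, a minimal, hypothetical ETL sketch might look like the following; the paths and column names are placeholders, and the example assumes a plain Spark session rather than any specific Databricks workspace.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

RAW_PATH = "/mnt/raw/orders"                 # hypothetical input location
CURATED_PATH = "/mnt/curated/orders_daily"   # hypothetical output location

raw = spark.read.parquet(RAW_PATH)

# Filter, derive a partition column, and aggregate to a daily grain
curated = (
    raw.filter(F.col("status") == "COMPLETE")             # hypothetical status column
       .withColumn("order_date", F.to_date("order_ts"))   # hypothetical timestamp column
       .groupBy("order_date", "store_id")
       .agg(
           F.sum("amount").alias("daily_sales"),
           F.countDistinct("order_id").alias("order_count"),
       )
)

curated.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```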
Qualifications, Skills, and Experience
Bachelor's degree in Computer Science, Management Information Systems, Mathematics, Business Analytics or another technically strong program.
3+ years of proven professional data engineering experience
Strong understanding of Agile principles (Scrum)
3+ years of proven ability developing with Python and PySpark
Full understanding of ETL concepts and data warehousing concepts
Experience with CI/CD frameworks (GitHub Actions a plus)
Experience with visualization techniques and tools like Databricks dashboards or PowerBI a plus
Languages/Tech stack:
Python
PySpark
Terraform
Databricks
GitHub Actions
Azure
AKS experience a plus
Dashboard experience a plus
WebApp experience a plus
#LI-SSS
Pay Transparency and Benefits
The stated salary range represents the entire span applicable across all geographic markets from lowest to highest. Actual salary offers will be determined by multiple factors including but not limited to geographic location, relevant experience, knowledge, skills, other job-related qualifications, and alignment with market data and cost of labor. In addition to salary, this position is also eligible for variable compensation.
Below is a list of some of the benefits we offer our associates:
Health: Medical: with competitive plan designs and support for self-care, wellness and mental health. Dental: with in-network and out-of-network benefit. Vision: with in-network and out-of-network benefit.
Wealth: 401(k) with Roth option and matching contribution. Health Savings Account with matching contribution (requires participation in qualifying medical plan). AD&D and supplemental insurance options to help ensure additional protection for you.
Happiness: Paid time off with flexibility to meet your life needs, including 5 weeks of vacation time, 7 health and wellness days, 3 floating holidays, as well as 6 company-paid holidays per year. Paid leave for maternity, paternity and family care instances.
Pay Range: $97,000-$166,750 USD
$97k-166.8k yearly Auto-Apply 4d ago
Data Engineer
Tata Consulting Services 4.3
Data engineer job in Cincinnati, OH
* 2+ years of proven professional data development experience
* 2+ years developing with SQL
* 4+ years Python development
* 3+ years Java, Spring Framework development
* Object Oriented Programming
* 3+ years Distributed Data Processing (PySpark, Snowpark)
* Proficient CI/CD practices
* Automated data pipeline orchestration
* Data observability - Logging, Monitoring, and Alerting
* Databricks and/or Snowflake
* API development
* Data quality checks
* Cloud Technologies (Azure preferred)
Roles & Responsibilities:
* Develop distributed data processing data pipeline solutions
* Orchestrate multi-step data transformation pipelines
* Perform unit, integration, and regression testing on packaged code
* Build transformation logic and code in an Object Oriented Programming style
* Enhance CI/CD pipelines in the path to production
* Create data quality checks for ingested and post-processed data (see the sketch after this list)
* Ensure data observability via alerting and monitoring of automated pipeline solutions
* Maintain and enhance existing applications
* Build cloud resources via infrastructure as code
* Provide mentoring to junior team members
* Participate in retrospective reviews
* Participate in the estimation process for new work and releases
* Bring new perspectives to problems
* Be driven to improve yourself and the way things are done
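As a sketch of the data quality and observability items above (null-rate checks surfaced through logging for alerting), assuming PySpark and Python's standard logging module; the path, column names, and threshold are illustrative only.

```python
import logging

from pyspark.sql import SparkSession, functions as F

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq_checks")

spark = SparkSession.builder.appName("dq_sketch").getOrCreate()
df = spark.read.parquet("/mnt/curated/orders_daily")  # hypothetical dataset

REQUIRED_COLUMNS = ["order_date", "store_id", "daily_sales"]  # hypothetical schema
MAX_NULL_RATE = 0.01

total = df.count()
for col in REQUIRED_COLUMNS:
    nulls = df.filter(F.col(col).isNull()).count()
    rate = nulls / total if total else 1.0
    if rate > MAX_NULL_RATE:
        # In a real pipeline this log line would feed a monitoring/alerting sink
        log.error("column %s failed null check: %.2f%% null", col, 100 * rate)
    else:
        log.info("column %s passed null check (%d nulls of %d rows)", col, nulls, total)
```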
TCS Employee Benefits Summary:
* Discretionary Annual Incentive.
* Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
* Family Support: Maternal & Parental Leaves.
* Insurance Options: Auto & Home Insurance, Identity Theft Protection.
* Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
* Time Off: Vacation, Time Off, Sick Leave & Holidays.
* Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
#LI-RJ2
Salary Range-$100,000-$120,000 a year
$100k-120k yearly 5d ago
Junior Data Engineer
Medpace 4.5
Data engineer job in Cincinnati, OH
Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Junior Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.
Responsibilities
* Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL);
* Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools, (dbt, Azure Data Factory, SSIS);
* Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI;
* Collect, analyze and document user requirements;
* Participate in software validation process through development, review, and/or execution of test plan/cases/scripts;
* Create software applications by following software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance;
* Communicate with team members regarding projects, development, tools, and procedures; and
* Provide end-user support including setup, installation, and maintenance for applications
Qualifications
* Bachelor's Degree in Computer Science, Data Science, or a related field;
* Internship experience in Data or Software Engineering;
* Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs;
* Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures;
* Knowledge of Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred;
* Knowledge of Python is preferred;
* Knowledge of REST API;
* Basic knowledge of SQL Server databases is required;
* Knowledge of C#, Azure development is a bonus; and
* Excellent analytical, written and oral communication skills.
Medpace Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Why Medpace?
People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Cincinnati Perks
* Cincinnati Campus Overview
* Flexible work environment
* Competitive PTO packages, starting at 20+ days
* Competitive compensation and benefits package
* Company-sponsored employee appreciation events
* Employee health and wellness initiatives
* Community involvement with local nonprofit organizations
* Discounts on local sports games, fitness gyms and attractions
* Modern, ecofriendly campus with an on-site fitness center
* Structured career paths with opportunities for professional growth
* Discounted tuition for UC online programs
Awards
* Named a Top Workplace in 2024 by The Cincinnati Enquirer
* Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
* Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility
What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.
$80k-111k yearly est. Auto-Apply 7d ago
Go Anywhere SFTP Data Engineer
Tata Consulting Services 4.3
Data engineer job in Cincinnati, OH
* Maintain robust data pipelines for ingesting and processing data from various sources, including SFTP servers.
* Implement and manage automated SFTP data transfers, ensuring data security, integrity, and timely delivery.
* Configure and troubleshoot SFTP connections, including handling authentication, key management, and directory structures.
* Develop and maintain scripts or tools for automating SFTP-related tasks, such as file monitoring, error handling, and data validation (see the sketch after this list).
* Collaborate with external teams and vendors to establish and maintain secure SFTP connections for data exchange.
* Ensure compliance with data security and governance policies related to SFTP transfers.
* Monitor and optimize SFTP performance, addressing any bottlenecks or issues.
* Document SFTP integration processes, configurations, and best practices.
* Responsible for providing monthly SOC controls.
* Experience with Solimar software.
* Responsible for periodic software updates and patching.
* Manage open incidents.
* Responsible for after-hours and weekends on-call duties
* Minimum of 3-5 years of related work experience
* Experience with Microsoft Software and associated server tools.
* Experience with GoAnywhere managed file transfer (MFT) solution.
* Experience with WinSCP
* Experience with Azure Cloud
* Proven experience in data engineering, with a strong emphasis on data ingestion and pipeline development.
* Demonstrated expertise in working with SFTP for data transfer and integration.
* Proficiency in scripting languages (e.g., Python, Shell) for automating SFTP tasks.
* Familiarity with various SFTP clients, servers, and related security protocols.
* Understanding of data security best practices, including encryption and access control for SFTP.
* Experience with cloud platforms (e.g., AWS, Azure, GCP) and their SFTP integration capabilities is a plus.
* Strong problem-solving and troubleshooting skills related to data transfer and integration issues.
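To illustrate the SFTP scripting items above, here is a minimal sketch of an automated, key-based download with basic error handling, assuming the paramiko library; the host, user, key path, and file paths are hypothetical placeholders (the role itself centers on GoAnywhere MFT, which is configured rather than hand-scripted).

```python
import logging

import paramiko

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sftp_pull")

HOST, PORT = "sftp.example.com", 22              # hypothetical endpoint
USER, KEY_PATH = "svc_account", "/home/svc/.ssh/id_rsa"  # hypothetical credentials


def pull_file(remote_path: str, local_path: str) -> None:
    """Download one file over SFTP with key-based auth and basic error handling."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin known_hosts in production
    try:
        client.connect(HOST, port=PORT, username=USER, key_filename=KEY_PATH, timeout=30)
        sftp = client.open_sftp()
        sftp.get(remote_path, local_path)
        log.info("downloaded %s -> %s", remote_path, local_path)
    except (paramiko.SSHException, OSError) as exc:
        log.error("transfer failed: %s", exc)
        raise
    finally:
        client.close()


if __name__ == "__main__":
    pull_file("/outbound/report.csv", "/tmp/report.csv")  # hypothetical paths
```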
Salary Range- $80,000-$85,000 a year
#LI-SP3
#LI-VX1
The average data engineer in Dayton, OH earns between $66,000 and $116,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Dayton, OH
$87,000
What are the biggest employers of Data Engineers in Dayton, OH?
The biggest employers of Data Engineers in Dayton, OH are: