The application window is expected to close on: 01/31/2026. **Job posting may be removed earlier if the position is filled or if a sufficient number of applications are received.**
Job location preference: Remote within USA, RTP-NC and Austin-TX
Meet the Team
Join the Cisco IT Data team, where innovation, automation, and reliability drive world-class business outcomes. Our team delivers scalable, secure, and high-performance platforms supporting Cisco's global data operations. We value a culture of continuous improvement, collaboration, and technical excellence, empowering team members to experiment and drive operational transformation.
Your Impact
As a Data Operations (DevOps) Engineer, you will play a meaningful role in building, automating, and optimizing the infrastructure and processes that support the Corporate Functions - Enterprise Data Warehouse. Your expertise will ensure the reliability, scalability, and security of data platforms and pipelines across cloud and on-premises environments. You'll collaborate closely with data engineers, software engineers, architects, and business partners to create robust solutions that accelerate data-driven decision-making at Cisco.
Key Responsibilities
* Automate deployment, monitoring, and management of data platforms and pipelines using industry-standard DevOps tools and standard processes.
* Design, implement, and maintain CI/CD pipelines for ETL, analytics, and data applications (e.g., Informatica, DBT, Airflow, Python, Java); see the illustrative orchestration sketch after this list.
* Ensure high availability, performance, and security of data systems in cloud (Snowflake, Google BigQuery, AWS/GCP/Azure) and hybrid environments.
* Lead infrastructure as code (Terraform, CloudFormation, or similar) to provision and scale resources efficiently.
* Implement observability and data quality monitoring using modern tools (e.g., Monte Carlo, Prometheus, Grafana, ELK).
* Troubleshoot and resolve issues in production data pipelines and workflows, collaborating with engineering and analytics teams on root cause analysis and solution delivery.
* Drive automation and process improvement for data operations, system upgrades, patching, and access management.
* Contribute to security and compliance initiatives related to data governance, access controls, and audit readiness.
* Mentor and support junior engineers, encouraging a culture of knowledge sharing and operational excellence.
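To give a flavor of the orchestration work named in the list above, here is a minimal, hypothetical Apache Airflow DAG sketch. The DAG id, task names, and callables are illustrative assumptions, not actual Cisco pipelines:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system (e.g., an ETL staging area).
    pass


def load():
    # Placeholder: load the transformed data into the warehouse.
    pass


# A two-task daily ETL DAG; "example_etl", "extract", and "load" are assumed names.
with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```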
Minimum Qualifications
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 5-8 years of experience in DevOps, Data Operations, or related IT engineering roles.
* Proficiency with cloud platforms (Snowflake, AWS).
* Hands-on experience with CI/CD tools (Jenkins, GitLab CI, etc.), scripting (Python, Shell), and configuration management.
* Working knowledge of ETL and workflow orchestration tools (Informatica, DBT, Airflow).
* Familiarity with infrastructure as code (Terraform, CloudFormation, etc.).
* 5+ years of experience with monitoring, logging, and alerting solutions (Prometheus, Grafana, ELK, Monte Carlo, etc.).
* 5-8 years of experience with containerization and orchestration (Docker, Kubernetes).
* 5+ years of troubleshooting, incident management, and problem-solving experience.
* Experience working in Agile/Scrum teams and delivering in fast-paced environments.
Preferred Qualifications
* Experience supporting data warehouse or analytics platforms in enterprise settings.
* Knowledge of data quality, security, and governance frameworks.
* Familiarity with automation tools and standard methodologies for operational efficiency.
* Understanding of data pipelines, modeling, and analytics.
* Excellent communication, collaboration, and documentation skills.
**Why Cisco?**
At Cisco, we're revolutionizing how data and infrastructure connect and protect organizations in the AI era - and beyond. We've been innovating fearlessly for 40 years to create solutions that power how humans and technology work together across the physical and digital worlds. These solutions provide customers with unparalleled security, visibility, and insights across the entire digital footprint.
Fueled by the depth and breadth of our technology, we experiment and create meaningful solutions. Add to that our worldwide network of doers and experts, and you'll see that the opportunities to grow and build are limitless. We work as a team, collaborating with empathy to make really big things happen on a global scale. Because our solutions are everywhere, our impact is everywhere.
We are Cisco, and our power starts with you.
**Message to applicants applying to work in the U.S. and/or Canada:**
The starting salary range posted for this position is $165,000.00 to $241,400.00 and reflects the projected salary range for new hires in this position in U.S. and/or Canada locations, not including incentive compensation*, equity, or benefits.
Individual pay is determined by the candidate's hiring location, market conditions, job-related skillset, experience, qualifications, education, certifications, and/or training. The full salary range for certain locations is listed below. For locations not listed below, the recruiter can share more details about compensation for the role in your location during the hiring process.
U.S. employees are offered benefits, subject to Cisco's plan eligibility rules, which include medical, dental and vision insurance, a 401(k) plan with a Cisco matching contribution, paid parental leave, short and long-term disability coverage, and basic life insurance. Please see the Cisco careers site to discover more benefits and perks. Employees may be eligible to receive grants of Cisco restricted stock units, which vest following continued employment with Cisco for defined periods of time.
U.S. employees are eligible for paid time away as described below, subject to Cisco's policies:
+ 10 paid holidays per full calendar year, plus 1 floating holiday for non-exempt employees
+ 1 paid day off for employee's birthday, paid year-end holiday shutdown, and 4 paid days off for personal wellness determined by Cisco
+ Non-exempt employees** receive 16 days of paid vacation time per full calendar year, accrued at a rate of 4.92 hours per pay period for full-time employees
+ Exempt employees participate in Cisco's flexible vacation time off program, which has no defined limit on how much vacation time eligible employees may use (subject to availability and some business limitations)
+ 80 hours of sick time off provided on hire date and each January 1st thereafter, and up to 80 hours of unused sick time carried forward from one calendar year to the next
+ Additional paid time away may be requested to deal with critical or emergency issues for family members
+ Optional 10 paid days per full calendar year to volunteer
For non-sales roles, employees are also eligible to earn annual bonuses subject to Cisco's policies.
Employees on sales plans earn performance-based incentive pay on top of their base salary, which is split between quota and non-quota components, subject to the applicable Cisco plan. For quota-based incentive pay, Cisco typically pays as follows (a worked example of this arithmetic appears below):
+ 0.75% of incentive target for each 1% of revenue attainment up to 50% of quota;
+ 1.5% of incentive target for each 1% of attainment between 50% and 75%;
+ 1% of incentive target for each 1% of attainment between 75% and 100%; and
+ Once performance exceeds 100% attainment, incentive rates are at or above 1% for each 1% of attainment with no cap on incentive compensation.
For non-quota-based sales performance elements such as strategic sales objectives, Cisco may pay 0% up to 125% of target. Cisco sales plans do not have a minimum threshold of performance for sales incentive compensation to be paid.
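Read cumulatively, these tiers pay out exactly 100% of target at 100% attainment (50 × 0.75 + 25 × 1.5 + 25 × 1.0 = 100). Below is a minimal Python sketch of that arithmetic; the rate above 100% attainment is modeled as exactly 1% per 1%, which is only the lower bound stated above, so this is illustrative rather than an official calculator:

```python
def incentive_payout_pct(attainment_pct: float) -> float:
    """Cumulative payout as a percent of incentive target for a given revenue
    attainment percent, per the tiered rates listed above. Above 100% attainment
    the posting only guarantees "at or above" 1% per 1%, so 1% is assumed here."""
    tiers = [
        (50.0, 0.75),   # 0.75% of target per 1% of attainment up to 50% of quota
        (75.0, 1.50),   # 1.5% per 1% between 50% and 75%
        (100.0, 1.00),  # 1% per 1% between 75% and 100%
    ]
    payout, prev_cap = 0.0, 0.0
    for cap, rate in tiers:
        band = max(0.0, min(attainment_pct, cap) - prev_cap)
        payout += band * rate
        prev_cap = cap
    payout += max(0.0, attainment_pct - 100.0) * 1.0  # uncapped tier above 100%
    return payout

# 100% attainment pays exactly 100% of target: 50*0.75 + 25*1.5 + 25*1.0 = 100
assert incentive_payout_pct(100.0) == 100.0
print(incentive_payout_pct(60.0))  # 52.5 (% of incentive target)
```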
The applicable full salary ranges for this position, by specific state, are listed below:
New York City Metro Area:
$165,000.00 - $277,600.00
Non-Metro New York state & Washington state:
$146,700.00 - $247,000.00
* For quota-based sales roles on Cisco's sales plan, the ranges provided in this posting include base pay and sales target incentive compensation combined.
** Employees in Illinois, whether exempt or non-exempt, will participate in a unique time off program to meet local requirements.
Cisco is an Affirmative Action and Equal Opportunity Employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, genetic information, age, disability, veteran status, or any other legally protected basis.
Cisco will consider for employment, on a case by case basis, qualified applicants with arrest and conviction records.
$93k-114k yearly est. 60d+ ago
Senior Staff Software Engineer (Hybrid) - ES Threat Detections
Cisco Systems, Inc. 4.8
Data engineer job at Cisco
This role is hybrid with a balance between onsite and remote work.
Meet the Team
The ES Threat Detections engineering team is home to promising talent and is pivotal in shaping the roadmap for Enterprise Security's content management. Their work on evolving detection methodologies positions them at the forefront of cybersecurity innovation, directly impacting how organizations detect and mitigate threats effectively. The team consists of dedicated engineers who are not only technically skilled but also highly collaborative and fun to work with. They foster a high-performance culture that values innovation, teamwork, and continuous improvement, making it an exciting and rewarding environment.
We are a passionate, collaborative team that cares deeply about our customers and teammates. In this role, you will work directly with Product Management, Architects, our Design and other engineering teams to help derive the best experience for the customer. We have a lean process that focuses on empowering and serving our engineers as opposed to just advising them.
Your Impact
As a Senior Staff Software Engineer, you will lead, inspire, and develop a high-performing team that delivers innovative AI/ML solutions at scale. Your leadership will shape the AI strategy, guide scientific and engineering innovation, and ensure Splunk's products remain at the cutting edge of cybersecurity and observability. You will:
* Develop software consistent with Cisco 'Design Thinking Principles' with a focus on simplification and UX (User Experience) at its core, using secure coding practices, ensuring user privacy, and following software development best practices.
* Partner with other teams, including design and product management, to create the right solution for customers.
* Create technical design documentation for the team and contribute to user documentation for end users.
* Debug and address software issues during development and in production systems to support customers.
* Bring new ideas for product innovation and help improve software development processes.
What You'll Do
* Recognized as an expert within Cisco.
* Proactively identifies and participates in the resolution of complex problems that impact the direction of the business.
* Develops and delivers innovative strategies that benefit customers and/or clients.
* Leads major business projects which impact a region or entire function.
* Contributes to the development of annual organizational objectives/priorities.
* Builds and maintains long-term relationships with key stakeholders to ensure products and services reflect the needs and direction of the business.
* Translates and influences the regional or functional strategic vision into a technical vision with clear engineering priorities, ensuring understanding and alignment across teams.
* May lead projects that span the business segment, function or region.
* May drive large features from technical design through completion.
* Leads the development and implementation of software development lifecycle and agile engineering strategies across applicable teams in anticipation of the changing software development environment.
* Influences technical direction and deliverables across the function.
* Recognizes and makes trade-offs with respect to the whole system and defines new product categories.
* Leads design choices within the context of all Cisco offerings.
* Drives product success through creating metrics and influencing product adoption and direction.
* Writes and sets the standard for functional clean code.
* Develops and drives utilization and support for processes for issue identification and sustainable response, proactive tooling, and continuous improvement for production systems.
* Industry or thought leader on scale, performance, or design patterns.
* May represent or present team outputs at external events.
* Communicates product or program priorities, deadlines, and shifts in requirements in the context of engineering priorities and user needs.
* Negotiates and drives trade-offs in timing, design, and specifications to meet the needs of users and cross-functional partners.
* Leads, mentors, and influences senior engineers.
* Improves knowledge sharing processes.
Minimum Qualifications
* Bachelor's + 12 years of related experience, or Master's + 8 years of related experience, or PhD + 5 years of related experience.
* Experience incorporating AI coding tools to streamline and enhance the process and speed of software development.
* Expert in languages such as Python, Java, C/C++, or similar.
* Expert in client-side scripting and JavaScript frameworks (React and/or BackboneJS)
* Expertise in front-end technologies (HTML5, CSS3, Responsive Design, etc.)
* A deep understanding of data structures, algorithms, and RESTful APIs.
* A deep understanding of scalable distributed web applications using open source or proprietary technologies.
* Experience integrating agentic AI and assistive AI into web applications.
* Ability to learn new technologies quickly and to understand a wide variety of technical challenges to be solved.
* Strong collaborative and interpersonal skills, specifically a consistent track record to effectively work with others within a dynamic environment.
* Has a wide range of experience and sophisticated technical acumen, serving as an advisor to management.
Preferred Qualifications
* Proponent of test-driven development (TDD) and understanding of CI/CD technologies.
* Experience with secure coding practices.
* Familiarity with orchestration and cloud stack and technologies like K8s, Kinesis, Kafka.
* SIEM or data platform architecture with an understanding of scale, latency, cost, and schema realities
* Detection Engineering/Threat expertise
* Understanding of MITRE ATT&CK and threat actor TTP and kill chain concepts
* Hands on with SIEM detection languages (SPL, KQL, YARA-L, Sigma)
* Detection as code concepts - versioning, CI/CD, validation
Why Cisco?
At Cisco, we're revolutionizing how data and infrastructure connect and protect organizations in the AI era - and beyond. We've been innovating fearlessly for 40 years to create solutions that power how humans and technology work together across the physical and digital worlds. These solutions provide customers with unparalleled security, visibility, and insights across the entire digital footprint.
Fueled by the depth and breadth of our technology, we experiment and create meaningful solutions. Add to that our worldwide network of doers and experts, and you'll see that the opportunities to grow and build are limitless. We work as a team, collaborating with empathy to make really big things happen on a global scale. Because our solutions are everywhere, our impact is everywhere.
We are Cisco, and our power starts with you.
Message to applicants applying to work in the U.S. and/or Canada:
The starting salary range posted for this position is $210,600.00 to $305,100.00 and reflects the projected salary range for new hires in this position in U.S. and/or Canada locations, not including incentive compensation*, equity, or benefits.
Individual pay is determined by the candidate's hiring location, market conditions, job-related skillset, experience, qualifications, education, certifications, and/or training. The full salary range for certain locations is listed below. For locations not listed below, the recruiter can share more details about compensation for the role in your location during the hiring process.
U.S. employees are offered benefits, subject to Cisco's plan eligibility rules, which include medical, dental and vision insurance, a 401(k) plan with a Cisco matching contribution, paid parental leave, short and long-term disability coverage, and basic life insurance. Please see the Cisco careers site to discover more benefits and perks. Employees may be eligible to receive grants of Cisco restricted stock units, which vest following continued employment with Cisco for defined periods of time.
U.S. employees are eligible for paid time away as described below, subject to Cisco's policies:
* 10 paid holidays per full calendar year, plus 1 floating holiday for non-exempt employees
* 1 paid day off for employee's birthday, paid year-end holiday shutdown, and 4 paid days off for personal wellness determined by Cisco
* Non-exempt employees receive 16 days of paid vacation time per full calendar year, accrued at a rate of 4.92 hours per pay period for full-time employees
* Exempt employees participate in Cisco's flexible vacation time off program, which has no defined limit on how much vacation time eligible employees may use (subject to availability and some business limitations)
* 80 hours of sick time off provided on hire date and each January 1st thereafter, and up to 80 hours of unused sick time carried forward from one calendar year to the next
* Additional paid time away may be requested to deal with critical or emergency issues for family members
* Optional 10 paid days per full calendar year to volunteer
For non-sales roles, employees are also eligible to earn annual bonuses subject to Cisco's policies.
Employees on sales plans earn performance-based incentive pay on top of their base salary, which is split between quota and non-quota components, subject to the applicable Cisco plan. For quota-based incentive pay, Cisco typically pays as follows:
* 0.75% of incentive target for each 1% of revenue attainment up to 50% of quota;
* 1.5% of incentive target for each 1% of attainment between 50% and 75%;
* 1% of incentive target for each 1% of attainment between 75% and 100%; and
* Once performance exceeds 100% attainment, incentive rates are at or above 1% for each 1% of attainment with no cap on incentive compensation.
For non-quota-based sales performance elements such as strategic sales objectives, Cisco may pay 0% up to 125% of target. Cisco sales plans do not have a minimum threshold of performance for sales incentive compensation to be paid.
The applicable full salary ranges for this position, by specific state, are listed below:
New York City Metro Area:
$210,600.00 - $350,800.00
Non-Metro New York state & Washington state:
$189,300.00 - $312,200.00
* For quota-based sales roles on Cisco's sales plan, the ranges provided in this posting include base pay and sales target incentive compensation combined.
Employees in Illinois, whether exempt or non-exempt, will participate in a unique time off program to meet local requirements.
$105k-133k yearly est. 22d ago
Data Science Leader, Analytics
Meta 4.8
Columbus, OH jobs
As a Data Science Manager at Meta, you will play a key role in shaping the future of experiences for billions of people and hundreds of millions of businesses worldwide. You will apply your leadership, project management, analytical, technical, creative, and product intuition skills to one of the largest data sets globally. Your primary focus will be on driving impact through quality, efficiency, and velocity by collaborating with cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance, and others. You will use data to understand product and business ecosystems, quantify new opportunities, identify upcoming challenges, and shape product development to bring value to people, businesses, and Meta. You will guide product teams using data and insights, develop hypotheses, and employ rigorous analytical approaches to test them. Additionally, you will tell data-driven stories, convince and influence leaders using clear insights and recommendations, and build credibility as a trusted strategic partner. As a leader, you will inspire, lead, and grow a world-class team of data scientists and data science leaders. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Science Leader, Analytics Responsibilities:
1. Drive analytics projects end-to-end in partnership with cross-functional teams to inform and execute product strategy and investment decisions
2. Inspire, lead, and grow a team of data scientists and managers to fulfill long-term vision and goals
3. Actively influence the design of strategy and roadmap within scope, generating and using team insights to set and prioritize longer-term goals
4. Develop understanding of complex systems, industry challenges, and broader trends to identify present and future risks and opportunities
5. Work with large and complex data sets to solve challenging problems using different analytical and statistical approaches
6. Grow analytics expertise around you, upskilling your team, engineers, and others to increase overall team impact
7. Define key metrics for measuring model effectiveness and drive insight to action by identifying focus areas and opportunities to accelerate performance
8. Partner with cross-functional teams to achieve ambitious long-term goals, monitoring performance against growth goals and building experimentation rigor
9. Shape the strategic direction of growth initiatives, investing in data foundations and analytical methods to sharpen understanding of growth levers
**Minimum Qualifications:**
Minimum Qualifications:
10. BS degree in a quantitative discipline (e.g., statistics, operations research, econometrics, computer science, engineering), or BS/MS in a quantitative discipline with equivalent working experience
11. A minimum of 7 years of work experience (3+ years with a Ph.D.) in an applied quantitative field doing quantitative analysis, statistical modeling, or machine learning in the experimentation space, including 2+ years of experience managing analytics teams
12. 5+ years of experience in a team leadership role, including 2+ years of experience with people management through layers
13. Proven track record of leading high-performing analytics teams
14. Experience communicating both in low-level technical details as well as high-level strategies
15. Track-record driving product roadmap and execution
16. Experience in cross-functional partnership among Engineering, Design, PM, and Data Engineering teams
**Preferred Qualifications:**
Preferred Qualifications:
17. Proven track record of leading analytics teams that deliver on multiple projects or programs across regions or business groups
18. A minimum of 2 years of experience working on consumer-facing products
19. 10+ years of experience with quantitative analysis, statistical modeling, or machine learning in the experimentation space
20. Master's or Ph.D. degree in Mathematics, Statistics, Computer Science, Engineering, Economics, or another quantitative field
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Columbus, OH jobs
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
Minimum Qualifications:
10. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
Preferred Qualifications:
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Master's or Ph.D. degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Principal Data Engineer, CSS SaaS and Apps Delivery - Analytics
Oracle 4.6
Remote
Oracle CSS is seeking a highly skilled Data Engineer to support enterprise customers adopting Oracle AI Data Platform. In this role, you will design, implement, support, and operationalize advanced, cloud-native data architectures that enable high-performance analytics, Generative AI, and machine learning at scale. You will maintain scalable, secure, and high-performance data solutions that support industry-specific workflows across global customers. You will support data integration, transformation, and delivery processes. You will design, implement, and optimize data pipelines. Partnering with data architects, you will support data models and architecture (medallion architecture) while enforcing the highest standards of quality, governance, and security. Your cloud-native data services and ETL/ELT experience will ensure high-quality data flows that drive intelligent, industry-specific outcomes.
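For context on the medallion (Bronze/Silver/Gold) convention mentioned above, here is a minimal pandas sketch of the layering. The table contents and column names are illustrative assumptions, not Oracle's actual schemas or tooling:

```python
import pandas as pd

# Bronze: raw ingested records, kept as-is (illustrative columns assumed).
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", None, "7.0"],
    "region": ["east", "east", "west", "east"],
})

# Silver: cleaned and conformed - deduplicated, typed, invalid rows dropped.
silver = (
    bronze.drop_duplicates()
          .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"))
          .dropna(subset=["amount"])
)

# Gold: business-level aggregate ready for analytics consumers.
gold = silver.groupby("region", as_index=False)["amount"].sum()
print(gold)
```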
Disclaimer:
Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.
Range and benefit information provided in this posting are specific to the stated locations only
US: Hiring Range in USD from: $96,800 to $223,400 per annum. May be eligible for bonus and equity.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
Design, implement, and support end-to-end data ingestion, transformation, and orchestration pipelines leveraging:
OCI Data Lakehouse Services: Object Storage, Lakehouse, Autonomous Data Warehouse
Real-Time Data Movement: Oracle GoldenGate, OCI Streaming, Kafka-compatible services
Data Integration & Orchestration: OCI Data Integration, Data Flow (Apache Spark), Data Catalog, OCI Data Science integration
Implement and support distributed data processing patterns including ELT, event-driven streaming, and micro-batch frameworks
Apply advanced performance engineering techniques for partitioning, indexing, caching, and adaptive query optimization
Design secure and compliant data environments integrating IAM, Vault, KMS, VCN security, and data governance standards
Enable model-ready datasets using feature engineering pipelines, metadata standardization, and lineage automation
Perform troubleshooting and root cause analysis on pipeline failures, optimize pipeline SLAs, and ensure observability via OCI Logging & Monitoring
Collaborate with customer architects and Oracle product teams to accelerate adoption of AI-powered data capabilities
Understand how to handle large volumes of time series data
Write batch processing code in Python using dataframes (a minimal illustrative sketch follows)
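A minimal sketch of the kind of Python batch processing over time series dataframes described above; the file name, column names, and hourly resampling are illustrative assumptions:

```python
import pandas as pd

def run_batch(path: str = "sensor_readings.csv") -> pd.DataFrame:
    """Illustrative batch job: read time series readings, resample to hourly
    averages per sensor, and write the result out. The file name and the
    'sensor_id', 'ts', and 'value' columns are assumed for this sketch."""
    df = pd.read_csv(path, parse_dates=["ts"])
    hourly = (
        df.set_index("ts")                 # datetime index for resampling
          .groupby("sensor_id")["value"]   # one series per sensor
          .resample("1h")                  # hourly buckets
          .mean()
          .reset_index()
    )
    hourly.to_parquet("sensor_readings_hourly.parquet", index=False)
    return hourly

# Usage (assuming the input file exists): run_batch("sensor_readings.csv")
```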
$96.8k-223.4k yearly 60d+ ago
Principal Data Engineer
Oracle 4.6
Remote
We are looking for a hands-on engineering expert with deep experience in cloud data ecosystems (OCI, AWS, Azure) and the ability to deliver secure, high-quality, and scalable data solutions. As a Principal Data Engineer within Oracle's Consumer Global Industries Unit, you will architect and build foundational components of a next-generation data platform supporting Oracle Industry Applications. You will be responsible for designing and implementing scalable, high-performance data pipelines, and contribute significantly to data integration, transformation, and delivery processes. You will collaborate with data architects on best practices for data modeling and architecture (including medallion architecture), champion modern engineering methods, and ensure compliance with security, data quality, and governance requirements. Your expertise in cloud-based data services, ETL/ELT, and real-time/batch data processing will directly enable high-value, AI-enriched solutions across multiple industries.
Required Experience:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related discipline.
- 7+ years of experience in data engineering with demonstrated expertise designing, building, and maintaining large-scale enterprise data platforms.
- Proven hands-on experience with cloud ecosystem data services (OCI, AWS, Azure), their data engineering tools, and AI/ML service integration.
- Advanced skills in building reliable ETL/ELT pipelines, data integration, transformation, validation, and orchestration frameworks.
- Strong programming skills (e.g., Python, Java, Scala) with experience in distributed data processing and streaming technologies (e.g., Spark, Kafka, Databricks, Apache Beam).
- In-depth knowledge of data governance, quality, security, and compliance best practices within cloud-first or hybrid environments.
- Proficiency with industry-standard tools for workflow orchestration (e.g., Airflow), data cataloging, and monitoring.
- Experience in regulated industries and familiarity with relevant compliance standards is a plus.
- Excellent problem-solving, communication, and collaboration skills within global and cross-functional teams.
Disclaimer:
Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.
Range and benefit information provided in this posting are specific to the stated locations only
US: Hiring Range in USD from: $96,800 to $223,400 per annum. May be eligible for bonus and equity.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
Key Responsibilities:
- Design and develop robust, scalable, and secure data pipelines (batch and real-time) to ingest, process, and deliver data from multiple sources to Oracle Industry Applications.
- Implement and optimize data architectures in cloud environments (OCI, AWS, Azure) including data lakes, data warehouses, ETL/ELT, and orchestration frameworks.
- Collaborate with data architects to implement best practices for data modeling (conceptual, logical, and physical) and support platform-wide standards such as medallion architecture (Bronze/Silver/Gold).
- Design and establish monitoring, alerting, and data quality frameworks to ensure reliability and accuracy of data pipelines.
- Develop and enforce best practices for data security, lineage, metadata management, and compliance requirements.
- Work closely with analysts, ML engineers, business stakeholders, and cross-functional teams to enable industry-specific data use cases and analytics for Oracle Industry Applications.
- Evaluate, recommend, and onboard new data technologies to continuously improve platform performance and capabilities.
- Lead troubleshooting and root cause analysis for data issues, ensuring continuous platform availability and reliability.
- Mentor and guide junior dataengineers and drive technical excellence and innovation across engineering efforts.
#LI-JF1
$96.8k-223.4k yearly 60d+ ago
Principal Cloud Architect, AI Computational Data Scientist
Oracle 4.6
Remote
Oracle Cloud Infrastructure (OCI) is a pioneering force in cloud technology, merging the agility of startups with the robustness of an enterprise software leader. Within OCI, the Oracle AI Infra / Gen AI Cloud Engineering team spearheads innovative solutions at the convergence of artificial intelligence and cloud infrastructure. As part of this team, you'll contribute to large-scale cloud solutions utilizing cutting-edge machine learning technologies, aimed at addressing complex global challenges.
Join us to create innovative solutions using top-notch machine learning technologies to solve global challenges. We're looking for an experienced Principal Applied Data/Computational Scientist to join our Cloud Engineering team for strategic customers. In this role, you'll collaborate with applied scientists and product managers to design, develop, and deploy tailored Gen-AI solutions with an emphasis on Large Language Models (LLMs), agents, MCP, and Retrieval Augmented Generation (RAG) with large OpenSearch clusters. You will be responsible for identifying, designing, and implementing AI solutions on the corresponding GPU IaaS or PaaS.
Qualifications and experience
Doctoral or master's degree in computer science or equivalent technical field with 10+ years of experience
Able to communicate technical ideas effectively, verbally and in writing (technical proposals, design specs, architecture diagrams, and presentations).
Demonstrated experience designing and implementing scalable AI models and solutions for production, and relevant professional experience as an end-to-end solutions engineer or architect (data engineering, data science, and ML engineering is a plus), with evidence of close collaboration with PM and Dev teams.
Experience with OpenSearch, Vector databases, PostgreSQL and Kafka Streaming.
Practical experience setting up and fine-tuning large OpenSearch clusters.
Experience in setting up data ingestion pipelines with OpenSearch.
Experience with search algorithms, indexing, optimizing latency and response times.
Practical experience with the latest technologies in LLM and generative AI, such as parameter-efficient fine-tuning, instruction fine-tuning, and advanced prompt engineering techniques like Tree-of-Thoughts.
Familiarity with Agents and Agent frameworks and Model Context Protocol (MCP)
Hands-on experience with emerging LLM frameworks and plugins, such as LangChain, LlamaIndex, VectorStores and Retrievers, LLM Cache, LLMOps (MLFlow), LMQL, Guidance, etc.
Strong publication record, including as a lead author or reviewer, in top-tier journals or conferences.
Ability and passion to mentor and develop junior machine learning engineers.
Proficient in Python and shell scripting tools.
Preferred Qualifications:
PhD or Master's in a related field with 5+ years of relevant experience
Experience with RAG-based solution architectures. Familiarity with OpenSearch and vector stores as a knowledge store
Knowledge of LLMs and experience delivering generative AI and agent models are a significant plus.
Familiarity and experience with the latest advancements in computer vision and multimodal modeling is a plus.
Experience with semantic search, multi-modal search and conversational search.
Experience in working on a public cloud environment, and in-depth knowledge of IaaS/PaaS industry and competitive capabilities. Experience with popular model training and serving frameworks like KServe, KubeFlow, Triton etc.
Experience with LLM fine-tuning, especially the latest parameter efficient fine-tuning technologies and multi-task serving technologies.
Deep technical understanding of Machine Learning, Deep Learning architectures like Transformers, training methods, and optimizers.
Experience with deep learning frameworks (such as PyTorch, JAX, or TensorFlow) and deep learning architectures (especially Transformers).
Experience in diagnosing, fixing, and resolving issues in AI model training and serving.
Disclaimer:
Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.
Range and benefit information provided in this posting are specific to the stated locations only
US: Hiring Range in USD from: $113,100 to $185,100 per annum. May be eligible for equity. Eligible for commission with an estimated pay mix of 70/30.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
Responsibilities
As part of the OCI Gen AI Cloud Engineering team for strategic customers, you will be responsible for developing innovative Gen AI and data services for our strategic customers. As a Principal Applied Data/Computational Scientist, you'll lead the development of advanced Gen AI solutions using the latest ML technologies combined with Oracle's cloud expertise. Your work will significantly impact sectors like financial services, telecom, healthcare, and code generation by creating distributed, scalable, high-performance solutions for strategic customers.
Work directly with key customers and accompany them on their Gen AI journey: understand their requirements, help them envision, design, and build the right solutions, and work with their ML engineering teams to remove blockers.
You will dive deep into model structure to optimize model performance and scalability.
You will build state-of-the-art solutions with brand-new technologies in this fast-evolving area.
You will configure large-scale OpenSearch clusters and set up ingestion pipelines to get data into OpenSearch.
You will diagnose, troubleshoot, and resolve issues in AI model training and serving. You may also perform other duties as assigned.
Build reusable solution patterns and reference solutions/showcases that can apply across multiple customers.
Be an enthusiastic, self-motivated, great collaborator.
Be our product evangelist - engage directly with customers and partners, participate and present in external events and conferences, etc.
$113.1k-185.1k yearly 60d+ ago
Data Engineer 3/4- Supply Chain Analytics
Northrop Grumman 4.7
Virginia jobs
RELOCATION ASSISTANCE: No relocation assistance available
CLEARANCE TYPE: None
TRAVEL: Yes, 10% of the Time
Description
At Northrop Grumman, our employees have incredible opportunities to work on revolutionary systems that impact people's lives around the world today, and for generations to come. Our pioneering and inventive spirit has enabled us to be at the forefront of many technological advancements in our nation's history - from the first flight across the Atlantic Ocean, to stealth bombers, to landing on the moon. We look for people who have bold new ideas, courage and a pioneering spirit to join forces to invent the future, and have fun along the way. Our culture thrives on intellectual curiosity, cognitive diversity and bringing your whole self to work - and we have an insatiable drive to do what others think is impossible. Our employees are not only part of history, they're making history.
Join our Insights & Intelligence (i2) organization to contribute directly to our supply chain analytics team. As a Principal Data Engineer, you will be essential in building, maintaining, and optimizing the data pipelines that deliver clean, reliable information from our enterprise systems, particularly SAP, to power mission-critical supply chain analytics. This role is 100% virtual/work from home.
Key Responsibilities
Pipeline Execution: Design, code, test, and deploy production-grade data pipelines using Python/PySpark and SQL within the Databricks environment to ingest and transform Supply Chain data (a minimal illustrative sketch follows this list).
SAP Integration: Implement and maintain data extraction processes from SAP ERP systems, focusing on supply chain areas
Data Modeling: Translate high-level design specifications into physical data models and structures, ensuring the effective implementation of dimensional data models tailored for Supply Chain analytics.
Data Reliability: Actively monitor pipeline health, troubleshoot production issues, and execute necessary fixes to ensure high data availability and accuracy for downstream reporting and analytics.
Business Translation: Partner with Senior Engineers and Supply Chain analysts to understand their data needs, assisting in translating complex business questions related to demand planning, inventory, and logistics into specific data requirements.
Quality & Governance: Adhere strictly to defined standards for data quality, security, and governance protocols, including proper documentation and adherence to established code quality practices.
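As an illustration of the pipeline and data modeling work described above, here is a minimal PySpark sketch of a Databricks-style batch transform that builds a simple daily fact table. The table and column names are assumptions for the sketch, not actual SAP or Northrop Grumman objects:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative batch transform: load raw supply chain orders, clean them,
# and build a simple daily fact table. All object names are assumed.
spark = SparkSession.builder.appName("supply_chain_fact_build").getOrCreate()

raw = spark.read.table("raw.purchase_orders")  # assumed raw/bronze source table

fact_orders = (
    raw.filter(F.col("order_qty") > 0)                       # drop invalid rows
       .withColumn("order_date", F.to_date("created_ts"))    # derive the date grain
       .groupBy("order_date", "material_id", "plant_id")
       .agg(
           F.sum("order_qty").alias("total_qty"),
           F.countDistinct("po_number").alias("po_count"),
       )
)

fact_orders.write.mode("overwrite").saveAsTable("analytics.fact_daily_orders")
```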
This role can be performed by a level 3 (mid-career) or level 4 (senior-career) professional.
Basic Qualifications for level 3:
Must have, at minimum, a Bachelor's degree in Engineering, Computer Science, Software Engineering, or a related technical field.
Minimum of 5 years of hands-on experience in Data Engineering or a related technical field.
Minimum of 1 year of experience working in Supply Chain, Logistics, or Manufacturing domains.
Hands-on experience developing data integrations from SAP ERP systems, demonstrating a foundational understanding of relevant SAP data structures.
Proficiency in SQL and Python for data manipulation and transformation.
Direct experience with cloud-based platforms such as Databricks and AWS.
Proven experience working within a team using version control systems (e.g., Git) and following agile development practices.
Basic Qualifications for level 4:
Must have, at minimum, a Bachelor's degree in Engineering, Computer Science, Software Engineering, or a related technical field.
Minimum of 8 years of hands-on experience in Data Engineering or a related technical field.
Minimum of 1 year of experience working in Supply Chain, Logistics, or Manufacturing domains.
Hands-on experience developing data integrations from SAP ERP systems, demonstrating a foundational understanding of relevant SAP data structures.
Proficiency in SQL and Python for data manipulation and transformation.
Direct experience with cloud-based platforms such as Databricks and AWS.
Proven experience working within a team using version control systems (e.g., Git) and following agile development practices.
Preferred Qualifications:
Strong working knowledge of dimensional modeling principles.
Familiarity with specific SAP data models and modules.
Understanding of fundamental data governance and Master Data Management (MDM) concepts.
Primary Level Salary Range: $103,600.00 - $155,400.00
Secondary Level Salary Range: $129,300.00 - $193,900.00
The above salary range represents a general guideline; however, Northrop Grumman considers a number of factors when determining base salary offers such as the scope and responsibilities of the position and the candidate's experience, education, skills and current market conditions. Depending on the position, employees may be eligible for overtime, shift differential, and a discretionary bonus in addition to base pay. Annual bonuses are designed to reward individual contributions as well as allow employees to share in company results. Employees in Vice President or Director positions may be eligible for Long Term Incentives. In addition, Northrop Grumman provides a variety of benefits including health insurance coverage, life and disability insurance, savings plan, Company paid holidays and paid time off (PTO) for vacation and/or personal business.
The application period for the job is estimated to be 20 days from the job posting date. However, this timeline may be shortened or extended depending on business needs and the availability of qualified candidates.
Northrop Grumman is an Equal Opportunity Employer, making decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected class. For our complete EEO and pay transparency statement, please visit ***********************************
U.S. Citizenship is required for all positions with a government clearance and certain other restricted positions.
$129.3k-193.9k yearly 60d+ ago
Data Science Leader, Analytics
Meta 4.8
Columbus, OH jobs
As a Data Science Manager at Meta, you will play a key role in shaping the future of experiences for billions of people and hundreds of millions of businesses worldwide. You will apply your leadership, project management, analytical, technical, creative, and product intuition skills to one of the largest data sets globally. Your primary focus will be on driving impact through quality, efficiency, and velocity by collaborating with cross-functional partners across Product, Engineering, Research, DataEngineering, Marketing, Sales, Finance, and others.You will use data to understand product and business ecosystems, quantify new opportunities, identify upcoming challenges, and shape product development to bring value to people, businesses, and Meta. You will guide product teams using data and insights, develop hypotheses, and employ rigorous analytical approaches to test them. Additionally, you will tell data-driven stories, convince and influence leaders using clear insights and recommendations, and build credibility as a trusted strategic partner.As a leader, you will inspire, lead, and grow a world-class team of data scientists and data science leaders. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Science Leader, Analytics Responsibilities:
1. Drive analytics projects end-to-end in partnership with cross-functional teams to inform and execute product strategy and investment decisions
2. Inspire, lead, and grow a team of data scientists and managers to fulfill long-term vision and goals
3. Actively influence the design of strategy and roadmap within scope, generating and using team insights to set and prioritize longer-term goals
4. Develop understanding of complex systems, industry challenges, and broader trends to identify present and future risks and opportunities
5. Work with large and complex data sets to solve challenging problems using different analytical and statistical approaches
6. Grow analytics expertise around you, upskilling your team, engineers, and others to increase overall team impact
7. Define key metrics for measuring model effectiveness and drive insight to action by identifying focus areas and opportunities to accelerate performance
8. Partner with cross-functional teams to achieve ambitious long-term goals, monitoring performance against growth goals and building experimentation rigor
9. Shape the strategic direction of growth initiatives, investing in data foundations and analytical methods to sharpen understanding of growth levers
**Minimum Qualifications:**
10. BS degree in a quantitative discipline (e.g., statistics, operations research, econometrics, computer science, engineering), or BS/MS in a quantitative discipline with equivalent working experience
11. A minimum of 7 years of work experience (3+ years with a Ph.D.) in an applied quantitative field doing quantitative analysis, statistical modeling, or machine learning in the experimentation space, including 2+ years of experience managing analytics teams
12. 5+ years of experience in a team leadership role, including 2+ years of experience with people management through layers
13. Proven track record of leading high-performing analytics teams
14. Experience communicating both low-level technical details and high-level strategies
15. Track record of driving product roadmap and execution
16. Experience in cross-functional partnership among teams of Engineering, Design, PM, and Data Engineering
**Preferred Qualifications:**
17. Proven track record of leading analytics teams that deliver on multiple projects or programs across regions or business groups
18. A minimum of 2 years of experience working on consumer-facing products
19. 10+ years of experience with quantitative analysis, statistical modeling, or machine learning in the experimentation space
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Senior Data Engineer (Clearance required) - Oracle Consulting Services
Oracle 4.6
Columbus, OH jobs
We are seeking a highly experienced and motivated **Senior Data Engineer** to join our Consulting Services organization on an exciting new contract. The Senior Data Engineer role will be part of a team of Application, Cloud, and Security engineers delivering a cloud-native, centralized data platform that delivers end-to-end budget traceability.
**Required Skills and Qualifications:**
+ Minimum of 8 years of experience in data engineering, ETL development, or database management
+ Experience with data transformations, data extraction from various sources, data processing, and data loads into Oracle or other relational databases (see the sketch after this list)
+ Strong expertise in **Oracle Database technologies** (e.g., Oracle 19c, Oracle Exadata, Oracle Data Integrator - ODI)
+ Proven experience with **PL/SQL**, **Oracle partitioning**, and **performance tuning**
+ Strong knowledge of **ETL pipeline design** and implementation for large-scale data systems
+ Proficiency in **shell scripting**, **Python**, or other automation tools
+ Ability to work with **cross-functional teams** including data scientists, DBAs, and business analysts
+ Strong verbal and written communication skills for documenting and presenting solutions
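As an illustration of the kind of Oracle data-load work called out in the required skills above, a minimal sketch using the python-oracledb driver might look like the following. The connection details, staging table, and column names are hypothetical placeholders rather than anything specified in the posting.

```python
# Illustrative sketch only: bulk-load a CSV extract into a hypothetical
# Oracle staging table with the python-oracledb driver.
import csv
import oracledb

def load_staging(csv_path: str) -> int:
    """Insert rows from a CSV extract into stg_budget_lines (hypothetical table)."""
    conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")
    rows = []
    with open(csv_path, newline="") as fh:
        for rec in csv.DictReader(fh):
            rows.append((rec["org_code"], rec["fiscal_year"], float(rec["amount"])))
    with conn.cursor() as cur:
        # executemany batches the inserts, which matters for large extracts
        cur.executemany(
            "INSERT INTO stg_budget_lines (org_code, fiscal_year, amount) "
            "VALUES (:1, :2, :3)",
            rows,
        )
    conn.commit()
    conn.close()
    return len(rows)
```

In practice a load like this would typically be wrapped in an ODI or shell-scripted workflow with logging and error handling.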
**Responsibilities**
**Desired Skills and Qualifications:**
+ Experience integrating Oracle databases with **other applications in a cloud environment**.
+ Familiarity with **PL/SQL** and **SQL**.
+ Exposure to **cloud platforms** such as **Oracle Cloud Infrastructure (OCI)**, **AWS**, or **Azure**.
+ Knowledge of **data security** best practices and regulatory compliance
+ Experience with **DevOps** and **CI/CD pipelines** in a data engineering context
+ Desired experience with transforming and loading data into **PeopleSoft database structures** in Oracle.
Required Credentials and Experience:
+ Must have the ability to obtain and maintain a TS/SCI security clearance with Polygraph
+ 8+ years of experience relevant to this position
+ Bachelor's Degree in engineering, computer science, business, or related discipline. Master's degree preferred.
+ This role is a full-time, in-person position at a client site. We are seeking candidates who reside in the Virginia/Washington DC region.
Disclaimer:
**Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**
**Range and benefit information provided in this posting are specific to the stated locations only**
US: Hiring Range in USD from: $97,500 to $199,500 per annum. May be eligible for bonus and equity.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
$97.5k-199.5k yearly 3d ago
Principal Cloud Architect, AI Computational Data Scientist
Oracle 4.6
Columbus, OH jobs
Oracle Cloud Infrastructure (OCI) is a pioneering force in cloud technology, merging the agility of startups with the robustness of an enterprise software leader. Within OCI, the Oracle AI Infra / Gen AI Cloud Engineering team spearheads innovative solutions at the convergence of artificial intelligence and cloud infrastructure. As part of this team, you'll contribute to large-scale cloud solutions utilizing cutting-edge machine learning technologies, aimed at addressing complex global challenges.
Join us to create innovative solutions using top-notch machine learning technologies to solve global challenges. We're looking for an experienced Principal Applied Data/Computational Scientist to join our Cloud Engineering team for strategic customers. In this role, you'll collaborate with applied scientists and product managers to design, develop, and deploy tailored Gen-AI solutions with an emphasis on Large Language Models (LLMs), Agents, MCP, and Retrieval Augmented Generation (RAG) with large OpenSearch clusters. You will be responsible for identifying, designing, and implementing AI solutions on the corresponding GPU IaaS or PaaS.
**Qualifications and experience**
+ Doctoral or master's degree in computer science or equivalent technical field with 10+ years of experience
+ Able to communicate technical ideas effectively, verbally and in writing (technical proposals, design specs, architecture diagrams, and presentations).
+ Demonstrated experience in designing and implementing scalable AI models and solutions for production, relevant professional experience as an end-to-end solutions engineer or architect (data engineering, data science, and ML engineering is a plus), with evidence of close collaborations with PM and Dev teams.
+ Experience with OpenSearch, Vector databases, PostgreSQL and Kafka Streaming.
+ Practical experience with setting up and fine-tuning large OpenSearch clusters.
+ Experience in setting up data ingestion pipelines with OpenSearch.
+ Experience with search algorithms, indexing, optimizing latency and response times.
+ Practical experience with the latest technologies in LLM and generative AI, such as parameter-efficient fine-tuning, instruction fine-tuning, and advanced prompt engineering techniques like Tree-of-Thoughts.
+ Familiarity with Agents and Agent frameworks and Model Context Protocol (MCP)
+ Hands-on experience with emerging LLM frameworks and plugins, such as LangChain, LlamaIndex, VectorStores and Retrievers, LLM Cache, LLMOps (MLFlow), LMQL, Guidance, etc.
+ Strong publication record, including as a lead author or reviewer, in top-tier journals or conferences.
+ Ability and passion to mentor and develop junior machine learning engineers.
+ Proficient in Python and shell scripting tools.
**Preferred Qualifications** :
+ PhD/Masters in related field with 5+ years relevant experience
+ Experience with RAG-based solution architectures. Familiarity with OpenSearch and vector stores as a knowledge store.
+ Knowledge of LLMs and experience delivering generative AI and agent models is a significant plus.
+ Familiarity and experience with the latest advancements in computer vision and multimodal modeling is a plus.
+ Experience with semantic search, multi-modal search and conversational search.
+ Experience in working on a public cloud environment, and in-depth knowledge of IaaS/PaaS industry and competitive capabilities. Experience with popular model training and serving frameworks like KServe, KubeFlow, Triton etc.
+ Experience with LLM fine-tuning, especially the latest parameter efficient fine-tuning technologies and multi-task serving technologies.
+ Deep technical understanding of Machine Learning, Deep Learning architectures like Transformers, training methods, and optimizers.
+ Experience with deep learning frameworks (such as PyTorch, JAX, or TensorFlow) and deep learning architectures (especially Transformers).
+ Experience in diagnosing, fixing, and resolving issues in AI model training and serving.
**Responsibilities**
As part of the **OCI Gen AI Cloud Engineering team** for strategic customers, you will be responsible for developing innovative Gen AI and data services for our strategic customers. As a Principal Applied Data/Computational Scientist, you'll lead the development of advanced Gen AI solutions using the latest ML technologies combined with Oracle's cloud expertise. Your work will significantly impact sectors like financial services, telecom, healthcare, and code generation by creating distributed, scalable, high-performance solutions for strategic customers.
+ Work directly with key customers and accompany them on their Gen AI journey - understanding their requirements, helping them envision, design, and build the right solutions, and working together with their ML engineering teams to remove blockers.
+ You will dive deep into model structure to optimize model performance and scalability.
+ You will build state-of-the-art solutions with brand-new technologies in this fast-evolving area.
+ You will configure large-scale OpenSearch clusters and set up ingestion pipelines to get the data into OpenSearch (see the sketch after this list).
+ You will diagnose, troubleshoot, and resolve issues in AI model training and serving. You may also perform other duties as assigned.
+ Build re-usable solution patterns and reference solutions / showcases that can apply across multiple customers.
+ Be an enthusiastic, self-motivated, and a great collaborator.
+ Be our product evangelist - engage directly with customers and partners, participate and present in external events and conferences, etc.
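As a rough illustration of the OpenSearch ingestion work mentioned above, the sketch below creates a k-NN enabled index and bulk-loads documents with embeddings using the opensearch-py client. The host, index name, and the embed() helper are hypothetical placeholders; a real deployment would swap in an actual embedding model and cluster credentials.

```python
# Illustrative sketch only: k-NN index creation and bulk ingestion for RAG.
from opensearchpy import OpenSearch, helpers

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

INDEX = "rag-docs"  # hypothetical index name
client.indices.create(
    index=INDEX,
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "embedding": {"type": "knn_vector", "dimension": 384},
            }
        },
    },
)

def embed(text: str) -> list[float]:
    # Placeholder: a real pipeline would call a sentence-embedding model here.
    return [0.0] * 384

docs = ["First source document.", "Second source document."]
actions = (
    {"_index": INDEX, "_source": {"text": d, "embedding": embed(d)}} for d in docs
)
helpers.bulk(client, actions)  # batched ingestion into the cluster
```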
Disclaimer:
**Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**
**Range and benefit information provided in this posting are specific to the stated locations only**
US: Hiring Range in USD from: $113,100 to $185,100 per annum. May be eligible for equity. Eligible for commission with an estimated pay mix of 70/30.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
$113.1k-185.1k yearly 60d+ ago
Consultant, Data Engineer
IBM 4.7
Columbus, OH jobs
**Introduction** At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
**Your role and responsibilities**
We are in search of a skilled Consultant Data Engineer to join our expanding team of experts. This role will be pivotal in the design and development of Snowflake Data Cloud solutions, encompassing responsibilities such as constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols.
The ideal candidate brings experience as a proficient data pipeline builder and adept data wrangler, deriving satisfaction from optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer will play a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects.
This position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about the prospect of contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.
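As a rough illustration of the Snowflake ingestion work described in the role above, a minimal pipeline step might clean a pandas DataFrame and push it into a staging table with the Snowflake Python connector. The account, credentials, file, and table names below are hypothetical placeholders.

```python
# Illustrative sketch only: load a cleaned extract into a Snowflake staging table.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.read_csv("orders_extract.csv")           # hypothetical raw extract
df["order_date"] = pd.to_datetime(df["order_date"])
df = df.drop_duplicates(subset=["order_id"])     # basic cleanup before load

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
# write_pandas stages the data and runs COPY INTO behind the scenes
success, nchunks, nrows, _ = write_pandas(
    conn, df, table_name="ORDERS_STAGE", auto_create_table=True
)
conn.close()
print(f"loaded={success} rows={nrows}")
```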
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM, and IBM will be the hiring entity.
This role can be performed from anywhere in the US.
**Required technical and professional expertise**
* Bachelor's degree in engineering, computer science or equivalent area
* 3+ years in related technical roles with experience in data management, database development, ETL, and/or data prep domains.
* Experience developing data warehouses.
* Experience building ETL / ELT ingestion pipelines.
* Proficiency in using cloud platform services for data engineering tasks, including managed database services (Snowflake, including its pros and cons vs. Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow).
* Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance.
* Knowledge of how to manipulate, process and extract value from large disconnected datasets.
* SQL and Python scripting experience required; Scala and JavaScript are a plus.
* Cloud experience (AWS, Azure or GCP) is a plus.
* Knowledge of any of the following tools is also a plus: Snowflake, Matillion/Fivetran or DBT.
* Strong interpersonal skills including assertiveness and ability to build strong client relationships.
* Strong project management and organizational skills.
* Ability to support and work with cross-functional and agile teams in a dynamic environment.
* Advanced English required.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
$69k-89k yearly est. 22d ago
Data Engineer (Remote)
Cognizant 4.6
Park City, IL jobs
We are seeking a qualified Data Engineer with a master's degree in data science, or with experience in Python, SQL, machine learning algorithms and modeling, and Microsoft Azure, and hands-on experience in LLMs, LangChain, LangGraph, and RAG-based automation.
Key Responsibilities
+ Design and build hybrid agentic solutions using LangGraph agents / LLMs to classify, normalize, validate, and transform heterogeneous data
+ Build components and solutions in Python (Pandas, Scikit-learn, PyTorch, PySpark, LangChain, LangGraph, FastAPI), databases, and SQL
+ Support the team on cloud and deployment tools such as Azure services, Docker, Git, Cursor AI, Replit, and Claude Code
+ Develop components on Neo4j graph linking to power graph reasoning and constraint-aware recommendations
+ Automate the ingestion, transformation, and knowledge-graph update pipeline using n8n to keep data continuously updated
+ Automate ETL pipelines and deliver FastAPI-based API services (see the sketch after this list)
+ Apply vectorized operations and domain-based rule logic
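To make the FastAPI-based API service item above concrete, here is a minimal sketch of a normalization endpoint built with FastAPI and pandas. The endpoint path and the field handling are hypothetical and only illustrate the pattern.

```python
# Illustrative sketch only: a small FastAPI service that normalizes record batches.
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class RecordBatch(BaseModel):
    records: list[dict]

@app.post("/normalize")  # hypothetical endpoint
def normalize(batch: RecordBatch) -> dict:
    df = pd.DataFrame(batch.records)
    # Vectorized cleanup: standardize column names, trim strings, drop duplicates
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    for col in df.select_dtypes(include="object").columns:
        df[col] = df[col].str.strip()
    df = df.drop_duplicates()
    return {"rows": len(df), "records": df.to_dict(orient="records")}
```

Run with `uvicorn app:app` (assuming the file is named app.py) to expose the endpoint locally.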
**Required Qualifications**
+ Master's degree in Data Science with at least 3 years of experience in the field of AI/ML.
+ Demonstrated experience in designing and analyzing business requirements and translating into design components
+ Hands-on experience in technologies like Python, AI/ML, LLMs
+ Good knowledge of and familiarity with vector databases, SQL, and reporting and analytics platforms
+ Excellent communication skills with the ability to work on highly complex applications
**Salary and Other Compensation** :
The annual salary for this position is between $80,000 - $90,000 depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
**Benefits** : Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
+ Medical/Dental/Vision/Life Insurance
+ Paid holidays plus Paid Time Off
+ 401(k) plan and contributions
+ Long-term/Short-term Disability
+ Paid Parental Leave
+ Employee Stock Purchase Plan
**Disclaimer:** The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
**Cognizant will only consider applicants for this position who are legally authorized to work in the United States without requiring company sponsorship now or at any time in the future.**
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
$80k-90k yearly 3d ago
Principal Data Engineer
Oracle 4.6
Columbus, OH jobs
We are looking for a hands-on engineering expert with deep experience in cloud data ecosystems (OCI, AWS, Azure) and the ability to deliver secure, high-quality, and scalable data solutions. As a Principal Data Engineer within Oracle's Consumer Global Industries Unit, you will architect and build foundational components of a next-generation data platform supporting Oracle Industry Applications. You will be responsible for designing and implementing scalable, high-performance data pipelines, and contribute significantly to data integration, transformation, and delivery processes. You will collaborate with data architects on best practices for data modeling and architecture (including medallion architecture), champion modern engineering methods, and ensure compliance with security, data quality, and governance requirements. Your expertise in cloud-based data services, ETL/ELT, and real-time/batch data processing will directly enable high-value, AI-enriched solutions across multiple industries.
Required Experience:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related discipline.
- 7+ years of experience in data engineering with demonstrated expertise designing, building, and maintaining large-scale enterprise data platforms.
- Proven hands-on experience with cloud ecosystem data services (OCI, AWS, Azure), their data engineering tools, and AI/ML service integration.
- Advanced skills in building reliable ETL/ELT pipelines, data integration, transformation, validation, and orchestration frameworks.
- Strong programming skills (e.g., Python, Java, Scala) and experience with Kafka and distributed data processing technologies (e.g., Spark, Databricks, Apache Beam).
- In-depth knowledge of data governance, quality, security, and compliance best practices within cloud-first or hybrid environments.
- Proficiency with industry-standard tools for workflow orchestration (e.g., Airflow), data cataloging, and monitoring.
- Experience in regulated industries and familiarity with relevant compliance standards is a plus.
- Excellent problem-solving, communication, and collaboration skills within global and cross-functional teams.
**Responsibilities**
Key Responsibilities:
- Design and develop robust, scalable, and secure data pipelines (batch and real-time) to ingest, process, and deliver data from multiple sources to Oracle Industry Applications.
- Implement and optimize data architectures in cloud environments (OCI, AWS, Azure) including data lakes, data warehouses, ETL/ELT, and orchestration frameworks.
- Collaborate with data architects to implement best practices for data modeling (conceptual, logical, and physical) and support platform-wide standards such as medallion architecture (Bronze/Silver/Gold); a minimal illustrative Bronze-to-Silver sketch follows this list.
- Design and establish monitoring, alerting, and data quality frameworks to ensure reliability and accuracy of data pipelines.
- Develop and enforce best practices for data security, lineage, metadata management, and compliance requirements.
- Work closely with analysts, ML engineers, business stakeholders, and cross-functional teams to enable industry-specific data use cases and analytics for Oracle Industry Applications.
- Evaluate, recommend, and onboard new data technologies to continuously improve platform performance and capabilities.
- Lead troubleshooting and root cause analysis for data issues, ensuring continuous platform availability and reliability.
- Mentor and guide junior data engineers and drive technical excellence and innovation across engineering efforts.
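As a rough illustration of the medallion-style layering referenced above, a minimal Bronze-to-Silver step in PySpark might look like the sketch below; the storage paths and column names are hypothetical placeholders.

```python
# Illustrative sketch only: Bronze (raw landing) to Silver (validated) in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: land raw source data as-is, tagged with ingestion metadata
raw = spark.read.json("s3://raw-bucket/events/")          # hypothetical source path
bronze = raw.withColumn("ingest_ts", F.current_timestamp())
bronze.write.mode("append").parquet("s3://lake/bronze/events/")

# Silver: validated, de-duplicated, typed records ready for modeling
silver = (
    spark.read.parquet("s3://lake/bronze/events/")
    .filter(F.col("event_id").isNotNull())
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
)
silver.write.mode("overwrite").partitionBy("event_date").parquet("s3://lake/silver/events/")
```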
\#LI-JF1
Disclaimer:
**Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**
**Range and benefit information provided in this posting are specific to the stated locations only**
US: Hiring Range in USD from: $96,800 to $223,400 per annum. May be eligible for bonus and equity.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
$77k-102k yearly est. 60d+ ago
Senior Data Engineer (Remote)
Raytheon 4.6
Farmington, CT jobs
Country:
United States of America Hybrid
U.S. Citizen, U.S. Person, or Immigration Status Requirements:
This job requires a U.S. Person. A U.S. Person is a lawful permanent resident as defined in 8 U.S.C. 1101(a)(20) or who is a protected individual as defined by 8 U.S.C. 1324b(a)(3). U.S. citizens, U.S. nationals, U.S. permanent residents, or individuals granted refugee or asylee status in the U.S. are considered U.S. persons. For a complete definition of “U.S. Person” go here: **********************************************************************************************
Security Clearance:
None/Not Required
RTX Corporation is an Aerospace and Defense company that provides advanced systems and services for commercial, military and government customers worldwide. It comprises three industry-leading businesses - Collins Aerospace Systems, Pratt & Whitney, and Raytheon. Its 185,000 employees enable the company to operate at the edge of known science as they imagine and deliver solutions that push the boundaries in quantum physics, electric propulsion, directed energy, hypersonics, avionics and cybersecurity. The company, formed in 2020 through the combination of Raytheon Company and the United Technologies Corporation aerospace businesses, is headquartered in Arlington, VA.
The following position is to join our RTX Enterprise Services team:
Position Overview:
We are seeking a Senior Data Engineer to join RTX in the Enterprise Services Data organization, AI Foundations team. You will play an active role in bringing cutting-edge AI, ML, and data capabilities to RTX. We provide capabilities that practitioners across the enterprise use to solve a vast variety of aerospace challenges through:
Working with product leadership to maintain technical capability roadmaps
Serving as technical stakeholders for RTX's AI platform
Evaluating state of the art vendors and external capabilities to recommend for platform availability
Building enablers, products, and services
What You Will Do:
Design, build, and maintain scalable data pipelines that support batch and real-time data processing across structured and unstructured data sources.
Develop and optimize ETL/ELT workflows to ensure efficient data ingestion, transformation, and delivery into data warehouses, data lakes, or lakehouse environments.
Collaborate closely with data architecture, analytics, ML engineering, and product teams to understand data requirements and translate them into reliable technical solutions.
Implement data quality checks, data validation logic, and automated monitoring to ensure accuracy, completeness, and reliability of datasets (see the sketch after this list).
Work with cloud platforms (e.g., Azure and AWS) and distributed computing frameworks (e.g., Spark) to build high-performance data systems.
Design and maintain metadata, lineage, and documentation that provides transparency into the data lifecycle and supports governance & compliance needs.
Optimize data storage, compute, and pipeline performance to reduce cost and increase efficiency.
Implement security and access control standards for sensitive data in alignment with organizational policies and compliance regulations.
Troubleshoot production data issues, perform root cause analysis, and drive permanent fixes rather than temporary workarounds.
Continuously evaluate new tools, technologies, and patterns to improve data engineering capabilities and align with future scalability requirements.
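As a rough illustration of the data quality checks mentioned in the list above, a lightweight validation step in pandas might produce a simple report like the sketch below; the column names and the 1% null-rate threshold are hypothetical choices.

```python
# Illustrative sketch only: simple completeness/uniqueness checks before publishing data.
import pandas as pd

def quality_report(df: pd.DataFrame, key: str, required: list[str]) -> dict:
    """Return basic data-quality metrics suitable for automated monitoring."""
    report = {
        "row_count": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
        "null_rates": {c: float(df[c].isna().mean()) for c in required},
    }
    report["passed"] = (
        report["duplicate_keys"] == 0
        and all(rate <= 0.01 for rate in report["null_rates"].values())
    )
    return report

# Toy usage
frame = pd.DataFrame({"part_id": [1, 2, 2], "weight_kg": [1.2, None, 3.4]})
print(quality_report(frame, key="part_id", required=["part_id", "weight_kg"]))
```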
Qualifications You Must Have:
Bachelor's degree in a STEM discipline with 5+ years of professional experience in data engineering, software engineering, or a similar role, with a strong focus on building and maintaining robust data pipelines.
5+ years of experience with programming languages, particularly Python, for building and automating data pipelines and analytics data stores.
Experience with Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) processes and tools such as Matillion, or other replication tools such as HVR (Fivetran) or Qlik Replicate.
3+ years of relevant experience with cloud computing platforms (AWS and/or Azure).
Knowledge of data warehousing concepts, data modeling, and schema design.
Understanding of data governance, lineage, quality frameworks, and metadata management.
Must be authorized to work in the U.S. without sponsorship now or in the future. RTX will not offer sponsorship for this position.
25% travel required within USA.
Qualifications We Prefer:
A flexible and patient mindset, interested in working on a variety of multi-disciplinary challenges with varying stakeholders, partners, domains, and toolsets.
Experience with building AI pipelines that bring together structured, semi-structured and unstructured data. This includes pre-processing with extraction, chunking, embedding and grounding strategies, semantic modeling, and getting the data ready for Models and Agentic solutions.
Experience building products that use Large Language Models (LLMs) or LLM APIs.
Hands-on experience implementing production ready enterprise grade GenAI data solutions.
Experience with Git, Github, or similar version control system, Jira, Confluence, and Agile development.
Effective communication and collaboration, especially across development and operations and business stakeholders.
Work with stakeholders to gather, prioritize, and document functional and technical needs before translating them into software solutions.
Prioritize tasks and manage multiple streams of work efficiently.
What We Offer:
Unique to our team:
A welcoming team that works together and supports one another.
Flexible schedule (Core hours with optional 9/80, 4x10, or custom).
A wide variety of technical opportunities in a cutting-edge landscape.
RTX Benefits Highlights:
Obtain professional certificates and degrees through the RTX employee scholar program.
Comprehensive benefits package, including medical, dental, vision, life, short term disability, and long term disability benefits.
Paid time off including holidays, sick time, and parental leave.
401(k) with matching.
Rotational programs.
Reward and recognition programs.
Learn More & Apply Now!
Work Location: Remote
Please consider the following role type definition as you apply for this role:
Remote: This position is currently designated as remote. However, the successful candidate will be required to work from one of the 50 U.S. states (excluding U.S. Territories). Employees who are working in Remote roles will work primarily offsite (from home). An employee may be expected to travel to the site location as needed.
As part of our commitment to maintaining a secure hiring process, candidates may be asked to attend select steps of the interview process in-person at one of our office locations, regardless of whether the role is designated as on-site, hybrid or remote.
The salary range for this role is 107,500 USD - 204,500 USD. The salary range provided is a good faith estimate representative of all experience levels. RTX considers several factors when extending an offer, including but not limited to, the role, function and associated responsibilities, a candidate's work experience, location, education/training, and key skills.
Hired applicants may be eligible for benefits, including but not limited to, medical, dental, vision, life insurance, short-term disability, long-term disability, 401(k) match, flexible spending accounts, flexible work schedules, employee assistance program, Employee Scholar Program, parental leave, paid time off, and holidays. Specific benefits are dependent upon the specific business unit as well as whether or not the position is covered by a collective-bargaining agreement.
Hired applicants may be eligible for annual short-term and/or long-term incentive compensation programs depending on the level of the position and whether or not it is covered by a collective-bargaining agreement. Payments under these annual programs are not guaranteed and are dependent upon a variety of factors including, but not limited to, individual performance, business unit performance, and/or the company's performance.
This role is a U.S.-based role. If the successful candidate resides in a U.S. territory, the appropriate pay structure and benefits will apply.
RTX anticipates the application window closing approximately 40 days from the date the notice was posted. However, factors such as candidate flow and business necessity may require RTX to shorten or extend the application window.
RTX is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or veteran status, or any other applicable state or federal protected class. RTX provides affirmative action in employment for qualified Individuals with a Disability and Protected Veterans in compliance with Section 503 of the Rehabilitation Act and the Vietnam Era Veterans' Readjustment Assistance Act.
Privacy Policy and Terms:
Click on this link to read the Policy and Terms
$77k-100k yearly est. Auto-Apply 19d ago
Lead, HR Data Engineer- Visier
L3Harris 4.4
Melbourne, FL jobs
L3Harris is dedicated to recruiting and developing high-performing talent who are passionate about what they do. Our employees are unified in a shared dedication to our customers' mission and quest for professional growth. L3Harris provides an inclusive, engaging environment designed to empower employees and promote work-life success. Fundamental to our culture is an unwavering focus on values, dedication to our communities, and commitment to excellence in everything we do.
L3Harris Technologies is the Trusted Disruptor in the defense industry. With customers' mission-critical needs always in mind, our employees deliver end-to-end technology solutions connecting the space, air, land, sea and cyber domains in the interest of national security.
Job Title: Lead, HR Data Engineer - Visier
Job Code: 32357
Job Location: Remote Opportunity
Job Schedule: 9/80: Employees work 9 out of every 14 days - totaling 80 hours worked - and have every other Friday off
Job Description:
The HR Data Engineer - Visier is responsible for designing, building, and maintaining the data and analytics architecture that powers the organization's Visier People platform. This role ensures the delivery of actionable workforce insights by interpreting, modeling, and transforming HR data from source systems and the Snowflake data warehouse. This role collaborates with the HR Business Relationship Management team and leads the onboarding of new data sources to expand analytics capabilities, ensure data quality, and align with enterprise reporting standards. Operating within an Agile development framework, this role transforms raw HR data into trusted, insight-ready assets that drive business decisions.
Essential Functions:
+ Design, develop and maintain Visier People dashboards, data models and metrics to deliver actionable HR insights.
+ Build, optimize, and maintain data pipelines and integrations between HR systems, Snowflake and Visier.
+ Lead the onboarding of new data sources into Visier, including extraction, transformation, validation and deployment.
+ Ensure data governance, accuracy, consistency and compliance across all HR analytics environments.
+ Collaborate with HR Analytics BRM team, COE and IT teams to ensure data models, dashboards and metrics meet business requirements and deliver actionable insights.
+ Participate in Agile development cycles, including sprint planning, backlog management, and iterative solution delivery.
+ Monitor and optimize data pipelines and Visier performance for reliability, efficiency and scalability.
+ Create and maintain documentation for data processes, integrations, and models to ensure knowledge sharing and sustainability.
+ Build visualizations using Palantir Foundry tools and assist business users with testing, troubleshooting, and documentation creation, including data maintenance guides.
+ Hands-on experience with Visier People or comparable HR analytics platforms.
+ Experience with Snowflake (data modeling, SQL development, pipeline integration and performance tuning)
+ Proficiency with ETL/ELT processes, data transformation, and pipeline design (tools such as dbt, Informatica, Alteryx, or Azure Data Factory)
+ Strong understanding of HR data domains (headcount, attrition, talent, compensation) and workforce analytics concepts (a minimal headcount/attrition sketch follows this list).
+ Experience working within Agile development frameworks and using collaboration tools (e.g., Jira, Azure DevOps)
+ Excellent analytical, troubleshooting, and communication skills with both technical and business audiences.
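As a rough illustration of the headcount and attrition concepts mentioned above, the sketch below computes monthly headcount and a simple attrition rate from a worker-level extract with pandas; the column names and toy data are hypothetical placeholders, and a production version would read from Snowflake or Visier instead.

```python
# Illustrative sketch only: monthly headcount and attrition from a worker extract.
import pandas as pd

workers = pd.DataFrame({
    "worker_id": [1, 2, 3, 4],
    "hire_date": pd.to_datetime(["2023-01-15", "2023-02-01", "2023-02-20", "2023-03-05"]),
    "term_date": pd.to_datetime([None, "2023-03-10", None, None]),
})

rows = []
for m in pd.period_range("2023-01", "2023-04", freq="M"):
    month_end = m.to_timestamp(how="end")
    # Active if hired on or before month end and not terminated by month end
    active = (workers["hire_date"] <= month_end) & (
        workers["term_date"].isna() | (workers["term_date"] > month_end)
    )
    terms = int(workers["term_date"].dt.to_period("M").eq(m).sum())
    headcount = int(active.sum())
    rows.append({
        "month": str(m),
        "headcount": headcount,
        "terminations": terms,
        "attrition_rate": round(terms / headcount, 3) if headcount else None,
    })
print(pd.DataFrame(rows))
```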
Qualifications:
+ Bachelor's degree with 9 years of prior experience in data engineering, or a graduate degree with 7 years of prior experience in data engineering. In lieu of a degree, a minimum of 13 years of prior related experience in data engineering.
+ 5+ years of experience in data engineering or HR analytics development, ideally in an HR or people analytics environment.
Preferred Additional Skills:
+ Expertise in building and maintaining pipelines, ETL/ELT processes and data models in Snowflake and HR Systems.
+ Ability to translate HR data into actionable workforce insights using Visier or Power BI.
+ Strong SQL, data transformation, and dashboard/reporting skills, knowledge of data governance, privacy and security.
+ Experience working in Agile development frameworks, managing sprints, backlogs and iterative delivery.
+ Strong attention to detail with ability to troubleshoot complex data issues and ensure accuracy
+ Effective stakeholder management and communication across HR, IT, BRM and analytics teams, ability to operate independently with ownership.
In compliance with pay transparency requirements, the salary range for this role in California, Massachusetts, New Jersey, Washington, and the Greater D.C., Denver, or NYC areas is $129,500-$240,500. The salary range for this role in Colorado state, Hawaii, Illinois, Maryland, Minnesota, New York state, and Vermont is $112,500-$209,500. This is not a guarantee of compensation or salary, as the final offer amount may vary based on factors including but not limited to experience and geographic location. L3Harris also offers a variety of benefits, including health and disability insurance, 401(k) match, flexible spending accounts, EAP, education assistance, parental leave, paid time off, and company-paid holidays. The specific programs and options available to an employee may vary depending on date of hire, schedule type, and the applicability of collective bargaining agreements.
#LI-Remote
#LI-NR1
L3Harris Technologies is proud to be an Equal Opportunity Employer. L3Harris is committed to treating all employees and applicants for employment with respect and dignity and maintaining a workplace that is free from unlawful discrimination. All applicants will be considered for employment without regard to race, color, religion, age, national origin, ancestry, ethnicity, gender (including pregnancy, childbirth, breastfeeding or other related medical conditions), gender identity, gender expression, sexual orientation, marital status, veteran status, disability, genetic information, citizenship status, characteristic or membership in any other group protected by federal, state or local laws. L3Harris maintains a drug-free workplace and performs pre-employment substance abuse testing and background checks, where permitted by law.
Please be aware many of our positions require the ability to obtain a security clearance. Security clearances may only be granted to U.S. citizens. In addition, applicants who accept a conditional offer of employment may be subject to government security investigation(s) and must meet eligibility requirements for access to classified information.
By submitting your resume for this position, you understand and agree that L3Harris Technologies may share your resume, as well as any other related personal information or documentation you provide, with its subsidiaries and affiliated companies for the purpose of considering you for other available positions.
L3Harris Technologies is an E-Verify Employer. Please click here for the E-Verify Poster in English (******************************************************************************************** or Spanish (******************************************************************************************** . For information regarding your Right To Work, please click here for English (****************************************************************************************** or Spanish (******************************************************************************************** .
$69k-89k yearly est. 29d ago
Consultant, Data Engineer
IBM 4.7
Cleveland, OH jobs
**Introduction** At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
**Your role and responsibilities**
We are in search of a skilled Consultant Data Engineer to join our expanding team of experts. This role will be pivotal in the design and development of Snowflake Data Cloud solutions, encompassing responsibilities such as constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols.
The ideal candidate brings experience as a proficient data pipeline builder and adept data wrangler, deriving satisfaction from optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer will play a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects.
This position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about the prospect of contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM, and IBM will be the hiring entity.
This role can be performed from anywhere in the US.
**Required technical and professional expertise**
* Bachelor's degree in engineering, computer science or equivalent area
* 3+ years in related technical roles with experience in data management, database development, ETL, and/or data prep domains.
* Experience developing data warehouses.
* Experience building ETL / ELT ingestion pipelines.
* Proficiency in using cloud platform services for data engineering tasks, including managed database services (Snowflake, including its pros and cons vs. Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow).
* Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance.
* Knowledge of how to manipulate, process and extract value from large disconnected datasets.
* SQL and Python scripting experience required; Scala and JavaScript are a plus.
* Cloud experience (AWS, Azure or GCP) is a plus.
* Knowledge of any of the following tools is also a plus: Snowflake, Matillion/Fivetran or DBT.
* Strong interpersonal skills including assertiveness and ability to build strong client relationships.
* Strong project management and organizational skills.
* Ability to support and work with cross-functional and agile teams in a dynamic environment.
* Advanced English required.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
$70k-92k yearly est. 22d ago
Senior Data Architect
Cognizant Technology Solutions 4.6
Oakland, CA jobs
About the role
As a Senior Data Architect, you will make an impact by implementing advanced data solutions within the company. With a focus on Informatica IDMC, Data Warehousing and Cloud technologies, the candidate will leverage their extensive experience to optimize data integration processes.
You will be a valued member of the HealthCare team and work collaboratively with cross-functional teams to enhance data compliance and transform scripting processes, impacting company growth and societal advancement.
In this role, you will:
* Design and lead scalable data warehousing solutions using Informatica IDMC and data lake concepts to support business intelligence.
* Implement and optimize data integration strategies with Informatica Cloud and Cloud Scheduler to ensure seamless, automated data flow.
* Develop and fine-tune SQL queries and Unix Shell scripts to enhance performance, efficiency, and system reliability.
* Collaborate with cross-functional teams and stakeholders to align data solutions with business objectives and translate requirements into technical specifications.
* Drive innovation and knowledge sharing by exploring new technologies, mitigating risks, and facilitating training to strengthen team capabilities.
Work Model:
We strive to provide flexibility wherever possible. Based on this role's business requirements, this is a remote position open to qualified applicants located on the West Coast of the United States. Regardless of your working arrangement, we are committed to supporting a healthy work-life balance through our various wellbeing programs.
What you need to have to be considered:
* Extensive experience in Data Warehousing and Data Lake concepts for scalable, efficient data solutions.
* Strong expertise in Informatica Cloud and Cloud DWH for seamless integration and management.
* Advanced SQL skills to optimize queries and enhance performance.
* Proficiency in Data Integration strategies ensuring reliable data flow across systems.
* Unix Shell Scripting skills to automate processes and improve efficiency.
Salary and Other Compensation:
Applications will be accepted until January 31, 2026.
The annual salary for this position is between $130,000 - $150,000 depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
Certifications Required
Informatica Cloud Data Integration Certification, SQL Certification
Benefits:
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
* Medical/Dental/Vision/Life Insurance
* Paid holidays plus Paid Time Off
* 401(k) plan and contributions
* Long-term/Short-term Disability
* Paid Parental Leave
* Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
* Please note, this role is not able to offer visa transfer or sponsorship now or in the future*
$130k-150k yearly 3d ago
Senior Data Architect
Cognizant 4.6
Oakland, CA jobs
**About the role**
As a Senior Data Architect, you will make an impact by implementing advanced data solutions within the company. With a focus on Informatica IDMC, Data Warehousing and Cloud technologies, the candidate will leverage their extensive experience to optimize data integration processes.
You will be a valued member of the HealthCare team and work collaboratively with cross-functional teams to enhance data compliance and transform scripting processes, impacting company growth and societal advancement.
**In this role, you will:**
· Design and lead scalable data warehousing solutions using Informatica IDMC and data lake concepts to support business intelligence.
· Implement and optimize data integration strategies with Informatica Cloud and Cloud Scheduler to ensure seamless, automated data flow.
· Develop and fine-tune SQL queries and Unix Shell scripts to enhance performance, efficiency, and system reliability.
· Collaborate with cross-functional teams and stakeholders to align data solutions with business objectives and translate requirements into technical specifications.
· Drive innovation and knowledge sharing by exploring new technologies, mitigating risks, and facilitating training to strengthen team capabilities.
**Work Model:**
We strive to provide flexibility wherever possible. Based on this role's business requirements, this is a remote position open to qualified applicants located on the West Coast of the United States. Regardless of your working arrangement, we are committed to supporting a healthy work-life balance through our various wellbeing programs.
**What you need to have to be considered:**
· Extensive experience in Data Warehousing and Data Lake concepts for scalable, efficient data solutions.
· Strong expertise in Informatica Cloud and Cloud DWH for seamless integration and management.
· Advanced SQL skills to optimize queries and enhance performance.
· Proficiency in Data Integration strategies ensuring reliable data flow across systems.
· Unix Shell Scripting skills to automate processes and improve efficiency.
**Salary and Other Compensation:**
Applications will be accepted until January 31, 2026.
The annual salary for this position is between $130,000 - $150,000 depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
**Certifications Required**
Informatica Cloud Data Integration Certification, SQL Certificatio
**Benefits:**
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
· Medical/Dental/Vision/Life Insurance
· Paid holidays plus Paid Time Off
· 401(k) plan and contributions
· Long-term/Short-term Disability
· Paid Parental Leave
· Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
***Please note, this role is not able to offer visa transfer or sponsorship now or in the future***
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
$130k-150k yearly 3d ago
Data Engineer
Cognizant 4.6
Columbus, OH jobs
We are seeking a motivated Data Engineer to join our Data Engineering team. The ideal candidate will have exposure to Python and SQL and will be responsible for designing, developing, and maintaining robust data pipelines for structured, semi-structured, and unstructured data. This role is ideal for someone passionate about building scalable data solutions and enabling advanced analytics across the organization.
**Key Responsibilities**
+ Design, develop, and optimize data pipelines using Python, Spark, and SQL (a minimal example follows this list).
+ Ingest, process, and analyze structured (e.g., relational databases), semi-structured (e.g., JSON, XML), and unstructured data (e.g., text, logs, images) from diverse sources.
+ Implement data quality checks, validation, and transformation logic to ensure data integrity and reliability.
+ Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
+ Develop and maintain data models, data dictionaries, and technical documentation.
+ Monitor, troubleshoot, and optimize data workflows for performance and scalability.
+ Ensure compliance with data governance, security, and privacy policies.
+ Support data migration, integration, and modernization initiatives, including cloud-based solutions (AWS, Azure, GCP).
+ Automate repetitive data engineering tasks and contribute to continuous improvement of data infrastructure.
+ Apply working knowledge of cloud platforms (AWS, Azure, GCP).
+ Follow data security, encryption, and compliance best practices.
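As a minimal illustration of the pipeline and data-quality items above, the following PySpark sketch ingests semi-structured JSON events, applies a basic null check, and writes a curated, de-duplicated copy. The bucket paths, the events data set, and its columns (event_id, event_ts) are hypothetical, and the schema is inferred for brevity.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Ingest semi-structured JSON from a hypothetical raw zone (schema inferred).
events = spark.read.json("s3://example-bucket/raw/events/")

# Basic data-quality check: required fields must be present and non-null.
clean = events.filter(F.col("event_id").isNotNull() & F.col("event_ts").isNotNull())
rejected = events.count() - clean.count()
print(f"rejected {rejected} rows that failed the null checks")

# Light transformation before landing the data in a curated zone.
curated = (
    clean
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)

curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```

In practice, checks like these would be expressed as reusable validation rules and surfaced through monitoring rather than ad-hoc prints, but the shape of the work is the same.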
**Required Skills & Qualifications**
+ Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
+ Good programming skills in Python and SQL
+ Good problem-solving, analytical, and communication skills
+ Ability to work collaboratively in a fast-paced environment
+ Familiarity with data orchestration tools (e.g., Airflow, Prefect); a minimal DAG sketch follows this list
+ Exposure to data lake and data warehouse architectures (e.g., Snowflake, Databricks, BigQuery)
+ Knowledge of containerization and CI/CD pipelines (e.g., Docker, Kubernetes, GitHub)
+ Familiarity with data visualization tools
+ Basic understanding of ETL/ELT concepts, data modeling, and data architecture
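For the orchestration item, here is a minimal Airflow 2.x DAG sketch: two placeholder tasks wired so that extraction must succeed before loading. The DAG id, schedule, and task bodies are illustrative only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies -- a real pipeline would call the warehouse or ETL tool.
def extract():
    print("pull new records from the source system")

def load():
    print("load validated records into the warehouse")

# A minimal daily DAG in which extract must finish before load starts.
with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```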
**Location**
New hires will be based at the Cognizant office in **Plano, TX or Teaneck, NJ**, where you will work alongside other experienced Cognizant associates delivering technology solutions. Applicants must be willing to relocate to this major geographic area. While we attempt to honor candidate location preferences, business needs and position availability will determine final location assignment.
**Start Date**
New hires will start in **January or February 2026**. While we will attempt to honor candidate start date preferences, business need and position availability will determine final start date assignment. Exact start date will be communicated with enough time for you to plan effectively.
**Salary and Other Compensation:**
Applications are accepted on an ongoing basis.
The annual salary for this position is $65,000.00 depending on experience and other qualifications of the successful candidate. This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
**Why Choose Us?**
Cognizant delivers solutions that draw upon the full power and scale of our associates. You will be supported by high-caliber experts and employ some of the most advanced and patented capabilities. Our associates' diverse backgrounds offer multifaceted perspectives and fuel new ways of thinking. We encourage lively discussions which inspire better results for our clients.
**Benefits**
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
+ Medical/Dental/Vision/Life Insurance
+ Paid holidays plus Paid Time Off
+ 401(k) plan and contributions
+ Long-term/Short-term Disability
+ Paid Parental Leave
+ Employee Stock Purchase Plan
**Disclaimer**
The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
**Work Authorization**
Due to the nature of this position, Cognizant cannot provide sponsorship for U.S. work authorization (including participation in a CPT/OPT program) for this role.
_Cognizant is always looking for top talent. We are searching for candidates to fill future needs within the business. This job posting represents potential future employment opportunities with Cognizant. Although the position is not currently available, we want to provide you with the opportunity to express your interest in future employment opportunities with Cognizant. If a job opportunity that you may be qualified for becomes available in the future, we will notify you. At that time you can determine whether you would like to apply for the specific open position. Thank you for your interest in Cognizant career opportunities._
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.