
Data engineer jobs in Oregon

- 1,660 jobs
  • Sr. Databricks Data Engineer

    Artech L.L.C. 3.4 company rating

    Data engineer job in Portland, OR

    We are seeking a highly skilled Databricks Data Engineer with a minimum of 10 years of total experience, including strong expertise in the retail industry. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. This role requires proficiency in Python, SQL, cloud platforms, and ETL tools within a retail-focused data ecosystem. Key Responsibilities: Design, develop, and maintain scalable data pipelines using Databricks and Snowflake. Work with Python libraries such as Pandas, NumPy, PySpark, PyOdbc, PyMsSQL, Requests, Boto3, SimpleSalesforce, and JSON for efficient data processing. Optimize and enhance SQL queries, stored procedures, triggers, and schema designs for RDBMS (MSSQL/MySQL) and NoSQL (DynamoDB/MongoDB/Redis) databases. Develop and manage REST APIs to integrate various data sources and applications. Implement AWS cloud solutions using AWS Data Exchange, Athena, CloudFormation, Lambda, S3, AWS Console, IAM, STS, EC2, and EMR. Utilize ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, and Alteryx to orchestrate and automate data workflows. Work with Hadoop and Hive for big data processing and analysis. Collaborate with cross-functional teams to understand business needs and develop efficient data solutions that drive decision-making in the retail domain. Ensure data quality, governance, and security across all data assets and pipelines. Required Qualifications: 10+ years of total experience in data engineering and data processing. 6+ years of hands-on experience in Python programming, specifically for data processing and analytics. 4+ years of experience working with Databricks and Snowflake. 4+ years of expertise in SQL development, performance tuning, and RDBMS/NoSQL databases. 4+ years of experience in designing and managing REST APIs. 2+ years of working experience in AWS data services. 2+ years of hands-on experience with ETL tools like Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx. 1+ year of experience with Hadoop and Hive. Strong understanding of retail industry data needs and best practices. Excellent problem-solving, analytical, and communication skills. Preferred Qualifications: Experience with real-time data processing and streaming technologies. Familiarity with machine learning and AI-driven analytics. Certifications in Databricks, AWS, or Snowflake. This is an exciting opportunity to work on cutting-edge data engineering solutions in a fast-paced retail environment. If you are passionate about leveraging data to drive business success and innovation, we encourage you to apply! (An illustrative Python ingest-and-clean sketch follows this listing.)
    $99k-141k yearly est. 5d ago
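    The description above names pandas and boto3 among the expected Python tooling. The sketch below is illustrative only, showing the general shape of an ingest-and-clean step such a pipeline might contain; the bucket, key, and column names are placeholder assumptions, not details from the posting.

```python
# Illustrative only: a minimal ingest-and-clean step of the kind the listing
# describes (boto3 + pandas). Bucket, key, and column names are placeholders.
import io

import boto3
import pandas as pd

def load_orders(bucket: str, key: str) -> pd.DataFrame:
    """Read a raw retail extract from S3 and apply basic cleaning."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))

    # Normalize column names and obvious data issues before loading downstream.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df

if __name__ == "__main__":
    # Placeholder bucket/key; requires AWS credentials to actually run.
    cleaned = load_orders("example-retail-bucket", "raw/orders.csv")
    print(cleaned.dtypes)
```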
  • Local to Portland, OR: Data Engineer

    It Motives

    Data engineer job in Portland, OR

    No C2C or Sponsorship. Data Engineer: Want to work within a local government school system and really make a difference? Our client is looking for an experienced Data Engineer who can design, develop, and support complex data integrations, pipelines, models, and cloud-based data systems within the district's enterprise data environment. You will serve as a technical expert in Snowflake engineering, ELT/ETL orchestration, and data quality automation. Does this sound like you? Then please apply! We value diversity in the workplace and encourage women, minorities, and veterans to apply. Thank you! Location: Portland, OR Type: 6-month contract Position details: A candidate for this position is expected to engineer, maintain, and optimize data systems and integrations that support equity-centered, data-informed decision-making. Develop, manage, and enhance secure, reliable, and scalable data pipelines and Snowflake-based data platforms that empower educators, central office staff, and school leaders. Ensure high-quality data availability, improve data accuracy, and apply modern data engineering practices that strengthen instructional and operational outcomes. Knowledge of: Advanced principles of data engineering, Snowflake architecture, data warehousing, and cloud-based data systems. SQL, dbt, ELT/ETL concepts, data modeling techniques, and API-driven integrations. Cloud environments (AWS/Azure) and data tools (Python, Workato, Airflow, Git). Data governance, metadata management, role-based access control, and FERPA requirements. Technical documentation and Agile development practices. Equity-centered data practices and culturally responsive communication. Ability to: Design, build, and maintain efficient, reliable data pipelines and Snowflake workloads. Analyze complex data structures, identify system issues, and implement solutions. Collaborate with cross-functional teams and communicate technical concepts to nontechnical audiences. Ensure data accuracy, security, and compliance with district and legal requirements. Support the District's Racial Educational Equity Policy and contribute to an inclusive work environment. Use programming languages and tools (Python, SQL, dbt, Git, etc.) to develop data solutions. Work independently, maintain confidentiality, and deliver high-quality customer service. Education: Bachelor's degree in computer science, information science, data engineering, or a closely related field. A master's degree in a related discipline is desirable. Experience: Three (3) or more years of professional experience in data engineering, data warehousing, database development, or cloud data platform operations, including experience with Snowflake or a closely related enterprise data warehouse technology. Experience with public-sector or K-12 education environments is preferred. Any combination of education and experience that provides the required knowledge and abilities may be considered. Representative duties: Design, build, and maintain scalable ELT/ETL pipelines into Snowflake, sourcing data from SIS, HR, Finance, transportation, assessment, vendor platforms, and other enterprise systems hosted on Microsoft SQL Servers. Develop and maintain Snowflake data models, schemas, tasks, streams, stored procedures, and transformation logic to support analytics, reporting, and regulatory needs. Implement and monitor data quality frameworks, validation rules, automated tests, and observability tools to ensure accuracy, completeness, and reliability of district data. 
Collaborate with data architects, analysts, software developers, and cross-departmental stakeholders to translate business requirements into scalable data engineering solutions. Optimize Snowflake performance, including warehouse sizing, query tuning, clustering, and cost management to ensure efficient and cost-effective operations. Manage integrations using tools such as dbt, Workato, SQL, Python scripts, APIs, and cloud-native services; monitor workflows and resolve data pipeline issues. Ensure adherence to data governance policies, privacy requirements, and security standards including FERPA, role-based access, and metadata documentation. Support the development and implementation of districtwide data strategies, standards, and best practices. Analyze complex data issues, troubleshoot system errors, and perform root-cause analysis to identify long-term solutions. Demonstrate a commitment to the Racial Equity and Social Justice framework in daily practices and data governance decisions. Maintain current knowledge of Snowflake capabilities, cloud data engineering trends, emerging technologies, and industry best practices; participate in professional development. (An illustrative data-quality check sketch in Python follows this listing.)
    $84k-118k yearly est. 4d ago
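    The posting above emphasizes dbt, validation rules, and automated data-quality tests. As a rough illustration, the pandas sketch below mirrors the kinds of checks dbt expresses declaratively (not_null, unique, accepted_values); the table and column names are placeholder assumptions, not district schema details.

```python
# Illustrative only: pandas analogues of the "not_null", "unique", and
# "accepted_values" checks that dbt tests typically express declaratively.
# The frame and column names are placeholders, not district schema details.
import pandas as pd

def run_quality_checks(students: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures."""
    failures = []

    if students["student_id"].isna().any():
        failures.append("student_id contains nulls")
    if students["student_id"].duplicated().any():
        failures.append("student_id is not unique")

    valid_statuses = {"active", "inactive", "transferred"}
    bad = set(students["enrollment_status"].dropna()) - valid_statuses
    if bad:
        failures.append(f"unexpected enrollment_status values: {sorted(bad)}")

    return failures

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"student_id": [1, 2, 2], "enrollment_status": ["active", "Active", None]}
    )
    for failure in run_quality_checks(sample):
        print("FAIL:", failure)
```

    In a real Snowflake/dbt setup these rules would normally live as declarative dbt tests run by the orchestrator rather than ad hoc Python; the sketch only shows the logic they encode.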
  • Software Engineer Qualtrics

    Mainz Brady Group

    Data engineer job in Beaverton, OR

    HYBRID ONSITE IN BEAVERTON, OR! MUST HAVE QUALTRICS EXPERIENCE. We're seeking a skilled and experienced Software Engineer who specializes in Qualtrics. This role will be part of a high-visibility, high-impact initiative to optimize and expand our Qualtrics environment. You'll play a key role in designing, developing, and maintaining scalable solutions that enhance user experience, streamline data collection, and improve reporting accuracy. The ideal candidate has a strong background in Qualtrics architecture, API integrations, and automation, plus a passion for creating efficient, user-friendly tools that empower teams to make data-driven decisions. What we're looking for: 3+ years of hands-on Qualtrics engineering or development experience Strong understanding of survey logic, workflows, APIs, and automation Experience with data visualization and analytics tools (Tableau, Power BI, etc.) Background in software engineering (JavaScript, Python, or similar) Ability to partner cross-functionally with researchers, analysts, and product teams. (An illustrative Qualtrics API call sketch follows this listing.)
    $77k-108k yearly est. 4d ago
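    The role above centers on Qualtrics API integrations. The sketch below is a minimal, illustrative call against the public Qualtrics v3 REST API to list surveys; the datacenter ID and token are placeholders, and a production integration would add pagination and error handling.

```python
# Illustrative only: listing surveys through the public Qualtrics v3 REST API.
# The datacenter ID and token are placeholders; real integrations would also
# handle pagination and error cases.
import os

import requests

DATACENTER = "yul1"  # placeholder datacenter ID
BASE_URL = f"https://{DATACENTER}.qualtrics.com/API/v3"

def list_surveys(api_token: str) -> list[dict]:
    response = requests.get(
        f"{BASE_URL}/surveys",
        headers={"X-API-TOKEN": api_token},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["result"]["elements"]

if __name__ == "__main__":
    for survey in list_surveys(os.environ["QUALTRICS_API_TOKEN"]):
        print(survey["id"], survey["name"])
```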
  • Data Scientist 4

    Lam Research 4.6 company rating

    Data engineer job in Tualatin, OR

    Develop tools, metric measurement and assessment methods for performance management and predictive modeling. Develop dashboards for product management and executives to drive faster and better decision making. Create accountability models for DPG-wide quality, I&W, inventory, product management KPIs and business operations. Improve DPG-wide quality, install and warranty, and inventory performance consistently from awareness, prioritization, and action through the availability of common data. Collaborate with quality, install and warranty, and inventory program managers to analyze trends and patterns in data that drive required improvement in key performance indicators (KPIs). Foster growth and utility of Cost of Quality within the company through correlation of I&W data, ECOGS, identification of causal relationships for quality events and discovery of hidden costs throughout the network. Improve data utilization via AI and automation leading to real-time resolution and speeding systemic action. Lead and/or advise on multiple projects simultaneously and demonstrate organizational, prioritization, and time management proficiencies. Bachelor's degree with 8+ years of experience; or master's degree with 5+ years' experience; or equivalent experience. Basic understanding of AI and machine learning and the ability to work with data scientists to use AI to solve complex, challenging problems leading to efficiency and effectiveness improvements. Ability to define problem statements and objectives, develop an analysis approach, and execute the analysis. Basic knowledge of Lean Six Sigma processes, statistics, or quality systems experience. Ability to work on multiple problems simultaneously. Ability to present conclusions and recommendations to executive audiences. Ownership mindset to drive solutions and positive outcomes. Excellent communication and presentation skills with the ability to present to audiences at multiple levels in the Company. Willingness to adapt best practices via benchmarking. Experience in Semiconductor fabrication, Semiconductor Equipment Operations, or related industries is a plus. Demonstrated ability to change process and methodologies for capturing and interpreting data. Demonstrated success in using structured problem-solving methodologies and quality tools to solve complex problems. Knowledge of programming environments such as Python, R, Matlab, SQL or equivalent. Experience in structured problem-solving methodologies such as PDCA, DMAIC, 8D and quality tools. Our commitment We believe it is important for every person to feel valued, included, and empowered to achieve their full potential. By bringing unique individuals and viewpoints together, we achieve extraordinary results. Lam is committed to and reaffirms support of equal opportunity in employment and non-discrimination in employment policies, practices and procedures on the basis of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex (including pregnancy, childbirth and related medical conditions), gender, gender identity, gender expression, age, sexual orientation, or military and veteran status or any other category protected by applicable federal, state, or local laws. It is the Company's intention to comply with all applicable laws and regulations. Company policy prohibits unlawful discrimination against applicants or employees. (An illustrative KPI trend-monitoring sketch in Python follows this listing.)
Lam offers a variety of work location models based on the needs of each role. Our hybrid roles combine the benefits of on-site collaboration with colleagues and the flexibility to work remotely and fall into two categories - On-site Flex and Virtual Flex. 'On-site Flex' you'll work 3+ days per week on-site at a Lam or customer/supplier location, with the opportunity to work remotely for the balance of the week. 'Virtual Flex' you'll work 1-2 days per week on-site at a Lam or customer/supplier location, and remotely the rest of the time.
    $71k-91k yearly est. 2d ago
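    The posting above asks for KPI dashboards and trend analysis in environments such as Python or R. The sketch below is a minimal illustration of one such check, flagging unusual weekly values with a rolling mean/standard-deviation rule; the data, window, and threshold are placeholder assumptions.

```python
# Illustrative only: flagging unusual weekly warranty-cost values with a simple
# rolling mean/std rule, the kind of trend check a KPI dashboard might surface.
# The data and threshold are placeholders.
import pandas as pd

def flag_anomalies(weekly_cost: pd.Series, window: int = 8, z: float = 2.0) -> pd.DataFrame:
    rolling_mean = weekly_cost.rolling(window).mean()
    rolling_std = weekly_cost.rolling(window).std()
    zscore = (weekly_cost - rolling_mean) / rolling_std
    return pd.DataFrame(
        {"cost": weekly_cost, "zscore": zscore, "anomaly": zscore.abs() > z}
    )

if __name__ == "__main__":
    costs = pd.Series(
        [100, 98, 103, 101, 99, 102, 100, 97, 160, 101, 99],
        index=pd.date_range("2024-01-07", periods=11, freq="W"),
    )
    report = flag_anomalies(costs)
    print(report[report["anomaly"]])
```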
  • Data Scientist

    Western Digital 4.4 company rating

    Data engineer job in Salem, OR

    ** At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible. At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we've been doing just that-our technology helped people put a man on the moon and capture the first-ever picture of a black hole. We offer an expansive portfolio of technologies, HDDs, and platforms for business, creative professionals, and consumers alike under our Western Digital , WD , WD_BLACK, and SanDisk Professional brands. We are a key partner to some of the largest and highest-growth organizations in the world. From enabling systems to make cities safer and more connected, to powering the data centers behind many of the world's biggest companies and hyperscale cloud providers, to meeting the massive and ever-growing data storage needs of the AI era, Western Digital is fueling a brighter, smarter future. Today's exceptional challenges require your unique skills. Together, we can build the future of data storage. **Job Description** ESSENTIAL DUTIES AND RESPONSIBILITIES + **Business Partnership & Consulting** + Serve as the primary analytics partner to HR and business leaders, understanding their challenges and translating them into analytical solutions. + Provide insights and recommendations that inform decisions on talent strategy, workforce planning, retention, and employee experience. + Build strong relationships with HRBPs, COEs, and leadership teams to ensure alignment on priorities. + Experience advising, presenting to, and serving as a thought partner to senior executives. + **Analytics & Insights** + Develop dashboards, reports, and analyses on workforce metrics (e.g., attrition, DEI, engagement, recruiting, performance). + Translate complex data into clear, actionable insights with strong storytelling and visualization. + Deliver executive-ready materials that connect people data to business outcomes. + Partner cross-functionally with analytics and technical teams to ensure data accuracy, resolve quality issues, and maintain consistent, reliable insights. + **Advanced People Analytics** + Use statistical analysis, predictive modeling, and trend forecasting to identify workforce risks and opportunities. + Partner with HR Technology and Data teams to enhance data quality, governance, and reporting capabilities. + Lead initiatives to evolve people analytics from descriptive to predictive and prescriptive insights. + **Strategy & Enablement** + Guide stakeholders in building a data-driven culture within HR and across the business. + Drive adoption of self-service analytics platforms and democratize access to people insights. **Qualifications** REQUIRED + **Education & Experience** + Bachelor's or Master's in HR, Business, Data Analytics, Industrial/Organizational Psychology, Statistics, or a related field. + 6+ years of experience in People Analytics, HR Analytics, Workforce Planning, or related fields. SKILLS + **Technical Skills** + Strong expertise in data visualization tools (e.g., Tableau, Power BI, Workday People Analytics, Visier). + Advanced Excel, SQL, or Python/R for data analysis preferred. + Understanding of HR systems (Workday, SuccessFactors, etc.) and data structures. + **Business & Consulting Skills** + Exceptional ability to translate data into business insights and recommendations. + Strong stakeholder management, influencing, and storytelling skills. 
+ Experience in partnering with senior leaders to drive data-informed decisions **Additional Information** Western Digital is committed to providing equal opportunities to all applicants and employees and will not discriminate against any applicant or employee based on their race, color, ancestry, religion (including religious dress and grooming standards), sex (including pregnancy, childbirth or related medical conditions, breastfeeding or related medical conditions), gender (including a person's gender identity, gender expression, and gender-related appearance and behavior, whether or not stereotypically associated with the person's assigned sex at birth), age, national origin, sexual orientation, medical condition, marital status (including domestic partnership status), physical disability, mental disability, medical condition, genetic information, protected medical and family care leave, Civil Air Patrol status, military and veteran status, or other legally protected characteristics. We also prohibit harassment of any individual on any of the characteristics listed above. Our non-discrimination policy applies to all aspects of employment. We comply with the laws and regulations set forth in the "Know Your Rights: Workplace Discrimination is Illegal (************************************************************************************** " poster. Our pay transparency policy is available here (****************************************************** . Western Digital thrives on the power and potential of diversity. As a global company, we believe the most effective way to embrace the diversity of our customers and communities is to mirror it from within. We believe the fusion of various perspectives results in the best outcomes for our employees, our company, our customers, and the world around us. We are committed to an inclusive environment where every individual can thrive through a sense of belonging, respect and contribution. Western Digital is committed to offering opportunities to applicants with disabilities and ensuring all candidates can successfully navigate our careers website and our hiring process. Please contact us at jobs.accommodations@wdc.com to advise us of your accommodation request. In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying. Based on our experience, we anticipate that the application deadline will be **12/2/2025** (3 months from posting), although we reserve the right to close the application process sooner if we hire an applicant for this position before the application deadline. If we are not able to hire someone from this role before the application deadline, we will update this posting with a new anticipated application deadline. \#LI- VV1 **Compensation & Benefits Details** + An employee's pay position within the salary range may be based on several factors including but not limited to (1) relevant education; qualifications; certifications; and experience; (2) skills, ability, knowledge of the job; (3) performance, contribution and results; (4) geographic location; (5) shift; (6) internal and external equity; and (7) business and organizational needs. + The salary range is what we believe to be the range of possible compensation for this role at the time of this posting. 
We may ultimately pay more or less than the posted range and this range is only applicable for jobs to be performed in California, Colorado, New York or remote jobs that can be performed in California, Colorado and New York. This range may be modified in the future. + If your position is non-exempt, you are eligible for overtime pay pursuant to company policy and applicable laws. You may also be eligible for shift differential pay, depending on the shift to which you are assigned. + You will be eligible to be considered for bonuses under **either** Western Digital's Short Term Incentive Plan ("STI Plan") or the Sales Incentive Plan ("SIP") which provides incentive awards based on Company and individual performance, depending on your role and your performance. You may be eligible to participate in our annual Long-Term Incentive (LTI) program, which consists of restricted stock units (RSUs) or cash equivalents, pursuant to the terms of the LTI plan. Please note that not all roles are eligible to participate in the LTI program, and not all roles are eligible for equity under the LTI plan. RSU awards are also available to eligible new hires, subject to Western Digital's Standard Terms and Conditions for Restricted Stock Unit Awards. + We offer a comprehensive package of benefits including paid vacation time; paid sick leave; medical/dental/vision insurance; life, accident and disability insurance; tax-advantaged flexible spending and health savings accounts; employee assistance program; other voluntary benefit programs such as supplemental life and AD&D, legal plan, pet insurance, critical illness, accident and hospital indemnity; tuition reimbursement; transit; the Applause Program; employee stock purchase plan; and the Western Digital Savings 401(k) Plan. + **Note:** No amount of pay is considered to be wages or compensation until such amount is earned, vested, and determinable. The amount and availability of any bonus, commission, benefits, or any other form of compensation and benefits that are allocable to a particular employee remains in the Company's sole discretion unless and until paid and may be modified at the Company's sole discretion, consistent with the law. **Notice To Candidates:** Please be aware that Western Digital and its subsidiaries will never request payment as a condition for applying for a position or receiving an offer of employment. Should you encounter any such requests, please report it immediately to Western Digital Ethics Helpline (******************************************************************** or email ****************** .
    $86k-115k yearly est. 7d ago
  • Data Scientist (Technical Leadership)

    Meta 4.8 company rating

    Data engineer job in Salem, OR

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond. **Required Skills:** Data Scientist (Technical Leadership) Responsibilities: 1. Work with complex data sets to solve challenging problems using analytical and statistical approaches 2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies 3. Identify and measure success through goal setting, forecasting, and monitoring key metrics 4. Partner with cross-functional teams to inform and execute product strategy and investment decisions 5. Build long-term vision and strategy for programs and products 6. Collaborate with executives to define and develop data platforms and instrumentation 7. Effectively communicate insights and recommendations to stakeholders 8. Define success metrics, forecast changes, and set team goals 9. Support developing roadmaps and coordinate analytics efforts across teams **Minimum Qualifications:** 10. Bachelor's degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research) and 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab) 11. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development 12. Experience with predictive modeling, machine learning, and experimentation/causal inference methods 13. Experience communicating complex technical topics in a clear, precise, and actionable manner **Preferred Qualifications:** 14. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy 15. Master's or Ph.D. degree in a quantitative field 16. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research) 17. 10+ years of experience doing complex quantitative analysis in product analytics **Public Compensation:** $206,000/year to $281,000/year + bonus + equity + benefits **Industry:** Internet **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. (An illustrative experiment-analysis sketch in Python follows this listing.)
Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
    $206k-281k yearly 60d+ ago
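    The minimum qualifications above include experimentation and causal inference. As a rough illustration of that building block, the sketch below estimates the lift between two experiment groups and tests it with a Welch t-test; the data are simulated placeholders, not a description of Meta's tooling.

```python
# Illustrative only: a minimal analysis of a two-group experiment (difference in
# means with a Welch t-test), the sort of building block behind the
# experimentation work described above. The data are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=10.0, scale=3.0, size=5000)    # simulated metric, control group
treatment = rng.normal(loc=10.2, scale=3.0, size=5000)  # simulated metric, treatment group

lift = treatment.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"estimated lift: {lift:.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```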
  • Urgent ETL Architect

    System Canada Technologies

    Data engineer job in Oregon

    SCT resources have a broad range of skills in different technologies. The large skill-set has been made possible by a conscious focus on strengthening our skills base. Every person selected for our team brings something new, something that adds to our offerings. We learn continuously, both on the job and through formal training programs. Company: SystemCanada | ******************** Worldwide offices: Canada - USA - Australia - UK - South Africa - New Zealand - Ireland - Japan All resumes can be sent directly to [email protected] Candidate must have legal work authorization to work in the US. Job Title: ETL Architect Duration: 6 months (extendable) Location: Portland, OR Job Description: • Minimum of 8+ years of experience in ETL design/development using Informatica ETL solutions • Very good knowledge of Oracle, SQL, UNIX and Shell Scripting etc. Knowledge of other databases like DB2 and Teradata is good to have • Design, implement, and develop ETL Informatica processes against high-volume and complex data sources • Knowledge of Cognos reporting • Perform Quality Assurance, unit and system testing as well as complete documentation of deliverables • Experience designing and documenting ETL and performance tuning • Need to have good knowledge in EME, multifile system • Able to troubleshoot and create customized jobs • Experience with data cleansing and data quality processes
    $97k-131k yearly est. 60d+ ago
  • Junior Data Scientist

    Leo 3.2 company rating

    Data engineer job in Oregon

    Looking for an extremely bright and budding Data Scientist who can work closely with a solid Engineering core team. KEY RESPONSIBILITIES: Provide data-based solutions to core problems with the help of AI and Machine Learning tools. Take ownership of building data modelling pipelines for scalable and continuous systems. Closely monitor and provide expertise on creating industry best Data Science curriculum and creating exciting project problem statements. KEY SKILLS: Sharp problem-solving skills with urgency to deliver best quality products along with an alignment with long-term vision of the company Very good hands-on skills with Python and SQL (PostgreSQL) Good working knowledge of backend development in REST Frameworks (Django) Good knowledge and hands-on with different Machine Learning tools and modelling frameworks (Pandas, Keras/TensorFlow, scikit-learn, NLP) Excellent interpersonal skills to communicate and present ideas to different verticals and stakeholders. Good written and verbal communication skills. Working knowledge of Data pipeline and data-science model deployment Bonus Points: * Ability to write great documentation Ability to make data-driven decisions for any small thing Our Way Of Working * An opportunity to work on something that really matters. A fast-paced environment to learn and grow. High transparency in decision making. High autonomy; freedom to take risks, to experiment, and to fail. We promise a meaningful journey with smart people, with opportunities to learn & grow. Plus, you can sleep peacefully knowing you are impacting lives in a big way, every day!
    $89k-124k yearly est. 60d+ ago
  • Data Scientist, Generative AI

    Amira Learning 3.8 company rating

    Data engineer job in Oregon

    REMOTE / FULL TIME Amira Learning accelerates literacy outcomes by delivering the latest reading and neuroscience with AI. As the leader in third-generation edtech, Amira listens to students read out loud, assesses mastery, helps teachers supplement instruction and delivers 1:1 tutoring. Validated by independent university and SEA efficacy research, Amira is the only AI literacy platform proven to achieve gains surpassing 1:1 human tutoring, consistently delivering effect sizes over 0.4. Rooted in over thirty years of research, Amira is the first, foremost, and only proven Intelligent Assistant for teachers and AI Reading Tutor for students. The platform serves as a school district's Intelligent Growth Engine, driving instructional coherence by unifying assessment, instruction, and tutoring around the chosen curriculum. Unlike any other edtech tool, Amira continuously identifies each student's skill gaps and collaborates with teachers to build lesson plans aligned with district curricula, pulling directly from the district's high-quality instructional materials. Teachers can finally differentiate instruction with evidence and ease, and students get the 1:1 practice they specifically need, whether they are excelling or working below grade level. Trusted by more than 2,000 districts and working in partnership with twelve state education agencies, Amira is helping 3.5 million students worldwide become motivated and masterful readers. About this role: We are seeking a Data Scientist with expertise in the domain of reading science, education, literacy, and NLP; with practical experience building and utilizing Gen AI (LLM, image, and/or video) models. You will help to create Gen AI based apps that will power the most widely used Intelligent Assistant in U.S. schools, already helping more than 2 million children. We are looking for strong, education focused engineers who have a background using the latest generative AI models, with experience in areas such as prompt engineering, model evaluation; data processing for training and fine-tuning; model alignment, and human-feedback-based model training. Responsibilities include: * Design methods, tools, and infrastructure to enable Amira to interact with students and educators in novel ways. * Define approaches to content creation that will enable Amira to safely assist students to build their reading skills. This includes defining internal pipelines to interact with our content team. * Contribute to experiments, including designing experimental details and hypothesis testing, writing reusable code, running evaluations, and organizing and presenting results. * Work hands on with large, complex codebases, contributing meaningfully to enhance the capabilities of the machine learning team. * Work within a fully distributed (remote) team. * Find mechanisms for enabling the use of the Gen AI to be economically viable given the limited budgets of public schools. Who You Are: * You have a background in early education, reading science, literacy, and/or NLP. * You have at least one year of experience working with LLMs and Gen AI models. * You have a degree in computer science or a related technical area. * You are a proficient Python programmer. * You have created performant Machine Learning models. * You want to continue to be hands-on with LLMs and other Gen AI models over the next few years. * You have a desire to be at a Silicon Valley start-up, with the desire and commitment that requires. 
* You are able to enjoy working on a remote, distributed team and are a natural collaborator. * You love writing code - creating good products means a lot to you. Working is fun - not a passport to get to the next weekend. Qualifications * Bachelor's degree, and/or relevant experience * 1+ years of Gen AI experience - preferably in the Education SaaS industry * Ability to operate in a highly efficient manner by multitasking in a fast-paced, goal-oriented environment. * Exceptional organizational, analytical, and detail-oriented thinking skills. * Proven track record of meeting/exceeding goals and targets. * Great interpersonal, written and oral communication skills. * Experience working across remote teams. Amira's Culture * Flexibility - We encourage and support you to live and work where you desire. Amira works as a truly distributed team. We worked remotely before COVID and we'll be working remotely after the pandemic is long gone. Our office is Slack. Our coffee room is Zoom. Our team works hard but we work when we want, where we want. * Collaboration - We work together closely, using collaborative tools and periodic face to face get togethers. We believe great software is like movie-making. Lots of talented people with very different skills have to band together to build a great experience. * Lean & Agile -- We believe in ownership and continuous feedback. Yes, we employ Scrum ceremonies. But, what we're really after is using data and learning to be better and to do better for our teachers, students, and players. * Mission-Driven - What's important to us is helping kids. We're about tangible, measured impact. Benefits: * Competitive Salary * Medical, dental, and vision benefits * 401(k) with company matching * Flexible time off * Stock option ownership * Cutting-edge work * The opportunity to help children around the world reach their full potential Commitment to Diversity: Amira Learning serves a diverse group of students and educators across the United States and internationally. We believe every student should have access to a high-quality education and that it takes a diverse group of people with a wide range of experiences to develop and deliver a product that meets that goal. We are proud to be an equal opportunity employer. The posted salary range reflects the minimum and maximum base salary the company reasonably expects to pay for this role. Salary ranges are determined by role, level, and location. Individual pay is based on location, job-related skills, experience, and relevant education or training. We are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, ancestry, national origin, sex, sexual orientation, gender identity or expression, age, disability, medical condition, pregnancy, genetic information, marital status, military service, or any other status protected by law.
    $89k-124k yearly est. 31d ago
  • Data Scientist, NLP

    Datavant

    Data engineer job in Salem, OR

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care. By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare. We are looking for a motivated Data Scientist to help Datavant revolutionize the healthcare industry with AI. This is a critical role where the right candidate will have the ability to work on a wide range of problems in the healthcare industry with an unparalleled amount of data. You'll join a team focused on deep medical document understanding, extracting meaning, intent, and structure from unstructured medical and administrative records. Our mission is to build intelligent systems that can reliably interpret complex, messy, and high-stakes healthcare documentation at scale. This role is a unique blend of applied machine learning, NLP, and product thinking. You'll collaborate closely with cross-functional teams to: + Design and develop models to extract entities, detect intents, and understand document structure + Tackle challenges like long-context reasoning, layout-aware NLP, and ambiguous inputs + Evaluate model performance where ground truth is partial, uncertain, or evolving + Shape the roadmap and success metrics for replacing legacy document processing systems with smarter, scalable solutions We operate in a high-trust, high-ownership environment where experimentation and shipping value quickly are key. If you're excited by building systems that make healthcare data more usable, accurate, and safe, please reach out. **Qualifications** + 3+ years of experience with data science and machine learning in an industry setting, particularly in designing and building NLP models. + Proficiency with Python + Experience with the latest in language models (transformers, LLMs, etc.) + Proficiency with standard data analysis toolkits such as SQL, Numpy, Pandas, etc. + Proficiency with deep learning frameworks like PyTorch (preferred) or TensorFlow + Industry experience shepherding ML/AI projects from ideation to delivery + Demonstrated ability to influence company KPIs with AI + Demonstrated ability to navigate ambiguity **Bonus Experience** + Experience with document layout analysis (using vision, NLP, or both). + Experience with Spark/PySpark + Experience with Databricks + Experience in the healthcare industry **Responsibilities** + Play a key role in the success of our products by developing models for document understanding tasks. + Perform error analysis, data cleaning, and other related tasks to improve models. + Collaborate with your team by making recommendations for the development roadmap of a capability. + Work with other data scientists and engineers to optimize machine learning models and insert them into end-to-end pipelines. + Understand product use-cases and define key performance metrics for models according to business requirements. 
+ Set up systems for long-term improvement of models and data quality (e.g. active learning, continuous learning systems, etc.). **After 3 Months, You Will...** + Have a strong grasp of technologies upon which our platform is built. + Be fully integrated into ongoing model development efforts with your team. **After 1 Year, You Will...** + Be independent in reading literature and doing research to develop models for new and existing products. + Have ownership over models internally, communicating with product managers, customer success managers, and engineers to make the model and the encompassing product succeed. + Be a subject matter expert on Datavant's models and a source from which other teams can seek information and recommendations. \#LI-BC1 We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $136,000-$170,000 USD To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship. Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay. At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. 
(We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way. Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis. For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
    $136k-170k yearly 10d ago
  • DATA Engineer

    Infinity Outsourcing

    Data engineer job in Oregon

    Requisition Title: DATA Engineer Duration: 3-6 Months Pay Rate: $55-70/hr W2 / C2C Client: Manufacturers Bank Experience: 7+ years Domain - Actimize on Cloud Experience in any Compliance Technology (AML Transaction Monitoring, CDD, Sanctions Screening, etc.) Experience with Actimize SAM on Cloud is a BIG PLUS, or Actimize on Prem Experience with any of the core banking platforms - FIS, Mission Lane, Fircosoft Experience with GCP/Azure, ETL, Operations Data Store (ODS) Must have skills: Experience in any Compliance Technology (AML Transaction Monitoring, CDD, Sanctions Screening, etc.) Experience with Actimize SAM on Cloud is a BIG PLUS, or Actimize on Prem Nice to have skills: Experience with GCP/Azure, ETL, Operations Data Store (ODS) Experience with any of the core banking platforms - FIS, Mission Lane, Fircosoft Role: DATA Engineer + (Cloud or GCP) + Any compliance technology + Actimize Compliance Technology (AML Transaction Monitoring, CDD, Sanctions Screening, etc.) JOB DESCRIPTION: This urgent position needs immediate attention and profiles; details are as mentioned below. The highlighted requirements are mandatory: we need a Data Engineer with any of the below compliance technology experience and GCP. Senior Specialist (Data Engineering) role
    $55-70 hourly 60d+ ago
  • Data Engineer (US)

    Butter Payments

    Data engineer job in Myrtle Point, OR

    At Butter Payments, we're on a mission to eliminate involuntary churn and make recurring payments seamless. Every year, billions of dollars are lost due to failed payments. Butter leverages machine learning, deep financial data partnerships, and behavioral insights to ensure the right payments go through at the right time, without friction. We're backed by world-class investors like Atomic, Norwest Venture Partners, SpringTride, Transpose Platform, and we're growing fast. The Problem Statement: You will work closely with Software Engineering, Machine Learning Engineers and Data Analysts. You help ingest, organize and enable Butter to continue to deliver value on the data we collect. We make sure it arrives in a timely manner, is organized and usable, and is of a high quality. Key to all the work Butter does. Problem Expanded: We ingest 3rd-party data from multiple payment providers, such as Stripe, clean it and normalize it for our schema and machine learning pipelines. We're seeking to build transformation and validation layers as far upstream as possible to ensure a smooth flow of data through our system. The transformation layer will make the data easier to work with for our reporting products and ML models, while the validation layer will ensure the data conforms to our expectations. For example: Is it null, NULL, or 'null'? Is 342 a valid country abbreviation code? As we expand our product offering and ingest data from additional companies and 3rd-party providers the complexity of the challenge will evolve over time, keeping the problem fresh. Scope: You'll get to architect our system and lay the foundation for the future from both a technology and a system design perspective. No longer will data show up without being tested and structured, as you'll create a system that checks its worst tendencies. You'll work closely with our ML and Eng team to ensure the design meets their requirements and that data properly flows through the system. Philosophies: * You strongly believe that action creates information. * You want to work on a small team and have lots of responsibility. * You look forward to being scrappy and enjoy overcoming challenges. Requirements: * 5+ years of experience delivering value through data at an early stage, high growth startup (ideally within the payments industry) * Prior experience with cloud environments such as AWS or GCP * Strong Data Modeling experience, and familiarity with different approaches (Intermediate and up) * Strong SQL skills * Strong Python skills * Prior experience with workflow orchestration tools (Prefect, Airflow, Dagster) * Experience with distributed systems workflows (Temporal, AWS Step Functions) * Experience with cloud OLAP providers and optimization (Snowflake, Redshift, BigQuery) * Experience with Kafka pub/sub patterns and data ingestion * Experience with data transformation patterns and common tools (dbt) $145,000 - $165,000 a year (USD) We are focused on building a diverse and inclusive workforce. If you're excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply. Butter Payments is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Butter considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance. Please review our CCPA policies here. (An illustrative payload-validation sketch in Python follows this listing.)
    $145k-165k yearly 50d ago
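    The problem statement above gives concrete examples of the validation layer ("Is it null, NULL, or 'null'? Is 342 a valid country abbreviation code?"). The sketch below is an illustrative take on that idea; the field names, country list, and rules are placeholder assumptions, not Butter's actual schema.

```python
# Illustrative only: the kind of upstream validation/normalization layer the
# posting describes, e.g. collapsing null-like strings and rejecting invalid
# country codes. The field names and code list are placeholder assumptions.
NULL_LIKE = {"", "null", "none", "n/a"}
VALID_COUNTRIES = {"US", "CA", "GB", "DE", "FR"}  # placeholder subset of ISO 3166-1 alpha-2

def normalize(value):
    """Map the many spellings of 'missing' ('null', 'NULL', ' ') to None."""
    if value is None:
        return None
    text = str(value).strip()
    return None if text.lower() in NULL_LIKE else text

def validate_payment(record: dict) -> list[str]:
    """Return validation errors for a single normalized payment record."""
    errors = []
    country = normalize(record.get("country"))
    if country is not None and country.upper() not in VALID_COUNTRIES:
        errors.append(f"invalid country code: {country!r}")
    amount = normalize(record.get("amount_cents"))
    if amount is None or not str(amount).isdigit():
        errors.append(f"amount_cents is missing or non-numeric: {amount!r}")
    return errors

if __name__ == "__main__":
    print(validate_payment({"country": "342", "amount_cents": "NULL"}))
```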
  • Data Engineer

    Rapinno Tech

    Data engineer job in Oregon

    SUMMARY: We are a community of data scientists and engineers that work to improve the shopping experiences of our customers, both in-store and online. We leverage large volumes of data to develop powerful sciences using cutting-edge machine learning, natural language processing, and mathematical techniques to improve products, processes, and systems that positively impact our customers. QUALIFICATIONS, SKILLS, AND EXPERIENCE: Strong organizational and project management skills; Python (basic to intermediate); Azure cloud (basic); Databricks (basic, or the ability to quickly learn how to work within Databricks); Jupyter notebooks (basic to intermediate); Git version control (basic to intermediate); Linux (basic comfort working on the command line); ability to edit JSON files; good interpersonal and communication skills; 1 or more years of experience working within a highly technical team. Key Responsibilities: Schedule and execute the templated jobs required to optimize pricing for each grocery category Move data files from cloud environment to on-prem servers Edit existing code, where necessary, to establish pricing-specific business rules provided by Kroger stakeholders Load data files into a Dash-based interactive dashboard used by EPP and Kroger stakeholders to facilitate pricing optimization Run batch scripts Collaborate with 84.51° EPP consultants to appropriately deploy the pricing optimization solution within each scheduled grocery category; attend meetings, intake business requirements, translate requirements to code, align on and complete action items Adhere to stringent quality assurance and documentation standards (e.g., git, markdown, Confluence). (An illustrative JSON-edit and file-staging sketch in Python follows this listing.)
    $84k-118k yearly est. 60d+ ago
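    The responsibilities above include editing JSON business-rule files and moving data files between environments. The sketch below illustrates both tasks in plain Python; the file paths, rule names, and values are placeholder assumptions, not 84.51° or Kroger specifics.

```python
# Illustrative only: editing a JSON business-rules file and staging a data file,
# two of the routine tasks the posting describes. File paths, rule names, and
# values are placeholder assumptions.
import json
import shutil
from pathlib import Path

def update_price_rule(config_path: Path, category: str, max_change_pct: float) -> None:
    """Adjust one category's price-change cap in a JSON rules file."""
    rules = json.loads(config_path.read_text())
    rules.setdefault("categories", {}).setdefault(category, {})["max_change_pct"] = max_change_pct
    config_path.write_text(json.dumps(rules, indent=2))

def stage_export(downloaded: Path, landing_dir: Path) -> Path:
    """Move a downloaded data file into an on-prem landing directory."""
    landing_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(downloaded), str(landing_dir / downloaded.name)))

if __name__ == "__main__":
    config = Path("pricing_rules.json")
    config.write_text(json.dumps({"categories": {}}, indent=2))  # seed an example config
    update_price_rule(config, "dairy", 5.0)
    print(config.read_text())
    # stage_export(Path("exports/dairy_prices.csv"), Path("/data/landing")) would
    # then move a real export into place.
```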
  • Data Engineer Visualization and Solutions

    Daimler Truck North America 4.5 company rating

    Data engineer job in Portland, OR

    Inside the Role The TT/S Process, Methods, Tools, and Operations team is seeking a specialist with hands-on experience in software development pipeline metrics, complex ETL workflows, and cross-functional data interpretation. This role requires someone who has previously worked with TT/S's internal data systems, understands existing metric definitions, and can independently maintain and enhance our current visualization ecosystem. This group is part of the Process, Methods, Tools, and Operations (PMTO) department within TT/S. Established in 2021, TT/S is a global software and electronics group that is responsible for all global SW & EE development to deliver world-class software and features. TT/S is a global organization with over 1,400 people across the US, Germany, and India. This position is located in Portland and reports to the local PMTO manager. Posting Information We provide a scheduled posting end date to assist our candidates with their application planning. While this date reflects our latest plans, it is subject to change, and postings may be extended or removed earlier than expected. We Take Care of Our Team Position offers a starting salary range of $71,000 - $91,000 USD Pay offered is dependent on knowledge, skills, and experience. Benefits include an annual bonus program; 401k company contribution with company match up to 6% as well as non-elective company contribution of 3 - 7% depending on age; starting at 4 weeks paid vacation; 13+ calendar holidays; 8 weeks paid parental leave; employee assistance program; comprehensive healthcare plans and wellness programs; onsite fitness (at some locations); tuition assistance and volunteer paid time off; short-term and long-term disability plans. What You Drive At DTNA 1. Software Development Metrics Expertise · Build and maintain data pipelines specifically for software development lifecycle (SDLC) metrics, including backlog flow, cycle time, throughput, defect metrics, and team performance indicators. · Apply knowledge of DTNA's current SDLC tooling, workflows, and metadata structures to ensure metric accuracy and consistency. (An illustrative cycle-time and throughput sketch in Python follows this listing.) 2. Advanced Visualization & Reporting · Design dashboard suites currently used within TT/S leadership, ensuring continuity with existing visual standards, business rules, and naming conventions. · Maintain and extend existing Tableau/Power BI dashboards that are already deployed to internal teams. 3. Tools & Technologies · Utilize SAP HANA with custom SQL tuned for large-scale metric calculations; Alteryx for pipeline automation; AQT for troubleshooting and validating existing data models; and Python scripting for metric calculation, anomaly detection, and reproducibility. · Maintain and optimize existing ETL workflows built for current DTNA pipeline metrics. 4. Data Integration & ETL Ownership · Independently manage ETL processes already in production, ensuring stability and accuracy. · Integrate with established APIs, internal databases, and SDLC tools (such as Jira or internal equivalents). 5. User Experience & Adoption · Work directly with business owners who rely on the current dashboards and pipelines, incorporating feedback that maintains continuity in design and workflow. · Ensure dashboards remain intuitive for existing stakeholders and align with their current methods of data consumption. 6. Collaboration & Stakeholder Alignment · Work closely with DTNA's global data owners and engineering leadership, leveraging established relationships and existing knowledge of team structures. 
· Translate metric definitions and data issues into actionable solutions with minimal onboarding. 7. Predictive Analytics & Trend Modeling · Apply domain knowledge of DTNA's historical metric patterns to create forecasting or anomaly-detection models. · Use prior experience with DTNA data to identify realistic trends and noise. Knowledge You Should Bring Bachelor's or Master's in Data Science, Computer Science, Information Systems, or a related field. 0-2 years of related experience Demonstrated prior experience working with DTNA data environments, SDLC metrics, or equivalent enterprise-scale engineering metrics programs. Proven record of building software development metric dashboards using Tableau or Power BI. Proficiency with SAP HANA SQL, Alteryx, and Python for ETL and metric calculations. Strong communication skills for interacting with engineering leadership and cross-functional teams. Experience maintaining existing dashboards, ETL flows, and metric definitions in a production environment. Exceptional Candidates Might Have Direct prior experience supporting TT/S or similar internal groups. Familiarity with established DTNA naming conventions, metric definitions, and internal data sources. Ability to work autonomously with minimal onboarding due to prior exposure to DTNA/DT/TTS systems. Experience collaborating with global data owners across DT Network within TT/S, locally and/or globally. Exposure to a Sr Sw Developer and Agile Coaching Methods Where We Work This position is open to applicants who can work in (or relocate to) the following location(s)- Portland, OR US. Relocation assistance is not available for this position. Schedule Type: Hybrid (4 days per week in-office / 1 day remote). This schedule builds our #OneTeamBestTeam culture, provides an unparalleled customer experience, and creates innovative solutions through in-person collaboration. At Daimler Truck North America, we recognize our world is changing faster than ever before. By listening to the needs of today, we're building to solve with cutting-edge solutions in sustainability and future driving technology across electric, hydrogen and autonomous. These solutions, backed by years of innovative success and achievement, continue DTNA's legacy as the undisputed industry leader. Our evolving brand portfolio is second to none, including Freightliner Trucks, Western Star, Demand Detroit, Thomas Built Buses, Freightliner Custom Chassis, and Financial Services. Together, we work as one team towards our envisioned future - building a cleaner, safer and more efficient tomorrow for all. That is what we are working toward - for all who keep the world moving. 
    Additional Information
    This position is not open for Visa sponsorship or to existing Visa holders.
    Applicants must be legally authorized to work permanently in the country the position is located in at the time of application.
    Final candidate must successfully complete a criminal background check.
    Final candidate may be required to successfully complete a pre-employment drug screen.
    Contractors, professional services, or other contingent workers should confirm with their local agency if they are eligible to apply for FTE positions.
    EEO - Disabled/Veterans
    Daimler Truck North America is committed to workforce inclusion and providing an environment where equal employment opportunities are available to all applicants and employees without regard to race, color, sex (including pregnancy), religion, national origin, age, marital status, family relationship, disability, sexual orientation, gender identity and expression (including transgender and transitioning status), genetic information, or veteran status. For an accommodation or special assistance with applying for a posted position, please contact our Human Resources department at ************ or toll free ************. For TTY/TDD enabled call ************ or toll free ************.
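    For illustration only: a minimal sketch of the kind of SDLC metric calculation this posting describes (cycle time per work item and weekly throughput), computed with pandas from an issue-tracker export. The file name and column names are hypothetical assumptions, not DTNA's actual schema or tooling.

```python
# Minimal sketch (hypothetical file and columns): cycle time and weekly
# throughput from an issue-tracker export such as a Jira CSV.
import pandas as pd

def sdlc_metrics(path):
    df = pd.read_csv(path, parse_dates=["started_at", "resolved_at"])
    done = df.dropna(subset=["resolved_at"]).copy()

    # Cycle time in days for each completed work item
    done["cycle_time_days"] = (
        done["resolved_at"] - done["started_at"]
    ).dt.total_seconds() / 86400

    # Weekly throughput: number of items resolved per calendar week
    throughput = (
        done.set_index("resolved_at")["issue_key"]
        .resample("W")
        .count()
        .rename("throughput")
    )
    return done[["issue_key", "cycle_time_days"]], throughput

if __name__ == "__main__":
    items, weekly = sdlc_metrics("issues_export.csv")  # hypothetical file
    print(items["cycle_time_days"].describe())
    print(weekly.tail())
```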
    $71k-91k yearly 4d ago
  • Senior Big Data Engineer

    Genoa Employment Solutions 4.8company rating

    Data engineer job in Beaverton, OR

    We're looking to expand our Big Data Engineering team to keep pace. As a Sr. Big Data Engineer, you will work with a variety of talented teammates and be a driving force in technical initiatives that will accelerate analytics. You will be working on projects that build data artifacts to answer questions about consumer behavior, commerce trends, consumer touchpoint preferences, and more!
    $97k-139k yearly est. 60d+ ago
  • Data Scientist

    Eyecarecenterofsalem

    Data engineer job in Portland, OR

    Job Description
    We are looking for a Data Scientist to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights. In this role, you should be highly analytical with a knack for analysis, math, and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research. Your goal will be to help our company analyze trends to make better decisions.
    Responsibilities
    Identify valuable data sources and automate collection processes
    Preprocess structured and unstructured data
    Analyze large amounts of information to discover trends and patterns
    Build predictive models and machine-learning algorithms
    Combine models through ensemble modeling (a minimal sketch follows this listing)
    Present information using data visualization techniques
    Propose solutions and strategies to business challenges
    Collaborate with engineering and product development teams
    Requirements and Skills
    Proven experience as a Data Scientist or Data Analyst
    Experience in data mining
    Understanding of machine learning and operations research
    Knowledge of R, SQL, and Python; familiarity with Scala, Java, or C++ is an asset
    Experience using business intelligence tools (e.g., Tableau) and data frameworks (e.g., Hadoop)
    Analytical mind and business acumen
    Strong math skills (e.g., statistics, algebra)
    Problem-solving aptitude
    Excellent communication and presentation skills
    BSc/BA in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred
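    As a purely illustrative companion to the "combine models through ensemble modeling" responsibility above, here is a minimal scikit-learn sketch on a synthetic dataset; the models and parameters are arbitrary choices for demonstration, not a prescription.

```python
# Minimal sketch: combine heterogeneous models with soft voting on a
# synthetic classification dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

# Soft voting averages the predicted class probabilities of the members
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=42)),
        ("gb", GradientBoostingClassifier(random_state=42)),
    ],
    voting="soft",
)

scores = cross_val_score(ensemble, X, y, cv=5, scoring="roc_auc")
print(f"Mean ROC AUC across folds: {scores.mean():.3f}")
```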
    $73k-104k yearly est. 4d ago
  • Data Engineer

    Panthalassa

    Data engineer job in Portland, OR

    About the Company
    We are a renewable energy and ocean technology company committed to rapidly developing and deploying technologies that will ensure a sustainable future for Earth by unlocking the vast energy potential of its oceans. Our focus is on capturing civilizational levels of ultra-low-cost renewable energy for applications including computing and affordable renewable fuels delivered to shore. The company is a public benefit corporation headquartered in Portland, Oregon, and backed by leading venture capitalists, philanthropic investors, university endowments, and private investment offices. We operate as an idea meritocracy in which the best ideas change the company's direction on a regular basis.
    About the Job
    We are developing a core technology that will operate in the most extreme marine environments for years at a time without human maintenance or intervention. We are seeking a Data Engineer with strong software development skills to join our team developing next-generation ocean energy systems. You will work at the intersection of data analysis, simulation, and engineering, supporting the development of clean energy technologies designed to operate in some of the world's most challenging marine environments.
    In this role, you'll help build and maintain data analysis and simulation pipelines, support R&D with tools to process and interpret engineering datasets, and contribute to internal software used by cross-functional teams. You'll work closely with senior engineers and simulation experts, gaining exposure to real-world physics problems, computational tools, and large-scale scientific workflows. This is an opportunity for an early-career engineer or developer who's excited to contribute to a high-impact mission, write clean and maintainable code, and grow alongside experienced technical mentors.
    Our staff have worked at organizations such as SpaceX, Blue Origin, Boeing, Tesla, Apple, Virgin Orbit, Google, Amazon, Microsoft, New Relic, Bridgewater, Raytheon, Disney Imagineering, and the US Army and Air Force, as well as research universities, startups, and small companies across a range of industries. We are organized as a public benefit corporation and are backed by leading venture capital firms, private investors, philanthropic investors, and endowments. We strive to be the best engineering team on the planet and we compensate our team members accordingly.
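    A minimal sketch of the kind of engineering-dataset post-processing described above, assuming a simulation or test run that writes a CSV of time versus force; the file name, column names, and units are hypothetical.

```python
# Minimal sketch (hypothetical file and columns): summarize and plot a
# force trace produced by a simulation or test run.
import pandas as pd
import matplotlib.pyplot as plt

def summarize_run(path):
    df = pd.read_csv(path)  # expected columns: "time_s", "force_N"
    stats = df["force_N"].agg(["mean", "std", "min", "max"])

    ax = df.plot(x="time_s", y="force_N", legend=False, title="Force vs. time")
    ax.set_xlabel("time [s]")
    ax.set_ylabel("force [N]")
    plt.savefig("force_trace.png", dpi=150)
    return stats

if __name__ == "__main__":
    print(summarize_run("run_001.csv"))  # hypothetical run output
```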
    Responsibilities
    Develop and maintain data analysis tools to support engineering design, simulation, and testing workflows
    Clean, process, and analyze large datasets from CFD simulations, experiments, and field deployments
    Collaborate with senior engineers to extract insights from simulation results and translate them into actionable design feedback
    Write modular, well-documented code in Python to automate repetitive or computational workflows
    Assist in the development of internal software used for simulation pipeline orchestration and post-processing
    Support post-processing of CFD results using tools such as OpenFOAM, Star-CCM+, and ParaView
    Work with HPC or cloud compute environments to run and manage large simulation or data processing jobs
    Contribute to the development of internal documentation, best practices, and reusable analysis scripts
    Participate in code reviews, collaborative debugging sessions, and weekly team check-ins to share findings and progress
    Continuously learn new tools, frameworks, and domain-specific knowledge to grow within a fast-paced R&D team
    Required Qualifications
    Legal authorization to work in the United States
    Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Physics, Applied Mathematics, or a related field
    Proficiency in Python and familiarity with key data analysis libraries (e.g., NumPy, pandas, matplotlib)
    Experience writing clean, well-structured code for scientific or engineering problems
    Familiarity with software development best practices including version control (e.g., Git) and modular code design
    Ability to interpret and work with structured datasets from simulations or experiments
    Strong analytical and problem-solving skills with attention to detail
    Excellent collaboration and communication skills within technical teams
    Self-motivated with a desire to learn and take ownership of tasks in a fast-paced environment
    Preferred Qualifications
    Experience with CFD tools such as OpenFOAM, Star-CCM+, or ParaView for post-processing
    Exposure to scientific computing workflows including simulation automation or batch processing
    Familiarity with HPC environments, Linux-based systems, and bash or Python scripting for automation
    Understanding of fluid dynamics, ocean engineering, or physics-based modeling
    Experience building or contributing to internal software tools for data analysis, simulation, or visualization
    Academic or internship experience involving simulation data pipelines or engineering R&D projects
    The above qualifications are desired, not required. We encourage you to apply if you are a strong candidate with only some of the desired skills and experience listed.
    Additional Requirements
    Occasional extended hours or weekend work to support key milestones.
    Strong preference for candidates based in Portland, OR. Exceptional remote candidates will be considered.
    Compensation and Benefits
    If hired for this full-time role, you will receive:
    Cash compensation of $110,000-$175,000.
    Equity in the company. We're all owners and if we're successful, this equity should be far and away the most valuable component of your compensation.
    A benefits package that helps you take care of yourself and your family, including:
    Flexible paid time off
    Health insurance (the company pays 100% of a gold-level PPO plan for full-time employees, their partners, and dependents)
    Dental insurance (the company pays 33% for full-time employees and 100% for their partners and dependents)
    Vision insurance (the company pays 100% for full-time employees, their partners, and dependents)
    Disability insurance (the company pays 100% for a policy to provide long-term financial support if you become disabled)
    Ability to contribute to tax-advantaged accounts, including 401(k), health FSA, and dependent care FSA
    Relocation assistance to facilitate your move to Portland (if needed).
    Location
    We have a strong preference for candidates based in Portland, OR, as this is an in-office role. Our offices, lab, and shop are located in Portland, Oregon.
    $110k-175k yearly 60d+ ago
  • Sr. Hadoop Developer

    Bridge Tech 4.2company rating

    Data engineer job in Beaverton, OR

    Job Description
    Typically requires a Bachelor's degree and a minimum of 5 years of directly relevant work experience.
    Client is embarking on a big data platform in Consumer Digital using a Hadoop Distributed File System cluster. As a Sr. Hadoop Developer you will work with a variety of talented client teammates and be a driving force for building solutions. You will be working on development projects related to commerce and web analytics.
    Responsibilities:
    Design and implement MapReduce jobs to support distributed processing using Java, Cascading, Python, Hive, and Pig; ability to design and implement end-to-end solutions (a minimal Hadoop Streaming sketch in Python follows this listing).
    Build libraries, user-defined functions, and frameworks around Hadoop
    Research, evaluate, and utilize new technologies/tools/frameworks around the Hadoop ecosystem
    Develop user-defined functions to provide custom Hive and Pig capabilities
    Define and build data acquisition and consumption strategies
    Define and develop best practices
    Work with support teams in resolving operational and performance issues
    Work with architecture/engineering leads and other teams on capacity planning
    Qualifications:
    MS/BS degree in a computer science field or related discipline
    6+ years' experience in large-scale software development
    1+ year experience in Hadoop
    Strong Java programming, shell scripting, Python, and SQL
    Strong development skills around Hadoop, MapReduce, Hive, Pig, and Impala
    Strong understanding of Hadoop internals
    Good understanding of Avro, JSON, and other compression formats
    Experience with build tools such as Maven
    Experience with databases like Oracle
    Experience with performance/scalability tuning, algorithms, and computational complexity
    Experience (at least familiarity) with data warehousing, dimensional modeling, and ETL development
    Ability to understand ERDs and relational database schemas
    Proven ability to work with cross-functional teams to deliver appropriate resolution
    Nice to have:
    Experience with open source NoSQL technologies such as HBase and Cassandra
    Experience with messaging and complex event processing systems such as Kafka and Storm
    Machine learning frameworks
    Statistical analysis with Python, R, or similar
    Additional Information
    All your information will be kept confidential according to EEO guidelines.
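    A minimal Hadoop Streaming sketch in Python of the MapReduce work described above: counting page views per product from tab-separated web-analytics logs. The field layout, file layout, and invocation are assumptions for illustration only.

```python
# Minimal sketch: a Hadoop Streaming mapper/reducer pair in Python that
# counts page views per product ID. Field positions are assumed.
# A typical invocation passes this file twice, e.g.:
#   -mapper "python job.py map" -reducer "python job.py reduce"
import sys

def mapper():
    # Emit "product_id<TAB>1" for each log line
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")  # assumed position of the product ID

def reducer():
    # Hadoop sorts mapper output by key, so one pass can aggregate counts
    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if current_key is not None and key != current_key:
            print(f"{current_key}\t{count}")
            count = 0
        current_key = key
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```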
    $90k-118k yearly est. 60d+ ago
  • Sr Data Engineer, PySpark

    The Hertz Corporation 4.3company rating

    Data engineer job in Salem, OR

    A Day in the Life:
    The Senior Data Engineer, PySpark will be responsible for building and maintaining data pipelines and workflows that support ML, BI, analytics, and software products. This individual will work closely with data scientists, data engineers, analysts, software developers, and SMEs within the business to deliver new and exciting products and services. The main objectives are to develop data pipelines and fully automated workflows to drive operational efficiency and effectiveness by enabling data-driven decisions across the organization. This includes fostering collaboration, building partnerships, co-developing products, sharing knowledge, and providing insights and valuable predictive information to business teams and leaders to highlight potential risks and opportunities that initiate the drive for change. We expect the starting salary to be around $135k but will be commensurate with experience.
    What You'll Do:
    TECHNICAL LEADERSHIP
    Development of high-quality code for the core data stack including the data integration hub, data warehouse, and data pipelines.
    Build data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms (a minimal batch sketch in PySpark follows this listing)
    Empower data scientists and data analysts to be as self-sufficient as possible by building core systems and developing reusable library code
    Support and optimize data tools and associated cloud environments for consumption by downstream systems, data analysts, and data scientists
    Ensure code, configuration, and other technology artifacts are delivered within agreed time schedules and any potential delays are escalated in advance
    Collaborate across developers as part of a Scrum or Kanban team, ensuring collective team productivity
    Participate in peer reviews and QA processes to drive higher quality
    Ensure that 100% of code is well documented and maintained in the source code repository.
    INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
    Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
    TEAMWORK & COMMUNICATION
    Proactively educate others on basic data management concepts such as data governance, master data management, data warehousing, big data, reporting, data quality, and database performance.
    Superior and demonstrated team building and development skills to harness powerful teams
    Ability to communicate effectively with different levels of leadership within the organization
    Provide timely updates so that progress against each individual incident can be updated as required
    Write and review high-quality technical documentation
    CONTROL & AUDIT
    Ensures their workstation and all processes and procedures follow organization standards
    CONTINUOUS IMPROVEMENT
    Encourages and maintains a 'best practice sharing' culture, always striving to find ways to improve service and change mindset.
    What We're Looking For:
    5+ years professional experience as a data engineer, software engineer, data analyst, data scientist, or related role
    Hands-on experience with Databricks or Palantir strongly preferred
    Experience with relational and dimensional database modeling (Relational, Kimball, or Data Vault)
    Proven experience with all aspects of the data pipeline (data sourcing, transformations, data quality, etc.)
    Bachelor's or Master's in Computer Science, Information Systems, or an engineering field, or equivalent work experience
    Travel, transportation, or hospitality experience preferred
    Experience designing application data models for mobile or web applications preferred
    Excellent written and verbal communication skills.
    Flexibility in scheduling, which may include nights, weekends, and holidays, preferred
    Experience with event-driven architectures and data streaming pub/sub technologies such as IBM MQ, Kafka, or Amazon Kinesis preferred
    Strong capabilities in a scripting language such as Python, R, Scala, etc.
    Strong capabilities in SQL and experience with stored procedures
    Strong interpersonal and communication skills with Agile/Scrum experience.
    Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions.
    Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
    What You'll Get:
    Up to 40% off the base rate of any standard Hertz rental
    Paid Time Off
    Medical, Dental & Vision plan options
    Retirement programs, including 401(k) employer matching
    Paid Parental Leave & Adoption Assistance
    Employee Assistance Program for employees & family
    Educational Reimbursement & Discounts
    Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
    Perks & Discounts - Theme Park Tickets, Gym Discounts & more
    The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia, and New Zealand. The Hertz Corporation is one of the largest worldwide airport general-use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
    US EEO STATEMENT
    At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills, and talents that our employees invest in their work every day represents a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
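    A minimal batch-pipeline sketch in PySpark along the lines this listing describes: read raw rental events, derive a daily aggregate, and write a partitioned table. The paths, column names, and table layout are hypothetical, not the company's actual stack.

```python
# Minimal PySpark sketch (hypothetical paths and columns): aggregate raw
# rental events into a daily summary per location.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_rental_aggregates").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/rental_events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "location_id")
    .agg(
        F.count("*").alias("rentals"),
        F.sum("revenue_usd").alias("revenue_usd"),
    )
)

# Write the aggregate partitioned by date for downstream BI/ML consumers
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_rentals/"
)

spark.stop()
```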
    $135k yearly 15d ago
  • Data Engineer

    Precinmac 3.6company rating

    Data engineer job in Tualatin, OR

    Precinmac owns a family of precision machining companies in the US and Canada. This role's home location is the Shields MFG location in Tualatin, Oregon, an industry-leading, value-add, climate-controlled production facility specializing in CNC machining and complex mechanical/optical/laser assembly, including clean-room environments. The Data Engineer will play a critical role in supporting all areas of the company by enabling reliable, scalable, and secure information systems. Our businesses deliver specialized manufacturing expertise for OEMs with low-volume/high-mix needs, while also driving higher-volume opportunities through our expanding cell system capabilities. As an IT-driven organization, we rely on robust data and management information systems to ensure efficiency, transparency, and informed decision-making across the enterprise.
    We offer:
    A highly competitive total compensation package
    Medical (3 medical plans to choose from)
    Dental
    Vision
    Life (company-paid, with options for additional supplemental coverage)
    Disability insurance (company-paid short-term and long-term disability)
    401(k) with company match
    A generous paid time off schedule
    Discretionary quarterly bonus program
    Data Engineer
    We are looking for a highly motivated Data Engineer to join our growing Data Governance & Analytics team. In this role, you will work closely with senior engineers, architects, and business stakeholders to design and deliver scalable data solutions that power critical business insights and innovation. If you are passionate about building robust data pipelines, ensuring data quality, and leveraging cutting-edge cloud technologies, this is the role for you.
    Key Responsibilities:
    Partner with the Senior Data Engineer to design, build, and maintain scalable ETL pipelines and dataflows that adhere to enterprise governance and quality standards.
    Implement data modeling, normalization, and metadata management practices to ensure consistency and usability across data platforms.
    Leverage Azure Data Factory (ADF), Databricks, and Apache Spark to process and transform large volumes of structured and unstructured data.
    Integrate data from diverse sources using RESTful APIs and other ingestion methods (a minimal ingestion sketch follows this listing).
    Apply advanced SQL expertise for querying, performance tuning, and ensuring data integrity (both T-SQL and PL/SQL).
    Collaborate with business teams and data governance groups to enforce data quality, lineage, and compliance standards.
    Contribute to the Agile development lifecycle, participating in sprint planning, stand-ups, and retrospectives.
    Partner with data architects, analysts, and business leaders to design and deliver solutions aligned with organizational goals.
    Provide technical expertise in Python and other scripting languages to automate data workflows.
    Promote best practices in data governance, security, and stewardship across the enterprise.
    Required Skills & Experience:
    Proven experience in data engineering with exposure to data governance frameworks (preferably GCCHI).
    Strong proficiency with Azure Data Factory, Azure Databricks, Apache Spark, and Python.
    Solid expertise in SQL (query optimization, performance tuning, complex joins, stored procedures) across T-SQL and PL/SQL.
    Hands-on experience with ETL pipelines, dataflows, normalization, and data modeling.
    Familiarity with RESTful API integration for data ingestion.
    Experience contributing to Agile teams and sprint-based deliverables.
    Strong understanding of data structures, metadata management, and governance best practices.
    Practical experience automating workflows with Python scripting.
    Preferred Skills:
    Experience with data cataloging, data lineage, and master data management (MDM) tools.
    Knowledge of Azure Synapse Analytics, Power BI, or other BI/visualization platforms.
    Familiarity with CI/CD practices for data pipelines.
    Exposure to data privacy regulations (CMMC, NIST 800).
    Why Join Us?
    Work on impactful projects that enable smarter business decisions.
    Gain hands-on experience with advanced Azure technologies and modern data tools.
    Be part of a collaborative, agile team where innovation and continuous improvement are valued.
    Grow your career in a forward-looking, data-driven organization.
    Work Setting:
    General office setting with typical moderate noise levels in a temperature-controlled environment. Operates office equipment (computer, fax, copier, phone) as required to perform essential job functions.
    Precinmac is an equal opportunity, affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
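    A minimal sketch of the RESTful ingestion pattern named in the responsibilities above: pull paginated JSON with requests, then land it with PySpark. The endpoint, pagination parameters, field names, and output path are hypothetical.

```python
# Minimal sketch (hypothetical endpoint and paths): ingest paginated JSON
# from a REST API and land it as Parquet with PySpark.
import requests
from pyspark.sql import SparkSession

def fetch_all(base_url, page_size=500):
    """Yield records from a paginated JSON API (assumed page/size params)."""
    page = 1
    while True:
        resp = requests.get(
            base_url, params={"page": page, "size": page_size}, timeout=30
        )
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break
        yield from items
        page += 1

spark = SparkSession.builder.appName("rest_ingest").getOrCreate()
records = list(fetch_all("https://example.invalid/api/work_orders"))
if records:
    df = spark.createDataFrame(records)
    df.write.mode("append").parquet("/mnt/lake/raw/work_orders/")
spark.stop()
```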
    $100k-140k yearly est. 60d+ ago

Learn more about data engineer jobs


What are the top employers for data engineers in OR?

Top 10 Data Engineer companies in OR

  1. Genoa

  2. Ernst & Young

  3. CVS Health

  4. Meta

  5. Nike

  6. Veeva Systems

  7. The Hertz Corporation

  8. Oracle

  9. Cambia Health Solutions

  10. Capital One
