
Data engineer jobs in Portland, OR

1,367 jobs
  • Applied Data Scientists

    Mercor

    Data engineer job in Tigard, OR

    **1. Role Overview**
    Mercor is seeking applied data science professionals to support a strategic analytics initiative with a global enterprise. This contract-based opportunity focuses on extracting insights, building statistical models, and informing business decisions through advanced data science techniques. Freelancers will translate complex datasets into actionable outcomes using tools like Python, SQL, and visualization platforms. This short-term engagement emphasizes experimentation, modeling, and stakeholder communication - distinct from production ML engineering.
    **2. Key Responsibilities**
    ● Translate business questions into data science problems and analytical workflows
    ● Conduct data wrangling, exploratory analysis, and hypothesis testing
    ● Develop statistical models and predictive tools for decision support
    ● Create compelling data visualizations and dashboards for business users
    ● Present findings and recommendations to non-technical stakeholders
    **3. Ideal Qualifications**
    ● 5+ years of applied data science or analytics experience in business settings
    ● Proficiency in Python or R (pandas, NumPy, Jupyter) and strong SQL skills
    ● Experience with data visualization tools (e.g., Tableau, Power BI)
    ● Solid understanding of statistical modeling, experimentation, and A/B testing
    ● Strong communication skills for translating technical work into strategic insights
    **4. More About the Opportunity**
    ● Remote
    ● Expected commitment: min 30 hours/week
    ● Project duration: ~6 weeks
    **5. Compensation & Contract Terms**
    ● $75-100/hour
    ● Paid weekly via Stripe Connect
    ● You'll be classified as an independent contractor
    **6. Application Process**
    ● Submit your resume, followed by a domain expertise interview and short form
    **7. About Mercor**
    ● Mercor is a talent marketplace that connects top experts with leading AI labs and research organizations
    ● Our investors include Benchmark, General Catalyst, Adam D'Angelo, Larry Summers, and Jack Dorsey
    ● Thousands of professionals across domains like law, creatives, engineering, and research have joined Mercor to work on frontier projects shaping the next era of AI
    $75-100 hourly 17d ago
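    For context on the kind of hypothesis-testing work this listing describes, here is a minimal sketch of an A/B-style analysis in Python. The data, variant names, and conversion rate are synthetic placeholders, not anything from Mercor or its client.

    ```python
    # Illustrative only: a simple two-sample test of conversion rates, the kind of
    # hypothesis test used in A/B analysis. Data and thresholds are synthetic.
    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical experiment log: one row per user, with variant and conversion flag.
    rng = np.random.default_rng(42)
    df = pd.DataFrame({
        "variant": rng.choice(["control", "treatment"], size=10_000),
        "converted": (rng.random(10_000) < 0.11).astype(float),
    })

    print(df.groupby("variant")["converted"].agg(["count", "mean"]))

    # Welch's t-test on per-user conversions approximates a two-proportion test at this n.
    control = df.loc[df["variant"] == "control", "converted"]
    treatment = df.loc[df["variant"] == "treatment", "converted"]
    result = stats.ttest_ind(treatment, control, equal_var=False)
    print(f"p-value = {result.pvalue:.3f}")  # below 0.05 would suggest a real difference
    ```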
  • Software Engineer Qualtrics

    Mainz Brady Group

    Data engineer job in Beaverton, OR

    HYBRID ONSITE IN BEAVERTON, OR! MUST HAVE QUALTRICS EXPERIENCE. We're seeking a skilled and experienced Software Engineer who specializes in Qualtrics. This role will be part of a high-visibility, high-impact initiative to optimize and expand our Qualtrics environment. You'll play a key role in designing, developing, and maintaining scalable solutions that enhance user experience, streamline data collection, and improve reporting accuracy. The ideal candidate has a strong background in Qualtrics architecture, API integrations, and automation, plus a passion for creating efficient, user-friendly tools that empower teams to make data-driven decisions.
    What we're looking for:
    3+ years of hands-on Qualtrics engineering or development experience
    Strong understanding of survey logic, workflows, APIs, and automation
    Experience with data visualization and analytics tools (Tableau, Power BI, etc.)
    Background in software engineering (JavaScript, Python, or similar)
    Ability to partner cross-functionally with researchers, analysts, and product teams
    $77k-108k yearly est. 4d ago
  • Application Support Engineer

    Cvent 4.3 company rating

    Data engineer job in Portland, OR

    Pacific Time zone working hours (9am - 6pm PT)
    Our Culture and Impact
    Cvent is a leading meetings, events, and hospitality technology provider with more than 5,000 employees and 24,000+ customers worldwide, including 60% of the Fortune 500. Founded in 1999, Cvent delivers a comprehensive event marketing and management platform for marketers and event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. Our technology brings millions of people together at events around the world. In short, we're transforming the meetings and events industry through innovative technology that powers the human connection. Cvent's strength lies in its people, fostering a culture where everyone is encouraged to think like entrepreneurs, taking risks and making decisions confidently. We value diverse perspectives and celebrate differences, working together with colleagues and clients to build strong connections.
    AI at Cvent: Leading the Future
    Are you ready to shape the future of work at the intersection of human expertise and AI innovation? At Cvent, we're committed to continuous learning and adaptation - AI isn't just a tool for us, it's part of our DNA. We're looking for candidates who are eager to evolve alongside technology. If you love to experiment boldly, share your discoveries, and help define best practices for AI-augmented work, you'll thrive here. Our team values professionals who thoughtfully integrate AI into their daily work, delivering exceptional results while relying on the human judgment and creativity that drive real innovation. Throughout our interview process, you'll have the chance to demonstrate how you use AI to learn, iterate, and amplify your impact. If you're excited to be part of a team that's leading the way in AI-powered collaboration, we'd love to meet you.
    Do you have a passion for technology? Do you enjoy solving real-world problems? Do you have a helpful and inquisitive mindset? If you answered yes to all three questions, then keep reading! This entry-level Application Support Engineer opportunity is a jack of all technology trades. This role is technical in nature and provides exposure to all major aspects of cloud-based software (debugging/coding, test, networking, database/infrastructure). Our end goal is to ensure our customers have the best possible experience with our products from a technical perspective. This entails doing everything we can to make sure that our products are bug free and have top-notch performance. As far as important candidate qualities go, strong communication and the ability to work with multiple teams is a must. We care more about your attitude and aptitude than the tools and technologies you have used in the past.
    In This Role, You Will:
    Provide top-tier software support for Cvent's product offerings to our customer service team and clients. This role is not customer facing.
    Assist operations and development teams with debugging software issues.
    Query databases to generate and analyze data for reporting and troubleshooting purposes.
    Work with our sales engineering team to ensure the successful operation of our partner and client integrations.
    Work with multiple teams to find, analyze, and resolve client issues.
    Troubleshoot and maintain frontend and backend systems.
    Monitor, document and report system and performance issues.
    Facilitate communication between technology teams and other departments on issue status and resolution.
    Supply in-depth technical and business product knowledge to multiple teams.
    Weekend on-call support on a rotational basis is required for this position.
    Here's What You Need:
    Do not worry if you do not know all the specific technologies listed below! Our training program lasts 3-4 weeks and will help bring you up to speed on both our products and the technologies they use.
    BS in Computer Science, Information Systems or equivalent major with strong academic performance
    Excellent problem solving and analytical skills.
    Understanding of relational databases and how to query data using SQL.
    Working knowledge of HTML/CSS.
    Understanding of the Software Development Life Cycle.
    Solid knowledge of at least one object-oriented programming language.
    Outstanding oral and written communication skills.
    Ability to convey technical information to a general audience.
    Aptitude for learning new technologies.
    Zealous attention to detail.
    Wondering what other technologies and tools we use? See below. Any experience with these is a plus!
    Monitoring tools: NewRelic, Splunk, Datadog
    Hosting: Amazon Web Services
    Programming: Java, C#, .Net, Node.js
    Open source and NoSQL database technologies: Couchbase, Elasticsearch, RabbitMQ
    APIs: SOAP or REST based
    Build & deploy technologies: Docker and Jenkins
    Version control: Git
    The estimated base salary range for new hires into this role is $85k-$120k+ annually + bonus, depending on factors such as job-related knowledge, relevant experience, and location. We also offer a competitive benefits package, details of which can be found here. We are not able to offer sponsorship for this position.
    $85k-120k yearly 3d ago
  • Data Scientist (Technical Leadership)

    Meta 4.8 company rating

    Data engineer job in Salem, OR

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
    **Required Skills:** Data Scientist (Technical Leadership) Responsibilities:
    1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
    2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
    3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
    4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
    5. Build long-term vision and strategy for programs and products
    6. Collaborate with executives to define and develop data platforms and instrumentation
    7. Effectively communicate insights and recommendations to stakeholders
    8. Define success metrics, forecast changes, and set team goals
    9. Support developing roadmaps and coordinate analytics efforts across teams
    **Minimum Qualifications:**
    10. Bachelor's degree in a quantitative field (e.g. Mathematics, Statistics, Operations Research) and 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
    11. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
    12. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
    13. Experience communicating complex technical topics in a clear, precise, and actionable manner
    **Preferred Qualifications:**
    14. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
    15. Master's or Ph.D. degree in a quantitative field
    16. Bachelor's degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
    17. 10+ years of experience doing complex quantitative analysis in product analytics
    **Public Compensation:** $206,000/year to $281,000/year + bonus + equity + benefits
    **Industry:** Internet
    **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.
    Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
    $206k-281k yearly 60d+ ago
  • Data Scientist, NLP & Language Models

    Datavant

    Data engineer job in Salem, OR

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
    By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
    Datavant is looking for an enthusiastic and meticulous Data Scientist to join our growing team, which builds machine learning models for use across Datavant in multiple verticals and for multiple customer types. As part of the Data Science team, you will play a crucial role in developing new product features and automating existing internal processes to drive innovation across Datavant. You will work with tens of millions of patients' worth of healthcare data to develop models, contributing to the entirety of the model development lifecycle from ideation and research to deployment and monitoring. You will collaborate with an experienced team of Data Scientists and Machine Learning Engineers along with application Engineers and Product Managers across the company to achieve Datavant's AI-enabled future.
    **You Will:**
    + Play a key role in the success of our products by developing models for NLP (and other) tasks.
    + Perform error analysis, data cleaning, and other related tasks to improve models.
    + Collaborate with your team by making recommendations for the development roadmap of a capability.
    + Work with other data scientists and engineers to optimize machine learning models and insert them into end-to-end pipelines.
    + Understand product use-cases and define key performance metrics for models according to business requirements.
    + Set up systems for long-term improvement of models and data quality (e.g. active learning, continuous learning systems, etc.).
    **What You Will Bring to the Table:**
    + Advanced degree in computer science, data science, statistics, or a related field, or equivalent work experience.
    + 4+ years of experience with data science and machine learning in an industry setting.
    + 4+ years of experience with Python.
    + Experience designing and building NLP models for tasks such as classification, named-entity recognition, and dependency parsing.
    + Proficiency with standard data analysis toolkits such as SQL, NumPy, Pandas, etc.
    + Proficiency with deep learning frameworks like PyTorch (preferred) or TensorFlow.
    + Demonstrated ability to drive results in a team environment and contribute to team decision-making in the face of ambiguity.
    + Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines.
    + Initiative and ability to independently explore and research novel topics and concepts as they arise.
    #LI-BC1
    We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive.
    We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
    At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $136,000-$170,000 USD
    To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship.
    Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
    At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
    Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request; you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
    For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
    $136k-170k yearly 27d ago
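    As a rough illustration of the NLP classification work this listing describes, here is a minimal PyTorch text-classifier sketch. The model shape, vocabulary size, and labels are hypothetical placeholders, not Datavant's actual models or data.

    ```python
    # Illustrative only: a minimal bag-of-words text classifier in PyTorch.
    import torch
    import torch.nn as nn

    class BagOfWordsClassifier(nn.Module):
        """Averages token embeddings and applies a linear layer over classes."""
        def __init__(self, vocab_size: int, embed_dim: int, num_classes: int):
            super().__init__()
            self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
            self.fc = nn.Linear(embed_dim, num_classes)

        def forward(self, token_ids: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
            return self.fc(self.embedding(token_ids, offsets))

    # Tiny smoke test with fake token ids: two "documents" packed into one flat tensor.
    model = BagOfWordsClassifier(vocab_size=10_000, embed_dim=64, num_classes=3)
    tokens = torch.tensor([11, 42, 7, 99, 5, 6])   # doc 1 = first 4 ids, doc 2 = last 2
    offsets = torch.tensor([0, 4])                 # start index of each document
    logits = model(tokens, offsets)                # shape: (2, 3)

    loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 2]))
    loss.backward()                                # a real training loop would follow
    print(logits.shape, float(loss))
    ```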
  • Data Scientist/Architect

    Genoa Employment Solutions 4.8 company rating

    Data engineer job in Beaverton, OR

    Designs, develops and programs methods, processes, and systems to consolidate and analyze structured/unstructured, diverse "big data" sources to generate actionable insights and solutions for client services and product enhancement. Builds "products" for analysis. Interacts with product and service teams to identify questions and issues for data analysis and experiments. Develops and codes software programs, algorithms and automated processes to cleanse, integrate and evaluate large datasets from multiple disparate sources. Identifies meaningful insights from large data and metadata sources; interprets and communicates insights and findings from analysis and experiments to product, service, and business managers.
    Lead the accomplishment of key goals across consumer and commercial analytics functions.
    Work with key stakeholders to understand requirements, develop sustainable data solutions, and provide insights and recommendations.
    Document and communicate systems and analytics changes to the business, translating complex functionality into business-relevant language.
    Validate key performance indicators and build queries to quantitatively measure business performance.
    Communicate with cross-functional teams to understand the business cause of data anomalies and outliers.
    Develop data governance standards from data ingestion to product dictionaries and documentation.
    Develop SQL queries and data visualizations to fulfill ad-hoc analysis requests and ongoing reporting needs leveraging standard query syntax.
    Organize and transform information into comprehensible structures.
    Use data to predict trends and perform statistical analysis.
    Use data mining to extract information from data sets and identify correlations and patterns.
    Monitor data quality and remove corrupt data.
    Evaluate and utilize new technologies, tools, and frameworks centered around high-volume data processing.
    Improve existing processes through automation and efficient workflows.
    Build and deliver scalable data and analytics solutions.
    Work independently and take initiative to identify, explore and solve problems.
    Design and build innovative data and analytics solutions to support key decisions.
    Support standard methodologies in reporting and analysis, such as data integrity, unit testing, data quality control, system integration testing, modeling, validation, and documentation.
    Independently support end-to-end analysis to advise product strategy, data architecture and reporting decisions.
    $85k-120k yearly est. 60d+ ago
  • Data Scientist

    Eyecarecenterofsalem

    Data engineer job in Portland, OR

    Job Description
    We are looking for a Data Scientist to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights. In this role, you should be highly analytical with a knack for analysis, math, and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research. Your goal will be to help our company analyze trends to make better decisions.
    Responsibilities
    Identify valuable data sources and automate collection processes
    Undertake preprocessing of structured and unstructured data
    Analyze large amounts of information to discover trends and patterns
    Build predictive models and machine-learning algorithms
    Combine models through ensemble modeling
    Present information using data visualization techniques
    Propose solutions and strategies to business challenges
    Collaborate with engineering and product development teams
    Requirements and skills
    Proven experience as a Data Scientist or Data Analyst
    Experience in data mining
    Understanding of machine learning and operations research
    Knowledge of R, SQL, and Python; familiarity with Scala, Java, or C++ is an asset
    Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
    Analytical mind and business acumen
    Strong math skills (e.g. statistics, algebra)
    Problem-solving aptitude
    Excellent communication and presentation skills
    BSc/BA in Computer Science, Engineering, or relevant field; a graduate degree in Data Science or other quantitative field is preferred
    $73k-104k yearly est. 20d ago
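    As a loose illustration of the "combine models through ensemble modeling" responsibility above, here is a small scikit-learn sketch on synthetic data; the features, labels, and model choices are hypothetical.

    ```python
    # Illustrative only: a soft-voting ensemble of two different classifiers.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for real business data.
    X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # Soft voting averages the predicted probabilities of the member models.
    ensemble = VotingClassifier(
        estimators=[
            ("logreg", LogisticRegression(max_iter=1000)),
            ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
        ],
        voting="soft",
    )
    ensemble.fit(X_train, y_train)
    print("holdout accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
    ```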
  • Data Engineer

    Panthalassa

    Data engineer job in Portland, OR

    About the Company
    We are a renewable energy and ocean technology company committed to rapidly developing and deploying technologies that will ensure a sustainable future for Earth by unlocking the vast energy potential of its oceans. Our focus is on capturing civilizational levels of ultra-low-cost renewable energy for applications including computing and affordable renewable fuels delivered to shore. The company is a public benefit corporation headquartered in Portland, Oregon, and backed by leading venture capitalists, philanthropic investors, university endowments, and private investment offices. We operate as an idea meritocracy in which the best ideas change the company's direction on a regular basis.
    About the Job
    We are developing a core technology that will operate in the most extreme marine environments for years at a time without human maintenance or intervention. We are seeking a Data Engineer with strong software development skills to join our team developing next-generation ocean energy systems. You will work at the intersection of data analysis, simulation, and engineering - supporting the development of clean energy technologies designed to operate in some of the world's most challenging marine environments. In this role, you'll help build and maintain data analysis and simulation pipelines, support R&D with tools to process and interpret engineering datasets, and contribute to internal software used by cross-functional teams. You'll work closely with senior engineers and simulation experts, gaining exposure to real-world physics problems, computational tools, and large-scale scientific workflows. This is an opportunity for an early-career engineer or developer who's excited to contribute to a high-impact mission, write clean and maintainable code, and grow alongside experienced technical mentors.
    Our staff have worked at organizations such as SpaceX, Blue Origin, Boeing, Tesla, Apple, Virgin Orbit, Google, Amazon, Microsoft, New Relic, Bridgewater, Raytheon, Disney Imagineering, and the US Army and Air Force, as well as research universities, startups, and small companies across a range of industries. We are organized as a public benefit corporation and are backed by leading venture capital firms, private investors, philanthropic investors, and endowments. We strive to be the best engineering team on the planet and we compensate our team members accordingly.
    Responsibilities
    Develop and maintain data analysis tools to support engineering design, simulation, and testing workflows
    Clean, process, and analyze large datasets from CFD simulations, experiments, and field deployments
    Collaborate with senior engineers to extract insights from simulation results and translate them into actionable design feedback
    Write modular, well-documented code in Python to automate repetitive or computational workflows
    Assist in the development of internal software used for simulation pipeline orchestration and post-processing
    Support post-processing of CFD results using tools such as OpenFOAM, Star-CCM+, and ParaView
    Work with HPC or cloud compute environments to run and manage large simulation or data processing jobs
    Contribute to the development of internal documentation, best practices, and reusable analysis scripts
    Participate in code reviews, collaborative debugging sessions, and weekly team check-ins to share findings and progress
    Continuously learn new tools, frameworks, and domain-specific knowledge to grow within a fast-paced R&D team
    Required Qualifications
    Legal authorization to work in the United States
    Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Physics, Applied Mathematics, or a related field
    Proficiency in Python and familiarity with key data analysis libraries (e.g., NumPy, pandas, matplotlib)
    Experience writing clean, well-structured code for scientific or engineering problems
    Familiarity with software development best practices including version control (e.g., Git) and modular code design
    Ability to interpret and work with structured datasets from simulations or experiments
    Strong analytical and problem-solving skills with attention to detail
    Excellent collaboration and communication skills within technical teams
    Self-motivated with a desire to learn and take ownership of tasks in a fast-paced environment
    Preferred Qualifications
    Experience with CFD tools such as OpenFOAM, Star-CCM+, or ParaView for post-processing
    Exposure to scientific computing workflows including simulation automation or batch processing
    Familiarity with HPC environments, Linux-based systems, and bash or Python scripting for automation
    Understanding of fluid dynamics, ocean engineering, or physics-based modeling
    Experience building or contributing to internal software tools for data analysis, simulation, or visualization
    Academic or internship experience involving simulation data pipelines or engineering R&D projects
    The above qualifications are desired, not required. We encourage you to apply if you are a strong candidate with only some of the desired skills and experience listed.
    Additional Requirements
    Occasional extended hours or weekend work to support key milestones.
    Strong preference for candidates based in Portland, OR. Exceptional remote candidates will be considered.
    Compensation and Benefits
    If hired for this full-time role, you will receive:
    Cash compensation of $110,000-$175,000.
    Equity in the company. We're all owners and if we're successful, this equity should be far and away the most valuable component of your compensation.
    A benefits package that helps you take care of yourself and your family, including:
    Flexible paid time off
    Health insurance (the company pays 100% of a gold level PPO plan for full time employees, their partners, and dependents)
    Dental insurance (the company pays 33% for full time employees and 100% for their partners and dependents)
    Vision insurance (the company pays 100% for full time employees, their partners, and dependents)
    Disability insurance (the company pays 100% for a policy to provide long term financial support if you become disabled)
    Ability to contribute to tax-advantaged accounts, including 401(k), health FSA, and dependent care FSA
    Relocation assistance to facilitate your move to Portland (if needed).
    Location
    We have a strong preference for candidates based in Portland, OR as this is an in-office role. Our offices, lab, and shop are located in Portland, Oregon.
    $110k-175k yearly Auto-Apply 60d+ ago
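    A minimal sketch of the kind of post-processing this listing describes for simulation or field data, using pandas and matplotlib. The file name, column names, sampling interval, and rated power are hypothetical, not Panthalassa's actual datasets.

    ```python
    # Illustrative only: clean a hypothetical deployment log, resample it, and
    # compute a simple summary metric for design feedback.
    import matplotlib.pyplot as plt
    import pandas as pd

    # Hypothetical field-deployment log: timestamped power and wave-height samples.
    df = pd.read_csv("deployment_log.csv", parse_dates=["timestamp"])
    df = df.set_index("timestamp").sort_index()

    # Basic cleaning: drop obviously corrupt rows, then resample to 10-minute means.
    df = df[(df["power_kw"] >= 0) & (df["wave_height_m"].between(0, 30))]
    resampled = df[["power_kw", "wave_height_m"]].resample("10min").mean()

    # Simple summary: capacity factor against a hypothetical rated power.
    RATED_POWER_KW = 100.0
    capacity_factor = resampled["power_kw"].mean() / RATED_POWER_KW
    print(f"capacity factor ~ {capacity_factor:.2%}")

    resampled.plot(subplots=True, figsize=(10, 6))
    plt.tight_layout()
    plt.savefig("deployment_summary.png")
    ```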
  • Need Sr Big Data Engineer at Beaverton, OR Only W2

    USM 4.2 company rating

    Data engineer job in Beaverton, OR

    Hi, we have an immediate opportunity with our direct client; send your resume ASAP if you are interested. Thank you.
    Sr Big Data Engineer
    Duration: Long Term
    Typical Office: This is a typical office job, with no special physical requirements or unusual work environment.
    Core responsibilities: Expert data engineers will work with product teams across the client to help automate and integrate a variety of data domains with a wide variety of data profiles (different scale, cadence, and volatility) into the client's next-gen data and analytics platform. This is an opportunity to work across multiple subject areas and source platforms to ingest, organize, and prepare data through cloud-native processes.
    Required skills/experience:
    - 5+ years of professional development experience in either Python (preferred) or Scala/Java; familiarity with both is ideal
    - 5+ years of data-centric development with a focus on efficient data access and manipulation at multiple scales
    - 3+ years of experience with the HDFS ecosystem of tools (any distro, Spark experience prioritized)
    - 3+ years of significant experience developing within the broader AWS ecosystem of platforms and services
    - 3+ years of experience optimizing data access and analysis in non-HDFS data platforms (traditional RDBMSs, NoSQL / KV stores, etc.)
    - Direct task development and/or configuration experience with a remote workflow orchestration tool - Airflow (preferred), Amazon Data Pipeline, Luigi, Oozie, etc.
    - Intelligence, strong problem-solving ability, and the ability to effectively communicate to partners with a broad spectrum of experiential backgrounds
    Several of the following skills are also desired:
    - A demonstrably strong understanding of security and credential management between application / platform components
    - A demonstrably strong understanding of core considerations when working with data at scale, in both file-based and database contexts, including SQL optimization
    - Direct experience with Netflix Genie is another huge plus
    - Prior experience with the operational backbone of a CI/CD environment (pipeline orchestration + configuration management) is useful
    - Clean coding practices, passion for development, and being a generally good team player (and experience with GitHub) is always nice
    Keys to Success:
    - Deliver exceptional customer service
    - Demonstrate accountability and integrity
    - Be willing to learn and adapt every day
    - Embrace change
    Regards, Nithya
    Additional Information: All your information will be kept confidential according to EEO guidelines. Please send profiles to ************************* and contact No# ************.
    $93k-132k yearly est. Easy Apply 60d+ ago
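    For context on the workflow orchestration experience this listing asks for, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+). The DAG id, schedule, and task logic are hypothetical placeholders, not the client's actual pipelines.

    ```python
    # Illustrative only: a two-task daily DAG with an explicit dependency.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest_partition() -> None:
        # Placeholder for a real ingestion step (e.g. pulling a daily source extract).
        print("ingesting source extracts")

    def publish_partition() -> None:
        # Placeholder for writing curated output to the analytics platform.
        print("publishing curated data")

    with DAG(
        dag_id="example_daily_ingest",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # the `schedule` argument assumes Airflow 2.4+
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest", python_callable=ingest_partition)
        publish = PythonOperator(task_id="publish", python_callable=publish_partition)
        ingest >> publish  # publish runs only after ingest succeeds
    ```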
  • Data Engineer

    Insight Global

    Data engineer job in Vancouver, WA

    A client in the Vancouver, WA area is looking for a Data Engineer to join their team. This position will be full-time/direct hire and will be 5 days on-site; therefore, we are looking for someone who enjoys being onsite in a collaborative environment. As a Data Engineer, you will be working with the Data Team and be responsible for data engineering as well as data operations and maintaining what has been engineered. In this position, the tools and technologies you will be working with include, but are not limited to: Python, SQL, ETL processes, APIs (calling rather than building them), SQL development, integrations of 3rd party systems, and assisting on a project to build out a data lake to organize their data. In this role, the team is doing data migrations from different systems and 3rd party integrations, and building a data lake, as data is currently coming from different sources and they want to streamline the process. If this sounds like a role of interest, please apply today! Thank you.
    We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
    Skills and Requirements
    2-3 years of professional experience with SQL Server development; 30% of your time will be spent in SQL, which includes writing stored procedures, CTEs, building indexes, running queries, and reading queries
    3+ years of experience working with ETL; building tables
    3+ years of experience with Python, writing scripts
    3-4+ years of experience with Excel and Power BI, supporting the infrastructure for the data structure - not building reports, but supporting data that is performant and correct
    Experience with APIs
    Experience with AI: Copilot, Claude Code, etc.
    Azure experience
    Familiar with Star Schema or Snowflake Schema
    NetSuite, HubSpot, Pento, Salesforce, 3rd party integrations
    $91k-128k yearly est. 14d ago
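    A rough sketch of the pattern this listing describes - calling a third-party API from Python and landing the result in SQL Server. The endpoint, token, connection string, and table name are hypothetical placeholders.

    ```python
    # Illustrative only: pull records from an API and append them to a staging table.
    import pandas as pd
    import requests
    import sqlalchemy

    API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
    CONN_STR = "mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+18+for+SQL+Server"

    resp = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
    resp.raise_for_status()
    orders = pd.json_normalize(resp.json())          # flatten nested JSON records

    engine = sqlalchemy.create_engine(CONN_STR)
    with engine.begin() as conn:                      # transactional load
        orders.to_sql("stg_orders", conn, if_exists="append", index=False)
    print(f"loaded {len(orders)} rows into stg_orders")
    ```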
  • Data Warehouse Developer

    Banfield Pet Hospital 3.8 company rating

    Data engineer job in Vancouver, WA

    This position requires an onsite presence at the Banfield Pet Hospital headquarters in Vancouver, Washington, with a hybrid work schedule (3 days/week).
    Summary and Qualifications:
    The Data Warehouse Developer contributes at the enterprise level to designing, building, testing, and maintaining data products and applications. They will work with programming languages and data tools and technologies to create data consumption solutions for internal end users. They will collaborate with the Data Architects and leads of the team to implement and maintain data products under the direction of the Architects/Leads. They will work within the development team in collaboration with internal and external development teams and product managers.
    Essential Responsibilities and Tasks:
    + Live and exemplify the Five Principles of Mars, Inc. within self and team.
    + Spend 80% of time in active development tasks within the designated data platform, primarily on Azure/Cloud.
    + Apply modern development practices in data management, configuration, development, and extension of the designated platform within the Mars Veterinary Health (MVH) and Banfield environment.
    + Analyze user needs and translate them into software specifications, converting business requirements into stories and work items for the platform backlog.
    + Execute tasks from the team backlog.
    + Work independently with support from the Architect/Lead to enable systems integrations.
    + Write and implement clean, scalable code.
    + Test and deploy quality applications, regularly assessing for improvements.
    + Develop technical documentation to support future software development projects.
    + Recommend and execute program improvements.
    + Other job duties as assigned.
    Special Working Conditions:
    + Ability to work at a computer for long periods of time.
    + Must have mental processes for reasoning, remembering, mathematics, and language ability (reading, writing, and speaking the English language) to perform the duties proficiently.
    + Ability to carry out instructions furnished in written, oral, or diagram form and to solve problems involving several variables.
    + Ability to stand, walk, stoop, kneel, crouch, and climb as well as manipulate (lift, carry, move) up to 25 pounds.
    + Requires good hand-eye coordination, arm-hand-finger dexterity with the ability to grasp, and visual acuity to use a keyboard and operate necessary equipment.
    + The noise level in the work environment is normally moderate.
    + Environment where pets are present.
    + The physical demands and work environment characteristics described here are representative of those that must be met by an associate to successfully perform the essential functions of this position. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
    Experience, Education and/or Training:
    + Bachelor's degree in Computer Science or a related field is required, or the equivalent combination of education, training and experience that provides the required knowledge, skills, and abilities.
    + A minimum of five years of data warehousing and analytics experience is required.
    + Five years of experience working directly with data warehousing and analytical tools in Azure and Databricks is required.
    + Demonstrated proficiency and experience with data warehouse implementation methodologies (Kimball/Inmon), scripting for data processing (Python/Spark), and Azure Databricks for data processing and engineering are required.
    + Understanding of data warehousing/engineering principles and patterns is required.
    + Strong analytical and problem-solving skills are required.
    + Effective written and verbal communication skills to collaborate with colleagues and team members are required.
    + Knowledge of the data warehousing development lifecycle is required.
    + Willingness to learn and adapt to new technologies and methodologies is required.
    + Azure Data Engineer and/or Associate Data Engineer Databricks certification is preferred.
    + Experience with Oracle/Teradata/Netezza data warehouses is preferred.
    + Knowledge of object-relational mapping frameworks is preferred.
    + Experience with Agile and Scrum development methodologies is preferred.
    Salary Range:
    The pay range for this role is $93,707 - $128,847. The pay range listed reflects a general hiring range for the area, with the specific rate determined based on the candidate's experience, skill level, abilities, and education, and may vary depending on location and schedule. This posting will remain open for a minimum of two weeks or until a sufficient pool of qualified applicants has been received.
    Benefits:
    Here at Banfield, we prioritize your well-being and professional growth by offering a comprehensive total rewards package, including health, wellness, and financial support for you, your family, and even your pets. Check out some of our "Meow-velous" benefits:
    + Comprehensive Medical, Dental, and Vision Insurance: Enjoy peace of mind knowing your health and wellness are our top priorities. We've got your essential medical, dental, and vision care covered.
    + Generous Retirement Plans (401(k) and Roth): Invest in your future and enjoy a generous company match to help you build a secure financial future.*
    + Paid Time Off and Holidays: Take a break, recharge your wellbeing, and celebrate days of personal significance with paid time off and holidays.*
    + Top-Tier Mental Health and Wellbeing Resources: Your mental health matters. Access our industry-leading resources, including free coaching and counseling sessions, to support your overall wellbeing and help you thrive.*
    + Associate Life Insurance (company-paid) & Supplemental Life Insurance: Protect your loved ones with our company-paid Associate life insurance and have the option to purchase additional coverage for extra peace of mind.
    + Company-Paid Short- and Long-Term Disability: Feel secure knowing that if you face a temporary or long-term disability, you'll have financial protection.
    + Flexible Spending Accounts (FSA): Save on healthcare and dependent care expenses by setting aside pre-tax money. It's a smart way to manage your budget and take care of your needs.
    + Health Savings Account (HSA): Make the most of your healthcare dollars with a tax-advantaged HSA, allowing you to pay for medical expenses with pre-tax funds.
    + Paid Parental Leave: We support growing families with paid parental leave for both birth and adoption, giving you precious time to bond with your new family addition.
    + Continuing Education Allowance (for Eligible Positions): Banfield is committed to supporting the professional growth of our Associates. This allowance provides financial assistance to pursue continuing education opportunities.*
    + Back-Up Child and Elder Care & Family Support Resources: When life's unpredictable moments arise, our backup care and family support benefits provide the help you need to keep things running smoothly.*
    + Fertility and Family Building Support: We're here for you on your journey to parenthood, offering comprehensive support for fertility treatments and family-building options.
    + Digital Exercise Therapy: Stay active and healthy with our digital exercise therapy program, designed to fit your busy lifestyle and keep you moving.
    + Voluntary Protection Benefits: Get peace of mind with protection against the unexpected. You can purchase coverage to help support you financially during hospital stays, critical illness, and accidents.*
    + Legal Plan: Gain extra peace of mind with our affordable and accessible legal plan which includes coverage for a wide range of legal needs.*
    + Identity Protection: Identity Protection helps safeguard your personal information by alerting you to suspicious activity and providing support if your information is stolen.*
    + Commuter Benefits: Say goodbye to commuting stress with our commuter benefits, making your daily journey more convenient and cost-effective.*
    + Three Free Optimum Wellness Plans for Pets: We care about your furry friends too! Enjoy three free wellness plans to ensure your pets receive the best preventive and general care.*
    + Exclusive Discounts: Unlock a world of savings with our wide variety of exclusive discounts on products and services, making life more affordable and enjoyable.*
    Benefits eligibility is based on employment status. Full-time (FT) Associates are eligible for all benefit programs; Part-time Associates are eligible for those benefits with an asterisk (*).
    WE ARE A DRUG-FREE, SMOKE-FREE, EQUAL OPPORTUNITY EMPLOYER. Banfield Pet Hospital strongly supports and values the uniqueness of all individuals and promotes a work environment where diversity is embraced. Banfield Pet Hospital is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, sexual orientation, gender identity, age, genetic information, status as a protected veteran, or status as a qualified individual with disability. Banfield Pet Hospital complies with all applicable federal, state and local laws governing nondiscrimination in employment in every Banfield location. #FT
    $93.7k-128.8k yearly 35d ago
  • Data Engineer

    Accenture 4.7 company rating

    Data engineer job in Beaverton, OR

    Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists. As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges. You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands to travel.
    Key Responsibilities:
    + Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis to assess the performance of legacy products and the saliency of new features.
    + Build reporting dashboards and visualizations to design, create and track campaign/program KPIs.
    + Perform analyses on large data sets to understand drivers of operational efficiency.
    + Manage end-to-end process of analytic tooling feature development, including request intake, requirements evaluation, cross-functional team alignment, feature execution, QA testing, and stakeholder communication.
    + Consult with business analysts, technical data SMEs, and cross-functional partners to understand their reporting and data needs, serving as the point of contact for requests, inquiries, and actions.
    + Interface with other data engineering, product, and data science teams to implement client needs and initiatives.
    Location: Remote - Bay Area preferred
    Basic Qualifications:
    + Minimum 5 years of experience as a Data Engineer
    + High School Diploma or GED
    Preferred Qualifications:
    + Experience with programming languages SQL and Python
    Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired in California, Colorado, District of Columbia, Illinois, Maryland, Minnesota, New York/New Jersey or Washington as set forth below. We accept applications on an on-going basis and there is no fixed deadline to apply. Information on benefits is here. (**************************************************************************************
    Role Location - Hourly Salary Range
    California: $62.98 to $72.98
    Cleveland: $62.98 to $72.98
    Colorado: $62.98 to $72.98
    District of Columbia: $62.98 to $72.98
    Illinois: $62.98 to $72.98
    Maryland: $62.98 to $72.98
    Massachusetts: $62.98 to $72.98
    Minnesota: $62.98 to $72.98
    New York/New Jersey: $62.98 to $72.98
    Washington: $62.98 to $72.98
    Requesting an Accommodation
    Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed. If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired. If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter.
    Equal Employment Opportunity Statement
    We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities. For details, view a copy of the Accenture Equal Opportunity Statement (******************************************************************************************************************************************** Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities. Accenture is committed to providing veteran employment opportunities to our service men and women.
    Other Employment Statements
    Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States. Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment. The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information. California requires additional notifications for applicants and employees. If you are a California resident, or live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information. Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
    $63-73 hourly 60d+ ago
  • Sr. Data Engineer

    Concora Credit

    Data engineer job in Beaverton, OR

    As a Sr. Data Engineer, you'll help drive Concora Credit's Mission to enable customers to Do More with Credit - every single day.
    The impact you'll have at Concora Credit: We are seeking a Sr. Data Engineer with deep expertise in Azure and Databricks to lead the design, development, and optimization of scalable data pipelines and platforms. You'll be responsible for building robust data solutions that power analytics, reporting, and machine learning across the organization using Azure cloud services and Databricks. This position is located at our Beaverton, OR office and has a hybrid schedule. We're onsite Monday through Wednesday.
    We hire people, not positions. That's because, at Concora Credit, we put people first, including our customers, partners, and Team Members. Concora Credit is guided by a single purpose: to help non-prime customers do more with credit. Today, we have helped millions of customers access credit. Our industry leadership, resilience, and willingness to adapt ensure we can help our partners responsibly say yes to millions more. As a company grounded in entrepreneurship, we're looking to expand our team and are looking for people who foster innovation, strive to make an impact, and want to Do More! We're an established company with over 20 years of experience, but now we're taking things to the next level. We're seeking someone who wants to impact the business and play a pivotal role in leading the charge for change.
    Responsibilities
    As our Sr. Data Engineer, you will:
    Design and develop scalable, efficient data pipelines using Azure Databricks
    Build and manage data ingestion, transformation, and storage solutions leveraging Azure Data Factory, Azure Data Lake, and Delta Lake
    Implement CI/CD for data workflows using tools like Azure DevOps, Git, and Terraform
    Optimize performance and cost efficiency across large-scale distributed data systems
    Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable, reusable datasets
    Provide guidance to and mentor junior engineers, and actively contribute to data platform best practices
    Monitor, troubleshoot, and optimize existing pipelines and infrastructure to ensure reliability and scalability
    These duties must be performed with or without reasonable accommodation. We know experience comes in many forms and that many skills are transferable. If your experience is close to what we're looking for, consider applying. Diversity has made us the entrepreneurial and innovative company that we are today.
    Qualifications
    Requirements:
    5+ years of experience in data engineering, with a strong focus on Azure cloud technologies
    Experience with Azure Databricks, Azure Data Lake, and Data Factory, including PySpark, SQL, Python, and Delta Lake
    Strong proficiency in Databricks and Apache Spark
    Solid understanding of data warehousing, ETL/ELT, and data modeling best practices
    Experience with version control, CI/CD pipelines, and infrastructure as code
    Knowledge of Spark performance tuning, partitioning, and job orchestration
    Excellent problem-solving skills and attention to detail
    Strong communication and collaboration abilities across technical and non-technical teams
    Ability to work independently and lead in a fast-paced, agile environment
    Passion for delivering clean, high-quality, and maintainable code
    Preferred Qualifications:
    Experience with Unity Catalog, Databricks Workflows, and Delta Live Tables
    Familiarity with DevOps practices or Terraform for Azure resource provisioning
    Understanding of data security, RBAC, and compliance in cloud environments
    Experience integrating Databricks with Power BI or other analytics platforms
    Exposure to real-time data processing using Kafka, Event Hubs, or Structured Streaming
    What's In It For You:
    Medical, Dental and Vision insurance for you and your family
    Relax and recharge with Paid Time Off (PTO)
    6 company-observed paid holidays, plus 3 paid floating holidays
    401k (after 90 days) plus employer match up to 4%
    Pet Insurance for your furry family members
    Wellness perks including onsite fitness equipment at both locations, EAP, and access to the Headspace App
    We invest in your future through Tuition Reimbursement
    Save on taxes with Flexible Spending Accounts
    Peace of mind with Life and AD&D Insurance
    Protect yourself with company-paid Long-Term Disability and voluntary Short-Term Disability
    Concora Credit provides equal employment opportunities to all Team Members and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. Concora Credit is an equal opportunity employer (EEO). Please see the Concora Credit Privacy Policy for more information on how Concora Credit processes your personal information during the recruitment process and, if applicable, based on your location, how you can exercise your privacy rights. If you have questions about this privacy notice or need to contact us in connection with your personal data, including any requests to exercise your legal rights referred to at the end of this notice, please contact caprivacynotice@concoracredit.com.
    $84k-118k yearly est. Auto-Apply 36d ago
  • Sr Data Engineer

    Moda Health 4.5 company rating

    Data engineer job in Portland, OR

    Job Description Let's do great things, together!

About Moda
Founded in Oregon in 1955, Moda is proud to be a company of real people committed to quality. Today, like then, we're focused on building a better future for healthcare. That starts by offering outstanding coverage to our members, compassionate support to our community and comprehensive benefits to our employees. It keeps going by connecting with neighbors to create healthy spaces and places, together. Moda values diversity and inclusion in our workplace. We aim to demonstrate our commitment to diversity through all our business practices and invite applications from candidates who share our commitment to this diversity. Our diverse experiences and perspectives help us become a stronger organization. Let's be better together.

Position Summary
The Senior Data Engineer on the Data Science team builds and automates data pipelines, implements quality controls, and contributes to maintaining existing processes associated with the current Data Warehouse solution. This is a full-time, work-from-home position.

Pay Range
$92,940.40 - $140,000.00 annually (depending on experience). *This role may be classified as hourly (non-exempt) depending on the applicant's location. Actual pay is based on qualifications. Applicants who do not exceed the minimum qualifications will only be eligible for the low end of the pay range. Please fill out an application on our company page, linked below, to be considered for this position. ************************** GK=27763372&refresh=true

Benefits:
+ Medical, Dental, Vision, Pharmacy, Life, & Disability
+ 401K Matching
+ FSA
+ Employee Assistance Program
+ PTO and Company Paid Holidays

Required Skills, Experience & Education:
+ 5 - 7 years of experience working with large data sets using relational databases and/or data analysis tools (SQL Server, Oracle, Snowflake, MS Fabric, SAS Enterprise Guide), with experience in tools like Power BI or other Business Intelligence development a plus.
+ 5 - 7 years of regular SQL use, with advanced knowledge of query optimization and performance management across on-prem and cloud solutions (SQL Server, Snowflake, MS Fabric) preferred.
+ 5 - 7 years of experience with data transformation tools such as SQL stored procedures, dbt, Coalesce, or similar.
+ Experience with both on-prem SQL servers and cloud data warehouses (SQL Server, Oracle, Snowflake, MS Fabric, BigQuery); experience with on-prem database server to cloud data warehouse migrations a plus.
+ Experience with data pipeline orchestration tools such as SSIS, SQL Agent, Tidal, Airflow, and dbt.
+ Experience working with healthcare data strongly preferred.
+ Advanced knowledge of data management, relational data structures, and data modeling.
+ Demonstrated ability to identify opportunities, lead development, and implement data-driven solutions that solve real business problems.
+ Working knowledge of Agile methodology.
+ Experience with programming languages such as Python, R, or Java is a plus.
+ 3+ years of experience working with DevOps tooling such as Azure DevOps is nice to have.

Primary Functions:
+ Design, develop, and test ETL code to ingest data from various sources into the centralized data warehouse.
+ Automate, maintain, and scale ETL processes using a diverse set of tools, with flexibility for future enhancements.
+ Support data platform modernization and execute migrations to new cloud-based infrastructure.
+ Develop processes and systems to ensure data quality and accuracy.
+ Build enterprise-grade modern data pipelines to encourage organizational adoption of next-generation data architecture strategies.
+ Work closely with other teams to understand their requirements and support their data infrastructure needs.
+ Support the Data Science team in implementing data solutions for complex machine learning problems.
+ Participate collaboratively in Enterprise Data Warehousing, Business Intelligence, and other Data Management projects.
+ Perform other duties as assigned.

Working Conditions & Contact with Others
Office environment with extensive close PC and keyboard use, constant sitting, and frequent phone communication. Must be able to navigate multiple computer screens. A reliable, high-speed, hard-wired internet connection is required to support remote or hybrid work. Must be comfortable being on camera for virtual training and meetings. Work in excess of the standard workweek, including evenings and occasional weekends, may be needed to meet business needs. Internally, the role works with Data Science, IT, and various data consumers and stakeholders; externally, with third-party providers of data and business solutions, data consumers, and various stakeholders.

Together, we can be more. We can be better. Moda Health seeks to allow equal employment opportunities for all qualified persons without regard to race, religion, color, age, sex, sexual orientation, national origin, marital status, disability, veteran status or any other status protected by law. This is applicable to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absences, compensation, and training. For more information regarding accommodations, please direct your questions to Kristy Nehler & Danielle Baker via our ***************************** email.
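Airflow and dbt are among the orchestration tools the posting names; as one illustration, a minimal Airflow sketch of a nightly warehouse load follows. The DAG id, task logic, and schedule are hypothetical placeholders (not Moda's pipelines), and the imports assume Airflow 2.4 or later.

```python
# Minimal Airflow DAG sketch for a nightly warehouse load.
# DAG id, tasks, and schedule are hypothetical; assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_claims(**context):
    # Placeholder: pull the previous day's extract from a source system.
    print("extracting claims for", context["ds"])


def load_to_warehouse(**context):
    # Placeholder: run the SQL/dbt transformation that loads the warehouse tables.
    print("loading warehouse tables for", context["ds"])


with DAG(
    dag_id="nightly_claims_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_claims", python_callable=extract_claims)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load
```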
    $92.9k-140k yearly Easy Apply 10d ago
  • Sr. Hadoop Developer

    Bridge Tech 4.2 company rating

    Data engineer job in Beaverton, OR

    Job Description: Typically requires a Bachelor's Degree and a minimum of 5 years of directly relevant work experience. The client is embarking on a big data platform in Consumer Digital using a Hadoop Distributed File System cluster. As a Sr. Hadoop Developer you will work with a variety of talented client teammates and be a driving force for building solutions. You will be working on development projects related to commerce and web analytics.

Responsibilities:
• Design and implement MapReduce jobs to support distributed processing using Java, Cascading, Python, Hive, and Pig; ability to design and implement end-to-end solutions
• Build libraries, user-defined functions, and frameworks around Hadoop
• Research, evaluate, and utilize new technologies/tools/frameworks around the Hadoop ecosystem
• Develop user-defined functions to provide custom Hive and Pig capabilities
• Define and build data acquisition and consumption strategies
• Define & develop best practices
• Work with support teams in resolving operational & performance issues
• Work with architecture/engineering leads and other teams on capacity planning

Qualifications:
• MS/BS degree in a computer science field or related discipline
• 6+ years' experience in large-scale software development
• 1+ year of experience in Hadoop
• Strong Java programming, shell scripting, Python, and SQL
• Strong development skills around Hadoop, MapReduce, Hive, Pig, Impala
• Strong understanding of Hadoop internals
• Good understanding of Avro, JSON, and other compression formats
• Experience with build tools such as Maven
• Experience with databases like Oracle
• Experience with performance/scalability tuning, algorithms, and computational complexity
• Experience (at least familiarity) with data warehousing, dimensional modeling, and ETL development
• Ability to understand ERDs and relational database schemas
• Proven ability to work with cross-functional teams to deliver appropriate resolutions

Nice to have:
• Experience with open source NoSQL technologies such as HBase and Cassandra
• Experience with messaging & complex event processing systems such as Kafka and Storm
• Experience with machine learning frameworks
• Statistical analysis with Python, R, or similar

Additional Information: All your information will be kept confidential according to EEO guidelines.
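Since the role calls for MapReduce development in Python among other languages, the sketch below shows a minimal Hadoop Streaming mapper and reducer that count page views per URL. The tab-separated log format with the URL in the third column is a hypothetical placeholder, not the client's actual data.

```python
#!/usr/bin/env python3
# mapper.py - minimal Hadoop Streaming mapper: emit one count per page URL.
# Assumes tab-separated web log lines with the URL in the third column (hypothetical format).
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 3:
        print(f"{fields[2]}\t1")
```

Hadoop Streaming delivers the mapper output to the reducer sorted by key, so the reducer only needs to sum runs of identical URLs:

```python
#!/usr/bin/env python3
# reducer.py - minimal Hadoop Streaming reducer: sum counts per URL.
import sys

current_url, total = None, 0
for line in sys.stdin:
    url, count = line.rstrip("\n").split("\t")
    if url != current_url:
        if current_url is not None:
            print(f"{current_url}\t{total}")
        current_url, total = url, 0
    total += int(count)

if current_url is not None:
    print(f"{current_url}\t{total}")
```

Both scripts would be submitted with the hadoop-streaming jar as the map and reduce commands.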
    $90k-118k yearly est. 60d+ ago
  • Data Architect

    Maximus 4.3 company rating

    Data engineer job in Portland, OR

    Description & Requirements Maximus is looking for an experienced Data Architect to lead the design and implementation of modern, scalable DevOps solutions. In this role, you'll drive automation, containerization, and continuous delivery practices across enterprise systems. If you're a technical expert with a passion for innovation, collaboration, and building high-performing environments, join us and help shape the future of digital transformation.

***This is a fully remote position. Requires 10% travel. 100% mileage reimbursed at federal rate***

Why Join Maximus?
- Competitive Compensation - Quarterly bonuses based on performance included!
- Comprehensive Insurance Coverage - Choose from various plans, including Medical, Dental, Vision, Prescription, and partially funded HSA. Additionally, enjoy Life insurance benefits and discounts on Auto, Home, Renter's, and Pet insurance.
- Future Planning - Prepare for retirement with our 401K Retirement Savings plan and Company Matching.
- Unlimited Time Off Package - Enjoy UTO, Holidays, and sick leave.
- Holistic Wellness Support - Access resources for physical, emotional, and financial wellness through our Employee Assistance Program (EAP).
- Recognition Platform - Acknowledge and appreciate outstanding employee contributions.
- Tuition Reimbursement - Invest in your ongoing education and development.
- Employee Perks and Discounts - Additional benefits and discounts exclusively for employees.
- Maximus Wellness Program and Resources - Access a range of wellness programs and resources tailored to your needs.
- Professional Development Opportunities - Participate in training programs, workshops, and conferences.

Essential Duties and Responsibilities:
- Define, develop, and implement the configuration management system which supports the enterprise software development life cycle (SDLC).
- Manage source code within the Version Control System (branching, sync, merge, etc.); compile, assemble, and package software from source code; mentor less senior team members in this discipline.
- Work with the client to perform and validate installations, upgrades, deployments, and containers.
- Define and provide guidance on standards and best practices.
- Develop automation scripts for build, deployment, and versioning activities; mentor less senior team members in this discipline.
- Research and resolve technical problems associated with the version control and continuous integration systems.
- Typically responsible for providing guidance, coaching, and training to other employees within the job area.

Minimum Requirements
- Bachelor's degree in a relevant field of study and 7+ years of relevant professional experience required, or equivalent combination of education and experience.
- Database management experience preferred
- M.M.I.S. experience preferred
- Data conversion experience preferred
- Technical leadership experience preferred
- Technical oversight experience preferred

#HumanServices #LI-Remote

EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process - including accessing job postings, completing assessments, or participating in interviews - please contact People Operations at applicantaccommodations@maximus.com.

Minimum Salary: $150,000.00
Maximum Salary: $175,000.00
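The duties above revolve around build, deployment, and versioning automation. As a generic illustration only (not Maximus's tooling), here is a small Python sketch that packages an artifact and tags the release in Git; the build command and date-based version scheme are hypothetical placeholders.

```python
# Minimal sketch of a build-and-tag automation step (hypothetical repo layout and version scheme).
import subprocess
import sys
from datetime import datetime, timezone


def run(cmd: list[str]) -> str:
    """Run a command, fail fast on error, and return its stdout."""
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    return result.stdout.strip()


def main() -> int:
    short_sha = run(["git", "rev-parse", "--short", "HEAD"])
    version = datetime.now(timezone.utc).strftime("%Y.%m.%d") + f"+{short_sha}"

    # Package the application (placeholder build command; swap in the real build step).
    run([sys.executable, "-m", "build"])

    # Tag the release so deployments are reproducible from source control.
    run(["git", "tag", "-a", f"release-{version}", "-m", f"Automated build {version}"])
    run(["git", "push", "origin", f"release-{version}"])

    print(f"Built and tagged release-{version}")
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```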
    $98k-138k yearly est. 4d ago
  • Sr. Data Engineer

    It Vision Group

    Data engineer job in Portland, OR

    Job Description
Title: Sr. Data Engineer
Duration: 12 Months+

Roles & Responsibilities
+ Perform data analysis according to business needs
+ Translate functional business requirements into high-level and low-level technical designs
+ Design and implement distributed data processing pipelines using Apache Spark, Apache Hive, Python, and other tools and languages prevalent in a modern analytics platform
+ Create and schedule workflows using Apache Airflow or similar job orchestration tooling
+ Build utilities, functions, and frameworks to better enable high-volume data processing
+ Define and build data acquisition and consumption strategies
+ Build and incorporate automated unit tests; participate in integration testing efforts
+ Work with teams to resolve operational & performance issues
+ Work with architecture/engineering leads and other teams to ensure quality solutions are implemented and engineering best practices are defined and followed

Tech Stack
+ Apache Spark
+ Apache Spark Streaming using Apache Kafka
+ Apache Hive
+ Apache Airflow
+ Python
+ AWS EMR and S3
+ Snowflake
+ SQL
+ Other Tools & Technologies: PyCharm, Jenkins, GitHub
+ Apache NiFi (Optional)
+ Scala (Optional)
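Given the stack above (Spark Structured Streaming with Kafka, EMR, and S3), a minimal PySpark sketch follows. The broker address, topic, event schema, and S3 paths are hypothetical placeholders, and the job assumes the Spark-Kafka connector package is available on the cluster.

```python
# Minimal Spark Structured Streaming sketch: read events from Kafka, write to S3 as Parquet.
# Broker addresses, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("order-events-stream").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "order-events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("event"))
    .select("event.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-bucket/curated/order_events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/order_events/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```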
    $84k-118k yearly est. 16d ago
  • Big Data Engineer / Architect

    Nitor Infotech

    Data engineer job in Portland, OR

    The hunt is for a strong Big Data professional, a team player with the ability to manage effective relationships with a wide range of stakeholders (customers & team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within the internal teams and with partner organizations and suppliers.

Role: Big Data Engineer
Location: Portland, OR
Duration: Full Time

Skill Matrix:
+ MapReduce - Required
+ Apache Spark - Required
+ Informatica PowerCenter - Required
+ Hive - Required
+ Apache Hadoop - Required
+ Core Java / Python - Highly Desired
+ Healthcare Domain Experience - Highly Desired

Job Description
Responsibilities and Duties
+ Participate in technical planning & requirements gathering phases including architectural design, coding, testing, troubleshooting, and documenting big data-oriented software applications.
+ Responsible for the ingestion, maintenance, improvement, cleaning, and manipulation of data in the business's operational and analytics databases, and troubleshooting any existing issues.
+ Implementation, troubleshooting, and optimization of distributed solutions based on modern big data technologies like Hive, Hadoop, Spark, Elasticsearch, Storm, Kafka, etc., in both on-premises and cloud deployment models to solve large-scale processing problems.
+ Design, enhance, and implement ETL/data ingestion platforms on the cloud.
+ Strong data warehousing skills, including data clean-up, ETL, ELT, and handling scalability issues for an enterprise-level data warehouse.
+ Capable of investigating, familiarizing with, and mastering new data sets quickly.
+ Strong troubleshooting and problem-solving skills in large data environments.
+ Experience building data platforms on the cloud (AWS or Azure).
+ Experience using Python, Java, or any other language to solve data problems.
+ Experience implementing SDLC best practices and Agile methods.

Qualifications
Required Skills:
+ Data architecture / Big Data / ETL environment
+ Experience with ETL design using tools such as Informatica, Talend, Oracle Data Integrator (ODI), Dell Boomi, or equivalent
+ Big Data & Analytics solutions: Hadoop, Pig, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.) / Azure (HDInsight, Data Lake Design)
+ Building and managing hosted big data architecture; toolkit familiarity with Hadoop plus Oozie, Sqoop, Pig, Hive, HBase, Avro, Parquet, Spark, NiFi
+ Foundational data management concepts - RDM and MDM
+ Experience working with JIRA/Git/Bitbucket/JUnit and other code management toolsets
+ Strong hands-on knowledge of solutioning languages like Java, Scala, or Python - any one is fine
+ Healthcare domain knowledge

Required Experience, Skills and Qualifications
Qualifications: Bachelor's Degree with a minimum of 6 to 9+ years' relevant experience, or equivalent. Extensive experience in a data architecture / Big Data / ETL environment.

Additional Information: All your information will be kept confidential according to EEO guidelines.
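To illustrate the Hive/Spark side of this skill matrix, here is a minimal PySpark sketch that runs a Hive-style aggregation and writes a partitioned table. The database, table, and column names are hypothetical placeholders, not a real client schema.

```python
# Minimal Spark SQL sketch against Hive-managed tables (names and columns are hypothetical).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("claims-daily-rollup")
    .enableHiveSupport()  # lets Spark read and write Hive metastore tables
    .getOrCreate()
)

daily_rollup = spark.sql("""
    SELECT
        claim_date,
        provider_id,
        COUNT(*)           AS claim_count,
        SUM(billed_amount) AS total_billed
    FROM raw.claims
    WHERE claim_date = date_sub(current_date(), 1)
    GROUP BY claim_date, provider_id
""")

# Persist the rollup as a partitioned, Hive-visible table for downstream consumers.
(
    daily_rollup.write
    .mode("overwrite")
    .partitionBy("claim_date")
    .format("parquet")
    .saveAsTable("analytics.claims_daily_rollup")
)
```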
    $84k-118k yearly est. 8h ago
  • Senior Data Engineer

    Advance Local 3.6 company rating

    Data engineer job in Portland, OR

    **Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with data product teams across business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations. The base salary range is $120,000 - $140,000 per year. **What you'll be doing:** + Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake. + Partner with platform owners across business units to establish and maintain data integrations from third-party systems into the central data platform. + Architect and maintain data infrastructure using IaC, ensuring reproducibility, version control, and disaster recovery capabilities. + Design and implement API integrations and event-driven data flows to support real-time and batch data requirements. + Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities. + Partner with the Data Architect and data product teams to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs. + Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability. + Support rapid prototyping of new data products in collaboration with data product teams by building flexible, reusable data infrastructure components. + Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability. + Collaborate with data product teams, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization. + Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources. + Develop and maintain comprehensive documentation for data engineering processes, systems, architecture, integration patterns, and runbooks. + Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact. + Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices. 
**Our ideal candidate will have the following:** + Bachelor's degree in computer science, engineering, or a related field + Minimum of seven years of experience in data engineering, with at least two years in a lead or senior technical role + Expert proficiency in Snowflake data engineering patterns + Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform + Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs) + Proven ability to work with third-party APIs, webhooks, and data exports + Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure + Proven ability to design and implement API integrations and event-driven architecture + Experience with data modeling, data warehousing, and ETL processes at scale + Advanced proficiency in Python and SQL for data pipeline development + Experience with data orchestration tools (Airflow, dbt, Snowflake tasks) + Strong understanding of data security, access controls, and compliance requirements + Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms + Excellent problem-solving skills and attention to detail + Strong communication and collaboration skills **Additional Information** Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity. Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** . Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext. _Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._ _If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._ Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
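As a small illustration of the third-party integration work described above, the sketch below pulls a day's records from a hypothetical vendor API with requests and lands the raw JSON in S3 with boto3 for later loading into Snowflake. The endpoint, token variable, bucket, and key layout are all placeholders, not Advance Local's systems.

```python
# Minimal ingestion sketch: pull a day's records from a (hypothetical) vendor API
# and land the raw JSON in S3 for downstream loading into Snowflake.
import json
import os
from datetime import date, timedelta

import boto3
import requests

API_URL = "https://api.example-cdp.com/v1/events"   # hypothetical endpoint
BUCKET = "example-raw-landing"                       # hypothetical bucket


def ingest(run_date: date) -> str:
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['CDP_API_TOKEN']}"},  # hypothetical env var
        params={"date": run_date.isoformat()},
        timeout=60,
    )
    response.raise_for_status()

    key = f"cdp/events/dt={run_date.isoformat()}/events.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(response.json()).encode("utf-8"),
    )
    return key


if __name__ == "__main__":
    print("landed", ingest(date.today() - timedelta(days=1)))
```

From there, a Snowflake external stage with COPY INTO (or Snowpipe) would typically pick up the landed files, which is the kind of ingestion-to-warehouse flow the posting describes.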
    $120k-140k yearly 20d ago
  • Sr Data Engineer (MFT - IBM Sterling)

    The Hertz Corporation 4.3 company rating

    Data engineer job in Salem, OR

    **A Day in the Life:**

The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.

The ideal candidate will have a passion for technology and possess the ability to create change and be the facilitator for this transformation. They will have experience tailoring software design, developing, and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met. We expect the starting salary to be around $135k but will be commensurate with experience.

**What You'll Do:**

TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop, and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services, and perform root cause analysis.
+ Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations

INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.

TEAMWORK & COMMUNICATION
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of seniority within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high-quality technical documentation

CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards

CONTINUOUS IMPROVEMENT
+ Encourages and maintains a 'best practice sharing culture', always striving to find ways to improve service and change mindsets.

**What We're Looking For:**

+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (examples: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions, with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling, which may include nights, weekends, and holidays

**What You'll Get:**

+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts - Theme Park Tickets, Gym Discounts & more

The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia, and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.

**US EEO STATEMENT**

At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
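Sterling-specific configuration lives inside B2B Integrator and File Gateway themselves, but the posting also asks for scripting to automate standard activities (Ansible, Python, Bash, Java). As a generic, standard-library-only illustration, here is a small Python sketch that sweeps a landing directory and archives or quarantines files; the paths and the validation rule are hypothetical placeholders.

```python
# Minimal sketch of a file-movement automation task (paths and rules are hypothetical).
import logging
import shutil
from pathlib import Path

LANDING = Path("/data/mft/landing")
ARCHIVE = Path("/data/mft/archive")
QUARANTINE = Path("/data/mft/quarantine")

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")


def sweep() -> None:
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    QUARANTINE.mkdir(parents=True, exist_ok=True)

    for path in sorted(LANDING.glob("*.edi")):
        # Simple placeholder rule: non-empty files are archived, empty ones quarantined.
        target = ARCHIVE if path.stat().st_size > 0 else QUARANTINE
        shutil.move(str(path), str(target / path.name))
        logging.info("moved %s -> %s", path.name, target)


if __name__ == "__main__":
    sweep()
```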
    $135k yearly 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in Portland, OR?

The average data engineer in Portland, OR earns between $72,000 and $137,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Portland, OR

$99,000

What are the biggest employers of Data Engineers in Portland, OR?

The biggest employers of Data Engineers in Portland, OR are:
  1. Genoa
  2. WebMD
  3. Moda Health
  4. Nike
  5. Career-Mover
  6. Autodesk
  7. Accenture
  8. Advance Local
  9. Certified Languages International
  10. Ernst & Young