HYBRID ONSITE IN BEAVERTON, OR!
MUST HAVE QUALTRICS EXP
We're seeking a skilled and experienced Software Engineer who specializes in Qualtrics. This role will be part of a high-visibility, high-impact initiative to optimize and expand our Qualtrics environment.
You'll play a key role in designing, developing, and maintaining scalable solutions that enhance user experience, streamline data collection, and improve reporting accuracy. The ideal candidate has a strong background in Qualtrics architecture, API integrations, and automation, plus a passion for creating efficient, user-friendly tools that empower teams to make data-driven decisions.
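For a concrete flavor of the API-integration work this role describes, here is a minimal Python sketch of exporting survey responses through Qualtrics's v3 response-export endpoints. It is an illustration only, not part of the posting: the data-center host, API token, and survey ID are placeholders, and the exact flow can vary by account.

import time
import requests

# Placeholder values: the real data-center host and token vary by Qualtrics account.
BASE = "https://yourdatacenter.qualtrics.com/API/v3"
HEADERS = {"X-API-TOKEN": "your-api-token"}

def export_responses(survey_id):
    # Start a CSV export job for the survey.
    start = requests.post(
        f"{BASE}/surveys/{survey_id}/export-responses",
        headers=HEADERS, json={"format": "csv"},
    ).json()
    progress_id = start["result"]["progressId"]

    # Poll the job until Qualtrics reports it complete.
    while True:
        check = requests.get(
            f"{BASE}/surveys/{survey_id}/export-responses/{progress_id}",
            headers=HEADERS,
        ).json()
        if check["result"]["status"] == "complete":
            file_id = check["result"]["fileId"]
            break
        time.sleep(2)

    # Download the finished export (a zipped CSV).
    return requests.get(
        f"{BASE}/surveys/{survey_id}/export-responses/{file_id}/file",
        headers=HEADERS,
    ).content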
What we're looking for:
3+ years of hands-on Qualtrics engineering or development experience
Strong understanding of survey logic, workflows, APIs, and automation
Experience with data visualization and analytics tools (Tableau, Power BI, etc.)
Background in software engineering (JavaScript, Python, or similar)
Ability to partner cross-functionally with researchers, analysts, and product teams
$77k-108k yearly est. 3d ago
Application Support Engineer
Cvent 4.3
Data engineer job in Portland, OR
Pacific Time zone working hours (9am - 6pm PT)
Our Culture and Impact
Cvent is a leading meetings, events, and hospitality technology provider with 5,000+ employees and 24,000+ customers worldwide, including 60% of the Fortune 500. Founded in 1999, Cvent delivers a comprehensive event marketing and management platform for marketers and event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. Our technology brings millions of people together at events around the world. In short, we're transforming the meetings and events industry through innovative technology that powers the human connection.
Cvent's strength lies in its people, fostering a culture where everyone is encouraged to think like entrepreneurs, taking risks and making decisions confidently. We value diverse perspectives and celebrate differences, working together with colleagues and clients to build strong connections.
AI at Cvent: Leading the Future
Are you ready to shape the future of work at the intersection of human expertise and AI innovation? At Cvent, we're committed to continuous learning and adaptation; AI isn't just a tool for us, it's part of our DNA. We're looking for candidates who are eager to evolve alongside technology. If you love to experiment boldly, share your discoveries, and help define best practices for AI-augmented work, you'll thrive here. Our team values professionals who thoughtfully integrate AI into their daily work, delivering exceptional results while relying on the human judgment and creativity that drive real innovation.
Throughout our interview process, you'll have the chance to demonstrate how you use AI to learn, iterate, and amplify your impact. If you're excited to be part of a team that's leading the way in AI-powered collaboration, we'd love to meet you.
Do you have a passion for technology? Do you enjoy solving real world problems? Do you have a helpful and inquisitive mindset? If you answered yes to all three questions, then keep reading!
This entry-level Application Support Engineer opportunity is a jack-of-all-trades technology role. It is technical in nature and provides exposure to all major aspects of cloud-based software (debugging/coding, testing, networking, and database/infrastructure).
Our end goal is to ensure our customers have the best possible experience with our products from a technical perspective. This entails doing everything we can to make sure that our products are bug free and have top-notch performance. As far as important candidate qualities go, strong communication and the ability to work with multiple teams are a must. We care more about your attitude and aptitude than the tools and technologies you have used in the past.
In This Role, You Will:
Provide top-tier software support for Cvent's product offerings to our customer service team and clients; this role is not customer facing.
Assist operations and development teams with debugging software issues.
Query databases to generate and analyze data for reporting and troubleshooting purposes.
Work with our sales engineering team to ensure the successful operation of our partner and client integrations.
Work with multiple teams to find, analyze, and resolve client issues.
Troubleshoot and maintain frontend and backend systems.
Monitor, document and report system and performance issues.
Facilitate communication between technology teams and other departments on issue status and resolution.
Supply in-depth technical and business product knowledge to multiple teams.
Weekend on-call support on a rotational basis is required for this position.
Here's What You Need:
Do not worry if you do not know all the specific technologies listed below! Our training program lasts 3-4 weeks and will help bring you up to speed on both our products and the technologies they use.
BS in Computer Science, Information Systems or equivalent major with strong academic performance
Excellent problem solving and analytical skills.
Understanding of relational databases and how to query data using SQL.
Working knowledge of HTML/CSS.
Understanding of the Software Development Life Cycle.
Solid knowledge of at least one object-oriented programming language.
Outstanding oral and written communication skills.
Ability to convey technical information to a general audience.
Aptitude for learning new technologies.
Zealous attention to detail.
Wondering what other technologies and tools we use? See below. Any experience with these is a plus!
Monitoring tools: New Relic, Splunk, Datadog.
Hosting: Amazon Web Services
Programming: Java, C#, .Net, Node.js
Open source and NoSQL database technologies: Couchbase, Elasticsearch, RabbitMQ
APIs: SOAP or REST based
Build & deploy technologies: Docker and Jenkins
Version control: Git
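To make the "query databases" duties above concrete, here is a hedged sketch of a typical support-triage query; the schema is invented for illustration (it is not Cvent's), and Python's built-in sqlite3 stands in for the production database.

import sqlite3

# Invented schema: registrations that should have synced to a partner system.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE registrations (id INTEGER, event_id INTEGER, status TEXT);
    CREATE TABLE sync_log (registration_id INTEGER, synced_at TEXT);
    INSERT INTO registrations VALUES (1, 10, 'confirmed'), (2, 10, 'confirmed'), (3, 11, 'cancelled');
    INSERT INTO sync_log VALUES (1, '2024-01-01');
""")

# Find confirmed registrations with no sync record: the kind of join an
# Application Support Engineer runs while troubleshooting an integration.
rows = conn.execute("""
    SELECT r.id, r.event_id
    FROM registrations r
    LEFT JOIN sync_log s ON s.registration_id = r.id
    WHERE r.status = 'confirmed' AND s.registration_id IS NULL
""").fetchall()
print(rows)  # [(2, 10)]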
The estimated base salary range for new hires into this role is $85k-$120k+ annually, plus bonus, depending on factors such as job-related knowledge, relevant experience, and location. We also offer a competitive benefits package, details of which can be found here.
We are not able to offer sponsorship for this position.
$85k-120k yearly 2d ago
Data Scientist 4
Lam Research 4.6
Data engineer job in Tualatin, OR
Develop tools, metric measurement, and assessment methods for performance management and predictive modeling. Develop dashboards for product management and executives to drive faster and better decision making. Create accountability models for DPG-wide quality, install and warranty (I&W), inventory, product management KPIs, and business operations.
Improve DPG-wide quality, I&W, and inventory performance consistently, moving from awareness to prioritization to action through the availability of common data.
Collaborate with quality, I&W, and inventory program managers to analyze trends and patterns in data that drive required improvement in key performance indicators (KPIs). Foster the growth and utility of Cost of Quality within the company through correlation of I&W data and ECOGS, identification of causal relationships for quality events, and discovery of hidden costs throughout the network.
Improve data utilization via AI and automation, leading to real-time resolution and faster systemic action.
Lead and/or advise on multiple projects simultaneously and demonstrate organizational, prioritization, and time management proficiencies.
Bachelor's degree with 8+ years of experience; or master's degree with 5+ years' experience; or equivalent experience.
Basic understanding of AI and machine learning, and the ability to work with Data Scientists to use AI to solve complex, challenging problems, leading to efficiency and effectiveness improvements.
Ability to define problem statements and objectives, develop an analysis approach, and execute the analysis.
Basic knowledge of Lean Six Sigma processes, statistics, or quality systems experience.
Ability to work on multiple problems simultaneously.
Ability to present conclusions and recommendations to executive audiences.
Ownership mindset to drive solutions and positive outcomes.
Excellent communication and presentation skills with the ability to present to audiences at multiple levels in the Company.
Willingness to adapt best practices via benchmarking.
Experience in Semiconductor fabrication, Semiconductor Equipment Operations, or related industries is a plus.
Demonstrated ability to change process and methodologies for capturing and interpreting data.
Demonstrated success in using structured problem-solving methodologies and quality tools to solve complex problems.
Knowledge of programming environments such as Python, R, Matlab, SQL or equivalent.
Experience in structured problem-solving methodologies such as PDCA, DMAIC, 8D and quality tools.
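As a hedged illustration of the trend analysis and KPI monitoring described above (the weekly figures and the 10% threshold are invented, not Lam data), a short pandas sketch that flags drift in a rolling average, the kind of simple control signal that could feed a quality or I&W dashboard:

import pandas as pd

# Invented weekly install-and-warranty (I&W) cost index, for illustration only.
kpi = pd.Series([102, 98, 101, 99, 104, 110, 116, 121],
                index=pd.date_range("2024-01-07", periods=8, freq="W"))

rolling = kpi.rolling(window=4).mean()
baseline = kpi.iloc[:4].mean()

# Flag weeks where the 4-week rolling average drifts more than 10% above baseline.
alerts = rolling[rolling > baseline * 1.10]
print(alerts)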
Our commitment We believe it is important for every person to feel valued, included, and empowered to achieve their full potential.
By bringing unique individuals and viewpoints together, we achieve extraordinary results.
Lam is committed to and reaffirms support of equal opportunity in employment and non-discrimination in employment policies, practices and procedures on the basis of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex (including pregnancy, childbirth and related medical conditions), gender, gender identity, gender expression, age, sexual orientation, or military and veteran status or any other category protected by applicable federal, state, or local laws.
It is the Company's intention to comply with all applicable laws and regulations.
Company policy prohibits unlawful discrimination against applicants or employees.
Lam offers a variety of work location models based on the needs of each role.
Our hybrid roles combine the benefits of on-site collaboration with colleagues and the flexibility to work remotely and fall into two categories - On-site Flex and Virtual Flex.
'On-site Flex' you'll work 3+ days per week on-site at a Lam or customer/supplier location, with the opportunity to work remotely for the balance of the week.
'Virtual Flex' you'll work 1-2 days per week on-site at a Lam or customer/supplier location, and remotely the rest of the time.
$71k-91k yearly est. 12d ago
Data Scientist
Eyecarecenterofsalem
Data engineer job in Portland, OR
Job Description
We are looking for a Data Scientist to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights. In this role, you should be highly analytical with a knack for analysis, math, and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research.
Your goal will be to help our company analyze trends to make better decisions.
Responsibilities
Identify valuable data sources and automate collection processes
Undertake preprocessing of structured and unstructured data
Analyze large amounts of information to discover trends and patterns
Build predictive models and machine-learning algorithms
Combine models through ensemble modeling
Present information using data visualization techniques
Propose solutions and strategies to business challenges
Collaborate with engineering and product development teams
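To make the "ensemble modeling" item above concrete, here is a minimal scikit-learn sketch on synthetic data; a soft-voting ensemble averages the predicted class probabilities of its member models. The data and model choices are illustrative assumptions, not part of the posting.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; real features would come from the company's data sources.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft voting averages predicted probabilities across the member models.
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print(f"held-out accuracy: {ensemble.score(X_test, y_test):.3f}")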
Requirements and skills
Proven experience as a Data Scientist or Data Analyst
Experience in data mining
Understanding of machine learning and operations research
Knowledge of R, SQL, and Python; familiarity with Scala, Java, or C++ is an asset
Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
Analytical mind and business acumen
Strong math skills (e.g. statistics, algebra)
Problem-solving aptitude
Excellent communication and presentation skills
BSc/BA in Computer Science, Engineering, or relevant field; a graduate degree in Data Science or other quantitative field is preferred
$73k-104k yearly est. 18d ago
Need Sr Big Data Engineer at Beaverton, OR Only W2
USM 4.2
Data engineer job in Beaverton, OR
Hi,
We have an immediate opportunity with our direct client; send your resume ASAP if you are interested. Thank you.
Sr Big Data Engineer
Duration: Long Term
Skills
Typical Office: This is a typical office job, with no special physical requirements or unusual work environment.
Core responsibilities: Expert data engineers will work with product teams across the client to help automate and integrate a variety of data domains with a wide variety of data profiles (different scale, cadence, and volatility) into the client's next-gen data and analytics platform. This is an opportunity to work across multiple subject areas and source platforms to ingest, organize, and prepare data through cloud-native processes.
Required skills/experience:
- 5+ years of professional development experience in either Python (preferred) or Scala/Java; familiarity with both is ideal
- 5+ years of data-centric development with a focus on efficient data access and manipulation at multiple scales
- 3+ years of experience with the HDFS ecosystem of tools (any distro, Spark experience prioritized)
- 3+ years of significant experience developing within the broader AWS ecosystem of platforms and services
- 3+ years of experience optimizing data access and analysis in non-HDFS data platforms (traditional RDBMSs, NoSQL/KV stores, etc.)
- Direct task development and/or configuration experience with a remote workflow orchestration tool - Airflow (preferred), Amazon Data Pipeline, Luigi, Oozie, etc.
- Intelligence, strong problem-solving ability, and the ability to effectively communicate to partners with a broad spectrum of experiential backgrounds
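As a sketch of the workflow-orchestration experience requested above, here is a minimal Airflow DAG (assuming a recent Airflow 2.x; the DAG id, task names, and task bodies are invented placeholders):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # Placeholder for a real extract step (e.g., pulling files from S3).
    print("ingesting source data")

def transform():
    # Placeholder for a real Spark or pandas transformation.
    print("transforming data")

# One daily run; task order is ingest, then transform.
with DAG(
    dag_id="example_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task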
Several of the following skills are also desired:
- A demonstrably strong understanding of security and credential management between application / platform components
- A demonstrably strong understanding of core considerations when working with data at scale, in both file-based and database contexts, including SQL optimization
- Direct experience with Netflix Genie is another huge plus
- Prior experience with the operational backbone of a CI/CD environment (pipeline orchestration + configuration management) is useful
- Clean coding practices, a passion for development, and being a good team player (plus experience with GitHub) are always nice
Keys to Success:
- Deliver exceptional customer service
- Demonstrate accountability and integrity
- Be willing to learn and adapt every day
- Embrace change
Regards
Nithya
Additional Information
All your information will be kept confidential according to EEO guidelines. Please send the profiles to ************************* and contact No# ************.
$93k-132k yearly est. Easy Apply 60d+ ago
Data Warehouse Developer
Banfield Pet Hospital 3.8
Data engineer job in Vancouver, WA
This position requires an onsite presence at the Banfield Pet Hospital headquarters in Vancouver, Washington, with a hybrid work schedule (3 days/week). Summary and Qualifications: The Data Warehouse Developer contributes at the enterprise level to designing, building, testing, and maintaining data products and applications. They will work with programming languages and data tools and technologies to create data consumption solutions for internal end users. They will collaborate with the Data Architects and leads of the team to implement and maintain data products under the direction of the Architects/Leads. They will work within the development team in collaboration with internal and external development teams and product managers.
Essential Responsibilities and Tasks:
+ Live and exemplify the Five Principles of Mars, Inc. within self and team.
+ Spend 80% of time in active development tasks within the designated data platform primarily on Azure/Cloud.
+ Apply modern development practices in data management, configuration, development, and extension of the designated platform within the Mars Veterinary Health (MVH) and Banfield environment.
+ Analyze user needs and translate them into software specifications, converting business requirements into stories and work items for the platform backlog.
+ Execute tasks from the team backlog.
+ Work independently with support from the Architect/Lead to enable systems integrations.
+ Write and implement clean, scalable code.
+ Test and deploy quality applications, regularly assessing for improvements.
+ Develop technical documentation to support future software development projects
+ Recommend and execute program improvements.
+ Other job duties as assigned.
Special Working Conditions:
+ Ability to work at a computer for long periods of time.
+ Must have mental processes for reasoning, remembering, mathematics, and language ability (reading, writing, and speaking the English language) to perform the duties proficiently.
+ Ability to carry out instructions furnished in written, oral, or diagram form and to solve problems involving several variables.
+ Ability to stand, walk, stoop, kneel, crouch, and climb as well as manipulate (lift, carry, move) up to 25 pounds.
+ Requires good hand-eye coordination, arm-hand-finger dexterity with the ability to grasp, and visual acuity to use a keyboard and operate necessary equipment.
+ The noise level in the work environment is normally moderate.
+ Environment where pets are present.
+ The physical demands and work environment characteristics described here are representative of those that must be met by an associate to successfully perform the essential functions of this position. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Experience, Education and/or Training:
+ Bachelor's degree in Computer Science or a related field is required, or the equivalent combination of education, training and experience that provides the required knowledge, skills, and abilities.
+ A minimum of five years of data warehousing and analytics experience are required.
+ Five years of experience working directly with data warehousing and analytical tools in Azure and Databricks are required.
+ Demonstrated proficiency and experience with data warehouse implementation methodologies (Kimball/Inmon), scripting for data processing (Python/Spark), and Azure Databricks for data processing and engineering are required.
+ Understanding of data warehousing/engineering principles and patterns is required.
+ Strong analytical and problem-solving skills are required.
+ Effective written and verbal communication skills to collaborate with colleagues and team members are required.
+ Knowledge of the data warehousing development lifecycle is required.
+ Willingness to learn and adapt to new technologies and methodologies is required.
+ Azure Data Engineer and/or Databricks Associate Data Engineer certification is preferred.
+ Experience in Oracle/Teradata/Netezza data warehouses is preferred.
+ Knowledge of object-relational mapping frameworks is preferred.
+ Experience with Agile and Scrum development methodologies is preferred.
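For illustration of the Python/Spark warehouse work listed above, here is a hedged PySpark sketch that deduplicates a landing-zone feed to the latest record per key, a routine staging step before a Kimball-style dimensional load; the paths and column names are hypothetical.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("visit_staging").getOrCreate()

# Hypothetical landing-zone path and columns, for illustration only.
raw = spark.read.parquet("/landing/visits/")

# Keep only the latest record per visit_id, ordered by update timestamp.
latest_first = Window.partitionBy("visit_id").orderBy(F.col("updated_at").desc())
staged = (
    raw.withColumn("rn", F.row_number().over(latest_first))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

staged.write.mode("overwrite").parquet("/staging/visits/")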
Salary Range:
The pay range for this role is $93,707 - $128,847.
The pay range listed reflects a general hiring range for the area, with the specific rate determined based on the candidate's experience, skill level, abilities, and education, and may vary depending on location and schedule.
This posting will remain open for a minimum of two weeks or until a sufficient pool of qualified applicants has been received.
Benefits:
Here at Banfield, we prioritize your well-being and professional growth by offering a comprehensive total rewards package, including health, wellness, and financial support for you, your family, and even your pets. Check out some of our "Meow-velous" benefits:
+ Comprehensive Medical, Dental, and Vision Insurance: Enjoy peace of mind knowing your health and wellness are our top priorities. We've got your essential medical, dental, and vision care covered.
+ Generous Retirement Plans (401(k) and Roth): Invest in your future and enjoy a generous company match to help you build a secure financial future.*
+ Paid Time Off and Holidays: Take a break, recharge your wellbeing, and celebrate days of personal significance with paid time off and holidays.*
+ Top-Tier Mental Health and Wellbeing Resources: Your mental health matters. Access our industry-leading resources, including free coaching and counseling sessions, to support your overall wellbeing and help you thrive.*
+ Associate Life Insurance (company-paid) & Supplemental Life Insurance: Protect your loved ones with our company-paid Associate life insurance and have the option to purchase additional coverage for extra peace of mind.
+ Company-Paid Short- and Long-Term Disability: Feel secure knowing that if you face a temporary or long-term disability, you'll have financial protection.
+ Flexible Spending Accounts (FSA): Save on healthcare and dependent care expenses by setting aside pre-tax money. It's a smart way to manage your budget and take care of your needs.
+ Health Savings Account (HSA): Make the most of your healthcare dollars with a tax-advantaged HSA, allowing you to pay for medical expenses with pre-tax funds.
+ Paid Parental Leave: We support growing families with paid parental leave for both birth and adoption, giving you precious time to bond with your new family addition.
+ Continuing Education Allowance (for Eligible Positions): Banfield is committed to supporting the professional growth of our Associates. This allowance provides financial assistance to pursue continuing education opportunities.*
+ Back-Up Child and Elder Care & Family Support Resources: When life's unpredictable moments arise, our backup care and family support benefits provide the help you need to keep things running smoothly.*
+ Fertility and Family Building Support: We're here for you on your journey to parenthood, offering comprehensive support for fertility treatments and family-building options.
+ Digital Exercise Therapy: Stay active and healthy with our digital exercise therapy program, designed to fit your busy lifestyle, and keep you moving.
+ Voluntary Protection Benefits: Get peace of mind with protection against the unexpected. You can purchase coverage to help support you financially during hospital stays, critical illness, and accidents.*
+ Legal Plan: Gain extra peace of mind with our affordable and accessible legal plan which includes coverage for a wide range of legal needs.*
+ Identity Protection: Identity Protection helps safeguard your personal information by alerting you to suspicious activity and providing support if your information is stolen.*
+ Commuter Benefits: Say goodbye to commuting stress with our commuter benefits, making your daily journey more convenient and cost-effective.*
+ Three Free Optimum Wellness Plans for Pets: We care about your furry friends too! Enjoy three free wellness plans to ensure your pets receive the best preventive and general care.*
+ Exclusive Discounts: Unlock a world of savings with our wide variety of exclusive discounts on products and services, making life more affordable and enjoyable.*
Benefits eligibility is based on employment status. Full-time (FT) Associates are eligible for all benefit programs; Part-time Associates are eligible for those benefits with an asterisk (*).
WE ARE A DRUG-FREE, SMOKE-FREE, EQUAL OPPORTUNITY EMPLOYER.
Banfield Pet Hospital strongly supports and values the uniqueness of all individuals and promotes a work environment where diversity is embraced. Banfield Pet Hospital is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, sexual orientation, gender identity, age, genetic information, status as a protected veteran, or status as a qualified individual with disability. Banfield Pet Hospital complies with all applicable federal, state and local laws governing nondiscrimination in employment in every Banfield location. #FT
$93.7k-128.8k yearly 60d+ ago
Sr. Data Engineer
Concora Credit
Data engineer job in Beaverton, OR
As a Sr. Data Engineer, you'll help drive Concora Credit's Mission to enable customers to Do More with Credit - every single day.
The impact you'll have at Concora Credit:
We are seeking a Sr. Data Engineer with deep expertise in Azure and Databricks to lead the design, development, and optimization of scalable data pipelines and platforms. You'll be responsible for building robust data solutions that power analytics, reporting, and machine learning across the organization using Azure cloud services and Databricks.
We hire people, not positions. That's because, at Concora Credit, we put people first, including our customers, partners, and Team Members. Concora Credit is guided by a single purpose: to help non-prime customers do more with credit. Today, we have helped millions of customers access credit. Our industry leadership, resilience, and willingness to adapt ensure we can help our partners responsibly say yes to millions more. As a company grounded in entrepreneurship, we're looking to expand our team and are looking for people who foster innovation, strive to make an impact, and want to Do More! We're an established company with over 20 years of experience, but now we're taking things to the next level. We're seeking someone who wants to impact the business and play a pivotal role in leading the charge for change.
Responsibilities
As our Sr. Data Engineer, you will:
Design and develop scalable, efficient data pipelines using Azure Databricks
Build and manage data ingestion, transformation, and storage solutions leveraging Azure Data Factory, Azure Data Lake, and Delta Lake
Implement CI/CD for data workflows using tools like Azure DevOps, Git, and Terraform
Optimize performance and cost efficiency across large-scale distributed data systems
Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable, reusable datasets
Provide guidance and mentor junior engineers and actively contribute to data platform best practices
Monitor, troubleshoot, and optimize existing pipelines and infrastructure to ensure reliability and scalability
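A minimal sketch of the kind of Delta Lake ingestion described above, assuming a Databricks runtime where Delta is available; the storage paths and columns are invented for illustration:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_ingest").getOrCreate()

# Hypothetical raw source on Azure Data Lake, stamped with the load date.
incoming = (
    spark.read.json("abfss://raw@account.dfs.core.windows.net/payments/2024-01-01/")
         .withColumn("ingest_date", F.current_date())
)

# Append into a Delta table partitioned by ingest date; Delta adds ACID
# writes and time travel on top of the data lake files.
(incoming.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("abfss://curated@account.dfs.core.windows.net/payments/"))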
These duties must be performed with or without reasonable accommodation.
We know experience comes in many forms and that many skills are transferable. If your experience is close to what we're looking for, consider applying. Diversity has made us the entrepreneurial and innovative company that we are today.
Qualifications
Requirements:
5+ years of experience in data engineering, with a strong focus on Azure cloud technologies
Experience with Azure Databricks, Azure Data Lake, and Azure Data Factory, including PySpark, SQL, Python, and Delta Lake
Strong proficiency in Databricks and Apache Spark
Solid understanding of data warehousing, ETL/ELT, and data modeling best practices
Experience with version control, CI/CD pipelines, and infrastructure as code
Knowledge of Spark performance tuning, partitioning, and job orchestration
Excellent problem-solving skills and attention to detail
Strong communication and collaboration abilities across technical and non-technical teams
Ability to work independently and lead in a fast-paced, agile environment
Passion for delivering clean, high-quality, and maintainable code
Preferred Qualifications:
Experience with Unity Catalog, Databricks Workflows, and Delta Live Tables
Familiarity with DevOps practices or Terraform for Azure resource provisioning
Understanding of data security, RBAC, and compliance in cloud environments
Experience integrating Databricks with Power BI or other analytics platforms
Exposure to real-time data processing using Kafka, Event Hubs, or Structured Streaming
What's In It For You:
Medical, Dental and Vision insurance for you and your family
Relax and recharge with Paid Time Off (PTO)
6 company-observed paid holidays, plus 3 paid floating holidays
401k (after 90 days) plus employer match up to 4%
Pet Insurance for your furry family members
Wellness perks including onsite fitness equipment at both locations, EAP, and access to the Headspace App
We invest in your future through Tuition Reimbursement
Save on taxes with Flexible Spending Accounts
Peace of mind with Life and AD&D Insurance
Protect yourself with company-paid Long-Term Disability and voluntary Short-Term Disability
Concora Credit provides equal employment opportunities to all Team Members and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Employment-based visa sponsorship is not available for this role.
Concora Credit is an equal opportunity employer (EEO).
Please see the Concora Credit Privacy Policy for more information on how Concora Credit processes your personal information during the recruitment process and, if applicable, based on your location, how you can exercise your privacy rights. If you have questions about this privacy notice or need to contact us in connection with your personal data, including any requests to exercise your legal rights referred to at the end of this notice, please contact caprivacynotice@concoracredit.com.
$84k-118k yearly est. Auto-Apply 60d+ ago
Sr. Hadoop Developer
Bridge Tech 4.2
Data engineer job in Beaverton, OR
Job Description
Typically requires a Bachelor's Degree and a minimum of 5 years of directly relevant work experience. Client is embarking on a big data platform in Consumer Digital using a Hadoop Distributed File System cluster. As a Sr. Hadoop developer, you will work with a variety of talented client teammates and be a driving force for building solutions. You will be working on development projects related to commerce and web analytics.
Responsibilities:
•Design and implement MapReduce jobs to support distributed processing using Java, Cascading, Python, Hive, and Pig; ability to design and implement end-to-end solutions.
•Build libraries, user defined functions, and frameworks around Hadoop
•Research, evaluate and utilize new technologies/tools/frameworks around Hadoop eco system
•Develop user-defined functions to provide custom Hive and Pig capabilities
•Define and build data acquisitions and consumption strategies
•Define & develop best practices
•Work with support teams in resolving operational & performance issues
•Work with architecture/engineering leads and other teams on capacity planning
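To illustrate the MapReduce-with-Python responsibility above, here is a classic Hadoop Streaming sketch; the tab-separated log layout is an assumption. The mapper emits one count per page view and the reducer totals counts per page, relying on Hadoop delivering keys to the reducer in sorted order.

#!/usr/bin/env python
# mapper.py: emit one (page, 1) pair per web-analytics log line.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 2:  # assumed layout: timestamp <TAB> page_url <TAB> ...
        print(f"{fields[1]}\t1")

#!/usr/bin/env python
# reducer.py: sum counts per page with a running total per sorted key.
import sys

current, total = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = key, 0
    total += int(value)
if current is not None:
    print(f"{current}\t{total}")

A hypothetical invocation (the streaming jar path varies by distribution): hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input /logs/pageviews/ -output /counts/pageviews/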
Qualifications:
•MS/BS degree in a computer science field or related discipline
•6+ years' experience in large-scale software development
•1+ year experience in Hadoop
•Strong Java programming, shell scripting, Python, and SQL
•Strong development skills around Hadoop, MapReduce, Hive, Pig, Impala
•Strong understanding of Hadoop internals
•Good understanding of Avro, JSON, and other compression formats
•Experience with build tools such as Maven
•Experience with databases like Oracle
•Experience with performance/scalability tuning, algorithms and computational complexity
•Experience (at least familiarity) with data warehousing, dimensional modeling and ETL development
•Ability to understand ERDs and relational database schemas
•Proven ability to work with cross-functional teams to deliver appropriate resolutions
Nice to have:
•Experience with open source NOSQL technologies such as HBase and Cassandra
•Experience with messaging & complex event processing systems such as Kafka and Storm
•Machine learning framework
•Statistical analysis with Python, R or similar
Additional Information
All your information will be kept confidential according to EEO guidelines.
$90k-118k yearly est. 60d+ ago
Big Data Engineer / Architect
Nitor Infotech
Data engineer job in Portland, OR
The hunt is for a strong Big Data professional, a team player with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within internal teams and with partner organizations and suppliers.
Role:
Big Data Engineer
Location:
Portland OR.
Duration:
Full Time
Skill Matrix:
Map Reduce - Required
Apache Spark - Required
Informatica PowerCenter - Required
Hive - Required
Apache Hadoop - Required
Core Java / Python - Highly Desired
Healthcare Domain Experience - Highly Desired
Job Description
Responsibilities and Duties
Participate in technical planning & requirements gathering phases including architectural design, coding, testing, troubleshooting, and documenting big data-oriented software applications.
Responsible for the ingestion, maintenance, improvement, cleaning, and manipulation of data in the business's operational and analytics databases, and troubleshoots any existent issues.
Implementation, troubleshooting, and optimization of distributed solutions based on modern big data technologies like Hive, Hadoop, Spark, Elastic Search, Storm, Kafka, etc., in both on-premise and cloud deployment models to solve large-scale processing problems
Design, enhance, and implement an ETL/data ingestion platform on the cloud.
Strong data warehousing skills, including data clean-up, ETL, ELT, and handling scalability issues for an enterprise-level data warehouse
Capable of investigating, familiarizing with, and mastering new data sets quickly
Strong troubleshooting and problem-solving skills in large data environment
Experience with building data platform on cloud (AWS or Azure)
Experience in using Python, Java, or any other language to solve data problems
Experience in implementing SDLC best practices and Agile methods.
Qualifications
Required Skills:
Data architecture/ Big Data/ ETL environment
Experience with ETL design using tools Informatica, Talend, Oracle Data Integrator (ODI), Dell Boomi or equivalent
Big Data & Analytics solutions: Hadoop, Pig, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.)/Azure (HDInsight, Data Lake Design)
Building and managing hosted big data architecture, toolkit familiarity in: Hadoop with Oozie, Sqoop, Pig, Hive, HBase, Avro, Parquet, Spark, NiFi
Foundational data management concepts - RDM and MDM
Experience in working with JIRA/Git/Bitbucket/JUNIT and other code management toolsets
Strong hands-on knowledge of/using solutioning languages like: Java, Scala, Python - any one is fine
Healthcare Domain knowledge
Required Experience, Skills and Qualifications
Qualifications:
Bachelor's Degree with a minimum of 6 to 9+ years' relevant experience, or equivalent.
Extensive experience in data architecture/Big Data/ ETL environment.
Additional Information
All your information will be kept confidential according to EEO guidelines.
$84k-118k yearly est. 1d ago
Sr. Data Engineer
It Vision Group
Data engineer job in Portland, OR
Job Description
Title: Sr. Data Engineer
Duration: 12 Months+
Roles & Responsibilities
Perform data analysis according to business needs
Translate functional business requirements into high-level and low-level technical designs
Design and implement distributed data processing pipelines using Apache Spark, Apache Hive, Python, and other tools and languages prevalent in a modern analytics platform
Create and schedule workflows using Apache Airflow or similar job orchestration tooling
Build utilities, functions, and frameworks to better enable high-volume data processing
Define and build data acquisitions and consumption strategies
Build and incorporate automated unit tests, participate in integration testing efforts
Work with teams to resolve operational & performance issues
Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and followed.
Tech Stack
Apache Spark
Apache Spark Streaming using Apache Kafka
Apache Hive
Apache Airflow
Python
AWS EMR and S3
Snowflake
SQL
Other Tools & Technologies: PyCharm, Jenkins, GitHub.
Apache Nifi (Optional)
Scala (Optional)
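As a hedged sketch of the Spark Structured Streaming and Kafka pairing in the stack above (the broker address, topic, and event schema are invented):

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

# Invented event schema, for illustration only.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("event_time", TimestampType()),
])

# Read JSON events from a Kafka topic and parse the value column.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Count orders per 5-minute window; the console sink is a stand-in for a
# real target such as S3 parquet or Snowflake.
query = (events.groupBy(F.window("event_time", "5 minutes")).count()
    .writeStream
    .outputMode("complete")
    .format("console")
    .start())
query.awaitTermination()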
$84k-118k yearly est. 14d ago
Senior Data Engineer
Advance Local 3.6
Data engineer job in Portland, OR
**Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with data product teams across business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
The base salary range is $120,000 - $140,000 per year.
**What you'll be doing:**
+ Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
+ Partner with platform owners across business units to establish and maintain data integrations from third party systems into the central data platform.
+ Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
+ Design and implement API integrations and event-driven data flows to support real time and batch data requirements.
+ Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
+ Partner with the Data Architect and data product to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
+ Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
+ Support rapid prototyping of new data products in collaboration with data product by building flexible, reusable data infrastructure components.
+ Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
+ Collaborate with data product, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
+ Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources.
+ Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks.
+ Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
+ Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
**Our ideal candidate will have the following:**
+ Bachelor's degree in computer science, engineering, or a related field
+ Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
+ Expert proficiency in Snowflake data engineering patterns
+ Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
+ Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
+ Proven ability to work with third party APIs, webhooks, and data exports
+ Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
+ Proven ability to design and implement API integrations and event-driven architecture
+ Experience with data modeling, data warehousing, and ETL processes at scale
+ Advanced proficiency in Python and SQL for data pipeline development
+ Experience with data orchestration tools (Airflow, dbt, Snowflake Tasks)
+ Strong understanding of data security, access controls, and compliance requirements
+ Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
+ Excellent problem-solving skills and attention to detail
+ Strong communication and collaboration skills
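One hedged sketch of the event-driven, API-driven ingestion this posting describes: an AWS Lambda handler (assumed to sit behind API Gateway) that lands third-party webhook payloads in S3, where a downstream Snowflake loader such as Snowpipe could pick them up. The bucket name and payload fields are invented.

import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # API Gateway proxy integration delivers the webhook body as a JSON string.
    body = json.loads(event["body"])
    key = f"webhooks/{body['source']}/{context.aws_request_id}.json"
    s3.put_object(
        Bucket="example-raw-data-bucket",  # invented bucket name
        Key=key,
        Body=json.dumps(body).encode("utf-8"),
    )
    return {"statusCode": 202, "body": json.dumps({"stored": key})}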
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** .
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext.
_Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._
_If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
$120k-140k yearly 48d ago
Data Architect
The Weir Group PLC
Data engineer job in Portland, OR
Weir ESCO, Portland, OR (Remote)
Purpose of Role: Weir requires an experienced Architect to join the BI & Analytics team, tasked with leading a successful, mature BI program into modern architecture and technologies. This role will work with system administrators, analysts, developers, and other architects globally to design architectural solutions that leverage the best components of our existing architecture in modern technology and expand our environment to meet the growing needs of GenAI and AI. Additionally, the BI Data Architect will participate as a key member of a high-performing global team operating under Scaled Agile. The role is within a globally distributed team, and the ideal candidate will be highly collaborative, a self-starter, a champion for the team, and passionate about delivering quality solutions. We are all committed to working together and collaborating with our Global Weir family. The management of our own workload and time is essential, and we work flexibly to meet our individual and team needs. We trust that by working together and supporting each other we are able to undertake the best work of our lives.
Why choose Weir:
Be part of a global organization dedicated to building a better future: At Weir, the growing world depends on us. It depends on us constantly reinventing, quickly adapting and continually finding better, faster, more sustainable ways to access the resources it needs to thrive. And it depends on each of us doing the best work of our lives. It's a big challenge - but it is exciting.
An opportunity to grow your own way: Everything moves fast in the dynamic world of Weir. This creates opportunities for us to take on new challenges, explore new areas, learn, progress and excel. Best of all, there is no set path that our people must take. Instead, everyone is given the support and freedom to tailor-make their own career and do the best work of their lives.
Feel empowered to be yourself and belong: Weir is a welcoming, inclusive place, where each individual's contribution is recognized and all employees are encouraged to innovate, collaborate and be themselves. We continually focus on people and their wellbeing. We believe in fairness and choose to be honest, transparent and authentic in everything we do.
Key Responsibilities:
* Architect Scalable Solutions: Translate business needs and enterprise guidelines into robust data and solution architectures. Own conceptual and enterprise data models and ensure alignment with best practices.
* Collaborate Across Teams: Work closely with business stakeholders, data stewards, architects, engineers, and analysts to deliver high-quality projects. Perform gap analysis between current and target states to guide execution.
* Lead Integration & Governance: Manage enterprise integrations and support development of secure data ingestion, sharing, and governance processes, including IoT and third-party data.
* Hands-On Technical Leadership: Provide guidance to data engineers on writing maintainable, testable code; develop proof-of-concepts; troubleshoot and optimize systems to meet SLAs and drive continuous improvement.
* Drive Lifecycle & Roadmap: Oversee technology roadmap, track technical debt, and support Agile planning, product roadmaps, and risk management. Participate in end-to-end lifecycle from ideation to deployment.
* Maintain Expertise & Innovation: Stay current on data and analytics trends, compliance requirements, and emerging technologies. Collaborate with vendors and contribute to enterprise architecture and strategy discussions.
* Safety First: Demonstrate 100% commitment to our zero harm behaviours in support of our drive towards developing a world class safety culture.
Job Knowledge/Education and Qualifications:
* 10+ years of experience in Database Administration, 5+ years of experience in Microsoft SQL Server technologies (e.g., SQL, SSIS, and SSAS), and 3+ years of experience with Oracle ERP
* In-depth experience designing and implementing information systems in a cloud environment, preferably Microsoft Azure
* Growth-Oriented: Demonstrates a strong learning mindset and adaptability to new technologies.
* Proactive & Independent: Self-starter who can work effectively with minimal supervision.
* Educational Background: Bachelor's degree in Computer Science or equivalent professional experience.
* Organized & Efficient: Exceptional time-management skills with the ability to prioritize multiple, competing tasks.
* Experience with data modeling and database development tools; familiarity with DBAmp for Salesforce integration; migrating from on-prem to cloud technologies; working with modern platforms such as Collibra, Databricks, or Microsoft Fabric; ability to create API integrations; and exposure to GenAI, AI/ML technologies.
Founded in 1871, Weir is a world leading engineering business with a purpose to make mining operations smarter, more efficient and sustainable. Thanks to Weir's technology, our customers can produce essential metals and minerals using less energy, water and waste at lower cost. With the increasing need for metals and minerals for climate change solutions, Weir colleagues are playing their part in powering a low carbon future. We are a global family of 11,000 uniquely talented people in over 60 countries, inspiring each other to do the best work of our lives.
For additional information about what it is like to work at Weir, please visit our Career Page and LinkedIn Life Page.
Weir is committed to an inclusive and diverse workplace. We are an equal opportunity employer and do not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, veteran status, disability, age, or any other legally protected status.
#esco
#LI-remote
#LI-LK1
Procom is a leading provider of professional IT services and staffing to businesses and governments in Canada. With revenues over $500 million, the Branham Group has recognized Procom as the 3rd largest professional services firm in Canada and is now the largest “Canadian-Owned” IT staffing/consulting company.
Procom's areas of staffing expertise include:
• Application Development
• Project Management
• Quality Assurance
• Business/Systems Analysis
• Datawarehouse & Business Intelligence
• Infrastructure & Network Services
• Risk Management & Compliance
• Business Continuity & Disaster Recovery
• Security & Privacy
Specialties:
• Contract Staffing (Staff Augmentation)
• Permanent Placement (Staff Augmentation)
• ICAP (Contractor Payroll)
• Flextrack (Vendor Management System)
Job Description
Consolidating data from multiple online transactional systems, scheduling tools, and defect management tools, gathering and dropping info in a data warehouse, then from there creating an online data tool. Need a data warehouse BI person for architecting the data warehouse environment and building and extracting loads. The group needs small applications built to gather data, and the person needs to grow the value of this, identifying what the group wants and growing the entire deliverable's potential as a model for other groups in the future.
Job Duties: The candidate will possess SharePoint, .NET, Java, and MS SQL skills and will apply those skills to create/extend ETL, SQL, and application code for SharePoint Business Intelligence and web applications. The candidate will also troubleshoot, debug, identify, and correct problems related to the import and presentation of ETL data.
Qualifications
Strong development background in SharePoint BI or other Business Intelligence application.
Experienced in developing stored procedures and SSIS packages, with advanced data development skills.
Solid software development skills including Java, JavaScript, HTML, T-SQL, CSS, XML, ASP.NET
3-5 years of recent experience with the required skills and 7-10 years of overall experience with all tools.
Degree Type: BS in a relevant field (CS, Engineering, etc.)
Additional Information
$86k-117k yearly est. 60d+ ago
Big Data Architect
Testingxperts 4.0
Data engineer job in Hillsboro, OR
Greetings for the day! My name is Suneetha from Testing Xperts; we are a global staffing, consulting, and technology solutions company, offering industry-specific solutions to our Fortune 500 clients and worldwide corporations.
Role: Big Data Architect
Location: Hillsboro, OR
Duration: 6+ Months
Job Description:
· Skills: Python, Spark, Map Reduce, Hive, Pig, HBase, Sqoop; hands-on experience in Python and Spark.
· At least 2 years of hands-on experience in the Big Data ecosystem: HDFS, Map Reduce, Hive, HBase, etc.
· Experience working on large data sets; working experience with large-scale Hadoop environments.
· Build and support designs for ingesting large data sets into Hadoop ecosystems, validating and enriching the data, and distributing significant amounts of that data to SQL and to other applications.
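A minimal PySpark sketch of the ingest-validate-distribute flow in the last bullet; the HDFS path, JDBC URL, and credentials are placeholders, and the Postgres JDBC driver is assumed to be on the Spark classpath.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hdfs_to_sql").getOrCreate()

# Placeholder landing path on HDFS.
raw = spark.read.csv("hdfs:///data/landing/events/", header=True)

# Minimal validation and enrichment before distribution.
clean = (raw.filter(F.col("event_id").isNotNull())
            .withColumn("load_ts", F.current_timestamp()))

# Distribute the curated slice to a downstream SQL database over JDBC.
(clean.write.format("jdbc")
    .option("url", "jdbc:postgresql://reporting-db:5432/analytics")
    .option("dbtable", "events_curated")
    .option("user", "loader")
    .option("password", "change-me")
    .mode("append")
    .save())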
Qualifications
Graduate
Additional Information
All your information will be kept confidential according to EEO guidelines.
$97k-136k yearly est. 1d ago
Google Cloud Data & AI Engineer
Slalom 4.6
Data engineer job in Portland, OR
Who You'll Work With
As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.
What You'll Do
* Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more.
* Implement cloud-based data solutions for data ingestion, transformation, and storage; and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
* Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
* Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
* Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
* Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
* Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.
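For a concrete taste of the BigQuery work in the list above, a minimal google-cloud-bigquery sketch; the project, dataset, and table are invented, and application-default credentials are assumed.

from google.cloud import bigquery

# Invented project id; authentication uses application-default credentials.
client = bigquery.Client(project="example-project")

sql = """
    SELECT channel, COUNT(*) AS sessions
    FROM `example-project.analytics.sessions`
    WHERE session_date = CURRENT_DATE()
    GROUP BY channel
    ORDER BY sessions DESC
"""

# BigQuery runs the query serverlessly; result() blocks until it finishes.
for row in client.query(sql).result():
    print(row.channel, row.sessions)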
What You'll Bring
* Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
* Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
* Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
* Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
* Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
* Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
* Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
* Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.
East Bay, San Francisco, Silicon Valley:
* Consultant: $114,000-$171,000
* Senior Consultant: $131,000-$196,500
* Principal: $145,000-$217,500
San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
* Consultant: $105,000-$157,500
* Senior Consultant: $120,000-$180,000
* Principal: $133,000-$199,500
All other locations:
* Consultant: $96,000-$144,000
* Senior Consultant: $110,000-$165,000
* Principal: $122,000-$183,000
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We are accepting applications until 12/31.
$145k-217.5k yearly 12d ago
Azure Data Architect - C60720
CapB Infotek
Data engineer job in Beaverton, OR
For one of our long-term, multiyear projects, we are looking for an Azure Data Architect out of Beaverton, OR.
Skills:
10+ years of experience designing, architecting, and implementing large-scale data processing, data storage, and data distribution systems, particularly in Azure
2+ years of recent experience delivering large-scale Azure projects
5+ years of real-time and streaming experience in Azure-based data solutions
3+ years in presales and demos using Azure data services, including ADW and ADLS
5+ years of demonstrated experience, including in the most recent 2+ years, designing and delivering solutions using the Cortana Intelligence suite of analytics services in Microsoft Azure, including Azure Data Lake Analytics, Azure Data Warehouse, Streaming Analytics, and Data Catalog
Experience designing the service management, orchestration, monitoring, and management requirements of a cloud platform
3+ years of experience migrating large volumes of data from on-premise and cloud infrastructure to Azure using standard Azure automation tools
Ability to produce high-quality work products under pressure and within deadlines, with specific references
Very strong communication, solution, and client-facing skills, especially with non-technical business users
5+ years of working in a large multi-vendor environment with multiple teams and people as part of the project
2+ years of working with Power BI
5+ years of working with a complex Big Data environment using Microsoft tools
5+ years of experience with Team Foundation Server, JIRA, GitHub, and other code management toolsets
Preferred Skills and Education:
Master's degree in Computer Science or related field
Certification in Azure platform
$92k-128k yearly est. 60d+ ago
Senior Data Architect
Wizeline 4.3
Data engineer job in Beaverton, OR
Senior Data Architect
The Company
Wizeline is a global digital services company helping mid-size to Fortune 500 companies build, scale, and deliver high-quality digital products and services. We thrive in solving our customers' challenges through human-centered experiences, digital core modernization, and intelligence everywhere (AI/ML and data). We help them succeed in building digital capabilities that bring technology to the core of their business.
Your Day-to-Day
IMPORTANT: This position is based in Beaverton, Oregon, and requires office attendance.
Partner with information architecture to build logical/canonical, dimensional, and physical data models for Databricks, Snowflake, and Hive big data implementations to support data, BI analytics, and AI/ML products aligned with conceptual data models (see the sketch after this list)
Work with product teams to define data sources by partnering with information and enterprise architecture.
Work with product teams to define Key Performance Indicators and supporting data attributes and business rules
Work with product teams to ensure data meets business requirements
Optimize data designs and rationalize data objects across data products to reduce data redundancies and duplication
Analyze existing data stores and interfaces to support current state and future data models
Work with engineering squads to create physical data designs and business rule implementation
Partner with enterprise data management and strategy on business glossary implementation, governance, and stewardship
Work with report developers to align dataset design with report design
Partner with information architecture to provide feedback on conceptual models and review logical and physical data models
Partner with information architecture on naming standards
Provide data consistency across multiple scrum teams within a capability
Facilitate data requirements elicitation and modeling sessions, including the use of interviews, document analysis, workshops, etc.
Provide guidance and partner with squads by applying knowledge of data design
Assist with the development and enforcement of data modeling standards
Participate in data definition and data management reviews
Partner with source system on new initiatives and enhancements
Interface with transformational initiatives
Partner with engineering and information architect on integrated/aggregated layer guidelines
Establish and maintain processes to support and grow data modelling practices
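As a hedged example of the dimensional modelling this role describes, here is a minimal PySpark sketch that derives a date dimension and a sales fact table on a Databricks-style lakehouse. The table and column names are assumptions made for illustration only.

```python
# Minimal star-schema sketch on Spark: a date dimension plus a sales fact.
# Source/target table names and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dim-model-sketch").getOrCreate()

orders = spark.table("raw.orders")  # assumed raw source table

# Dimension: one row per calendar date, keyed by a yyyyMMdd surrogate key.
dim_date = (
    orders.select(F.col("order_date").alias("date"))
    .distinct()
    .withColumn("date_key", F.date_format("date", "yyyyMMdd").cast("int"))
    .withColumn("year", F.year("date"))
    .withColumn("month", F.month("date"))
)
dim_date.write.mode("overwrite").saveAsTable("analytics.dim_date")

# Fact: order-grain measures, joined to the dimension via the surrogate key.
fact_sales = (
    orders
    .withColumn("date_key", F.date_format("order_date", "yyyyMMdd").cast("int"))
    .select("order_id", "date_key", "customer_id", "amount")
)
fact_sales.write.mode("overwrite").saveAsTable("analytics.fact_sales")
```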
Are You a Fit?
Sounds awesome, right? Now, let's make sure you're a good fit for the role.
Must-have Skills
Bachelor's degree in Business, Computer Science or equivalent
7+ years of experience working in technology, preferably in BI and Analytics
5 years' experience in Data Modelling and design
5 years of experience in data analysis, business analysis or system analysis
Experience using CA Erwin, Hackolade, or another similar modelling tool
Strong knowledge of relational and dimensional data modelling concepts
Strong knowledge of data architecture and database structure
Strong data analysis skills
Strong SQL skills
Experience with big data and cloud with Databricks & Snowflake is preferred
Experience, interest, and adaptability to working in an Agile delivery environment
Ability to work in a fast-paced environment where change is a constant
Ability to handle ambiguous requirements
Exceptional interpersonal and communication skills (written and verbal)
Self-motivated and collaborative
About Us
Wizeline prioritizes a culture of diversity and development for its nearly 2,000 person team spread across the globe. We believe great technology comes from a mix of talents and perspectives. Our core values of ownership, innovation, community, and inclusivity are central to our work. Wizeline is invested in its employees' growth, offering opportunities to create personalized career paths and develop in-demand skills. We even have a free education program, Wizeline Academy, to help both employees and the broader community upskill in tech.
Apply now!
$128k-169k yearly est. 60d+ ago
Data Architect
Feed The Children 4.1
Data engineer job in Portland, OR
At Feed the Children, we recognize the value of outstanding people, and we are looking for compassionate changemakers to join our team. We pride ourselves on cultivating a collaborative workplace where employees experience productive and rewarding employment and feel engaged in our mission to end childhood hunger. Our passionate team shares a deep sense of purpose, and we dream big to solve complex problems and create positive impact in communities around the world.
Feed the Children is recognized by Candid with its Platinum Seal of Transparency and is accredited by the BBB Wise Giving Alliance. The organization has received a 4-star rating from Charity Navigator and is consistently recognized on the Forbes Top 100 Charities list.
We are currently in search of a Data Architect to join our Information Technology team! The Data Architect is responsible for designing and managing Feed the Children's modern cloud data infrastructure, with ownership of Microsoft Fabric and Azure, and enablement of Purview, Power Platform, Copilot, and Dynamics 365 integration. This role will lead the stand up and evolution of Feed the Children's Fabric platform and serve as its primary Product Owner, ensuring it is scalable, governed, secure, and AI-ready. The Data Architect will also collaborate closely with ERP and CRM teams, and coordinate with Data Governance leadership to align data architecture with Microsoft Purview and governance policies. The ideal candidate combines deep technical expertise with strategic thinking, strong collaboration, and leadership skills. This position will report directly to the Vice President of Business Intelligence.
Salary range: $115K-$120K (commensurate with experience)
Note: Although our corporate office is located in Oklahoma City, OK, qualified candidates are being considered nationwide for this remote opportunity.
Job Requirements:
Education
Bachelor's or Master's degree in Information Technology, Computer Science, Data Science, Information Systems, or a related technical field, preferred.
Experience
7+ years of related professional experience; 3+ years of experience in data architecture or enterprise data engineering; and 2+ years of experience working with Azure data services (e.g., Data Factory, Synapse, Data Lake).
Deep understanding of: Microsoft Fabric, Dynamics 365 (especially SCM and CS), Power Platform, Power BI, Dataverse, Purview, SQL, Python, Spark, DAX, ETL/ELT processes, data modeling, Copilot, and Git and CI/CD practices.
Familiarity with data retention, auditability, and regulatory compliance (e.g., GDPR, HIPAA, CCPA).
Experience with Agile/Scrum methodologies, and Product Ownership.
Any combination of education, training and experience which provides the required knowledge, skills and abilities to perform the essential functions of this job may be considered.
Bonus Qualifications:
* Experience working in a mission-driven enterprise, especially in global health and development, with complex supply chain, community impact, donation, and volunteering programs.
Licenses and Certifications
Credentials in relevant Microsoft technologies a plus.
Essential Functions:
Data Strategy & Architecture:
* Define and maintain the enterprise data architecture roadmap, aligning with business goals and future scalability.
* Design and implement robust architecture in Microsoft Fabric, including lakehouses, warehouses, Notebooks, pipelines, and semantic models.
* Lead the development of data models, ETL/ELT processes, and data lake/warehouse structures to support analytics and AI (see the sketch after these bullets).
* Manage all Azure resources and oversee the digital transformation required between Azure and Fabric, and from server-based to cloud architecture.
* Lead the establishment and documentation of technology direction and standards for data platforms, involving all aspects of information access and retrieval, integration, middleware translators, utilities, tools, and languages for use by information technology groups.
* Establish and enforce data architecture standards, including modeling conventions, naming standards, and documentation practices.
* Continuously assess and optimize data architecture to improve quality, performance, scalability, cost management, and efficiency.
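As a hedged illustration of the lakehouse ETL/ELT work these bullets describe, the sketch below shows a bronze-to-silver refinement step in PySpark, as it might run in a Microsoft Fabric notebook. The paths, schema, and business rules are assumptions for illustration, not Feed the Children's actual design.

```python
# Minimal bronze-to-silver ELT step in a lakehouse. Paths and column
# names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw donation records landed as-is from the source system.
bronze = spark.read.format("delta").load("Tables/bronze_donations")

# Silver: typed, deduplicated, lightly conformed records ready for modeling.
silver = (
    bronze
    .withColumn("donation_date", F.to_date("donation_date"))
    .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
    .dropDuplicates(["donation_id"])
    .filter(F.col("amount") > 0)  # assumed data-quality rule
)
silver.write.format("delta").mode("overwrite").save("Tables/silver_donations")
```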
Data Integration & Interoperability:
* Ensure seamless integration of data and data flows from various sources, including ERP systems, CRM systems, DHIS2, external datasets and APIs, and other business applications, with the support of data management and governance leads.
Data Governance & Management:
* Following the direction of enterprise data management and governance councils, implement processes and tools to ensure high data quality and consistency across the organization.
* Align with governance policies established in Microsoft Purview to govern, protect, and manage Feed the Children's data estate, ensuring compliance and risk management.
* Following the direction of enterprise data management councils, implement Master Data Management policies across data architecture to create a common view of master data and provide a centralized mechanism for its aggregation, cleansing, transformation, augmentation, validation, syndication, and access.
Managed File Transfer:
* Support Managed File Transfer processes, in accordance with security and data governance principles.
AI & Copilot Enablement:
* In collaboration with AI developers and leaders, enable data architecture to support AI agent development and Copilot experiences.
* Support the integration of AI agents, prompt engineering, and Azure OpenAI services into data workflows.
Collaboration:
* Work across all departments and business leaders to understand data needs and provide tailored, scalable solutions.
* Collaborate with developers, analysts, and contractors to ensure alignment with architectural standards and business goals.
* Participate in business intelligence and analytics initiatives, ensuring data solutions meet stakeholder needs.
Security:
* Implement best practices for data protection and privacy, ensuring compliance with regulatory requirements (e.g., GDPR, HIPAA).
* Collaborate with security and compliance teams to align data practices with enterprise risk policies.
* Implement role-based access control (RBAC) and encryption.
Project Management:
* Lead and manage architecture-related projects, including timelines, budgets, and resource allocation.
* Provide architectural oversight throughout project cycles to ensure development of efficient data systems utilizing established standards, procedures and methodologies.
* Manage projects implemented in collaboration with/by vendors and partners, including managing contractual and project management agreements and compliance.
Reporting & Communication:
* Generate and present reports on data usage, performance, and compliance.
* Communicate architectural decisions, roadmaps, and standards to technical and non-technical audiences.
Establish an environment of high performance and continuous improvement that values learning and a commitment to quality, welcomes and encourages collaboration, and fosters both intra- and inter-departmental dialogue and respect.
Model the type and level of behavior, professionalism and leadership that is in accordance with the values of the organization.
Perform other related duties as required.
About Feed the Children:
As a leading anti-hunger organization, Feed the Children is committed to ending childhood hunger. We provide children and families in the U.S. and around the world with the food and essentials kids need to grow and thrive.
Through our programs and partnerships, we feed children today while helping their families and communities build resilient futures. In addition to food, we distribute household and personal care items across the United States to help parents and caregivers maintain stable, food-secure households. Internationally, we expand access to nutritious meals, safe water, improved hygiene, and training in sustainable living. Responsible stewards of our resources, we are driven to pursue innovative, holistic, and child-focused solutions to the complex challenges of hunger, food insecurity, and poverty.
For children everywhere, we believe that having enough to eat is a fundamental right.
Our Values:
We are driven by a shared sense of PURPOSE. At Feed the Children, our commitment to the mission is at the heart of what we do and fuels our collective impact in the communities where we serve.
We cannot achieve our bold vision without our talented PEOPLE. We are passionate about fostering a best-in-class workforce that is engaged, respected, and empowered to deliver results.
We believe in CURIOSITY and continued learning. Success requires a culture of discovery, curiosity and continued learning to expand our knowledge, seek new perspectives and challenge the status quo.
We know COLLABORATION is the only way to end childhood hunger. We cannot succeed alone. It will take all of us - our employees, donors, partners, volunteers - working together to accomplish our ambitious goals.
We DREAM big. When we work together, we collectively reimagine what is possible. We dream big to solve complex problems and create deep impact in communities around the world.
We VALUE every donor. We respect our donors' intentions and promote responsible stewardship of the resources they entrust to us.
Join Feed the Children and help create a world where no child goes to bed hungry.
In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.
Feed the Children is an equal opportunity employer. All qualified candidates will receive consideration for positions without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, political affiliation, pregnancy, military and/or veterans' status, genetic characteristics, marital status or any other considerations made unlawful by applicable state, federal, or local law. Feed the Children welcomes and encourages applications from persons with physical and mental disabilities and will make every effort to reasonably accommodate the needs of those persons. Additionally, Feed the Children strives to provide an environment free from sexual exploitation and abuse and harassment in all places where relief and development programs are implemented. Feed the Children expects its employees to maintain high ethical standards, protect organizational integrity and reputation, and ensure that Feed the Children work is carried out in honest and fair methods, in alignment with the Feed the Children safeguarding and associated policies.
$115k-120k yearly 16d ago
Big Data Engineer / Architect
Nitor Infotech
Data engineer job in Portland, OR
The hunt is for a strong Big Data professional, a team player with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within internal teams and with partner organizations and suppliers.
Role: Big Data Engineer
Location: Portland, OR
Duration: Full Time
Skill Matrix:
MapReduce - Required
Apache Spark - Required
Informatica PowerCenter - Required
Hive - Required
Apache Hadoop - Required
Core Java / Python - Highly Desired
Healthcare Domain Experience - Highly Desired
Job Description
Responsibilities and Duties
Participate in technical planning and requirements-gathering phases, including architectural design, coding, testing, troubleshooting, and documenting big data-oriented software applications.
Responsible for the ingestion, maintenance, improvement, cleaning, and manipulation of data in the business's operational and analytics databases, and troubleshooting any existing issues.
Implement, troubleshoot, and optimize distributed solutions based on modern big data technologies like Hive, Hadoop, Spark, Elasticsearch, Storm, Kafka, etc., in both on-premise and cloud deployment models to solve large-scale processing problems.
Design, enhance, and implement ETL/data ingestion platforms on the cloud.
Strong data warehousing skills, including data clean-up, ETL, ELT, and handling scalability issues for an enterprise-level data warehouse.
Capable of investigating, familiarizing with, and mastering new data sets quickly.
Strong troubleshooting and problem-solving skills in large data environments.
Experience building data platforms on the cloud (AWS or Azure).
Experience using Python, Java, or any other language to solve data problems.
Experience in implementing SDLC best practices and Agile methods.
Qualifications
Required Skills:
Data architecture / Big Data / ETL environment
Experience with ETL design using tools such as Informatica, Talend, Oracle Data Integrator (ODI), Dell Boomi, or equivalent
Big Data & Analytics solutions: Hadoop, Pig, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.) / Azure (HDInsight, Data Lake design)
Building and managing hosted big data architecture, with toolkit familiarity in Hadoop with Oozie, Sqoop, Pig, Hive, HBase, Avro, Parquet, Spark, and NiFi (see the sketch after this list)
Foundational data management concepts - RDM and MDM
Experience working with JIRA/Git/Bitbucket/JUnit and other code management toolsets
Strong hands-on knowledge of solutioning languages such as Java, Scala, or Python - any one is fine
Healthcare Domain knowledge
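As a hedged illustration of the Spark/Hive/Parquet stack this list names, here is a minimal PySpark sketch that reads Parquet, aggregates with Spark SQL, and persists the result as a Hive table. The paths, table names, and columns are invented for the example.

```python
# Minimal Spark + Hive + Parquet sketch. Paths and names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("claims-aggregation")
    .enableHiveSupport()  # lets Spark read/write Hive metastore tables
    .getOrCreate()
)

# Ingest raw Parquet files (e.g., healthcare claims) into a temp view.
claims = spark.read.parquet("/data/raw/claims/")
claims.createOrReplaceTempView("claims")

# Aggregate with Spark SQL and persist as a managed Hive table.
summary = spark.sql("""
    SELECT provider_id,
           COUNT(*)           AS claim_count,
           SUM(billed_amount) AS total_billed
    FROM claims
    GROUP BY provider_id
""")
summary.write.mode("overwrite").saveAsTable("analytics.claims_by_provider")
```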
Required Experience, Skills and Qualifications
Qualifications:
Bachelor's degree with a minimum of 6 to 9+ years of relevant experience, or equivalent.
Extensive experience in a data architecture / Big Data / ETL environment.
Additional Information
All your information will be kept confidential according to EEO guidelines.
How much does a data engineer earn in Vancouver, WA?
The average data engineer in Vancouver, WA earns between $79,000 and $149,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Vancouver, WA
$108,000
What are the biggest employers of Data Engineers in Vancouver, WA?
The biggest employers of Data Engineers in Vancouver, WA are: