
Data engineer jobs in Pasco, WA

- 5,009 jobs
  • Sr. Databricks Data Engineer

    Artech L.L.C (3.4 company rating)

    Data engineer job in Portland, OR

    We are seeking a highly skilled Databricks Data Engineer with a minimum of 10 years of total experience, including strong expertise in the retail industry. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. This role requires proficiency in Python, SQL, cloud platforms, and ETL tools within a retail-focused data ecosystem.

    Key Responsibilities:
    • Design, develop, and maintain scalable data pipelines using Databricks and Snowflake.
    • Work with Python libraries such as Pandas, NumPy, PySpark, PyOdbc, PyMsSQL, Requests, Boto3, SimpleSalesforce, and JSON for efficient data processing.
    • Optimize and enhance SQL queries, stored procedures, triggers, and schema designs for RDBMS (MSSQL/MySQL) and NoSQL (DynamoDB/MongoDB/Redis) databases.
    • Develop and manage REST APIs to integrate various data sources and applications.
    • Implement AWS cloud solutions using AWS Data Exchange, Athena, CloudFormation, Lambda, S3, AWS Console, IAM, STS, EC2, and EMR.
    • Utilize ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, and Alteryx to orchestrate and automate data workflows.
    • Work with Hadoop and Hive for big data processing and analysis.
    • Collaborate with cross-functional teams to understand business needs and develop efficient data solutions that drive decision-making in the retail domain.
    • Ensure data quality, governance, and security across all data assets and pipelines.

    Required Qualifications:
    • 10+ years of total experience in data engineering and data processing.
    • 6+ years of hands-on experience in Python programming, specifically for data processing and analytics.
    • 4+ years of experience working with Databricks and Snowflake.
    • 4+ years of expertise in SQL development, performance tuning, and RDBMS/NoSQL databases.
    • 4+ years of experience in designing and managing REST APIs.
    • 2+ years of working experience in AWS data services.
    • 2+ years of hands-on experience with ETL tools like Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx.
    • 1+ years of experience with Hadoop and Hive.
    • Strong understanding of retail industry data needs and best practices.
    • Excellent problem-solving, analytical, and communication skills.

    Preferred Qualifications:
    • Experience with real-time data processing and streaming technologies.
    • Familiarity with machine learning and AI-driven analytics.
    • Certifications in Databricks, AWS, or Snowflake.

    This is an exciting opportunity to work on cutting-edge data engineering solutions in a fast-paced retail environment. If you are passionate about leveraging data to drive business success and innovation, we encourage you to apply!
    $99k-141k yearly est. 5d ago
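The listing above centers on Databricks pipelines built with PySpark. For readers weighing what that work looks like day to day, here is a minimal sketch (ours, not code from the posting; the paths, table names, and columns are invented) of a batch pipeline that cleans raw retail orders and writes a Delta table:

```python
# Minimal PySpark sketch of the kind of Databricks pipeline the posting
# describes. Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("retail-orders-etl").getOrCreate()

# Read raw orders landed in cloud storage (hypothetical path).
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing and enrichment: drop malformed rows, derive a date column.
daily_revenue = (
    orders
    .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "store_id")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Write as a Delta table for downstream BI (Delta is Databricks' default format).
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_revenue")
```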
  • Data Engineer II: 25-07190

    Akraya, Inc. (4.0 company rating)

    Data engineer job in Seattle, WA

    Process Skills: Data Pipeline (Proficient), SQL (Expert), Python (Proficient), ETL (Expert), QuickSight (Intermediate)
    Contract Type: W2 Only
    Duration: 5+ months (high possibility of conversion)
    Pay Range: $65.00 - $70.00 per hour

    Summary: Join the Sustain.AI team as a Data Engineer to lead the pivotal migration of sustainability data to modern AWS infrastructure, fostering a net-zero future. This role involves transforming and automating sustainability workflows into data-driven processes, significantly impacting our environmental goals. Your work will directly contribute to reducing the carbon footprint by creating an automated, scalable data infrastructure and ensuring precise data quality and consistency.

    Key Responsibilities:
    • Migrate ETL jobs from legacy to AWS, ensuring no operational disruptions.
    • Set up and manage AWS data services (Redshift, Glue) and orchestrate workflows.
    • Transform existing workflows into scalable, efficient AWS pipelines with robust validation.
    • Collaborate with various teams to understand and fulfill data requirements for sustainability metrics.
    • Document new architecture, implement data quality checks, and communicate with stakeholders on progress and challenges.

    Must-Have Skills:
    • Advanced proficiency in ETL pipelines and AWS data services (Redshift, S3, Glue).
    • Expertise in SQL and experience with Python or Spark for data transformation.
    • Proven experience in overseeing data migrations with minimal disruption.

    Industry Experience Required: Experience in sustainability data, carbon accounting, or environmental metrics is highly preferred. Familiarity with large-scale data infrastructure projects in complex enterprise environments is essential.

    ABOUT AKRAYA: Akraya is an award-winning IT staffing firm consistently recognized for our commitment to excellence and a thriving work environment. Most recently, we were recognized among Inc.'s Best Workplaces (2024), Silicon Valley's Best Places to Work by the San Francisco Business Journal (2024), and Glassdoor's Best Places to Work (2023 & 2022)! As staffing solutions providers for Fortune 100 companies, Akraya's industry recognitions solidify our leadership position in the IT staffing space. We don't just connect you with great jobs; we connect you with a workplace that inspires! Join Akraya today! Let us lead you to your dream career and experience the Akraya difference. Browse our open positions and join our team!
    $65-70 hourly 4d ago
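Much of this migration role is orchestration: moving ETL jobs onto AWS Glue and watching them run. A minimal boto3 sketch of that operational loop, under the assumption of a hypothetical job name and region:

```python
# Sketch: kick off an AWS Glue ETL job and wait for it to finish, the kind of
# orchestration step a legacy-to-AWS migration involves. Job name and region
# are hypothetical.
import time
import boto3

glue = boto3.client("glue", region_name="us-west-2")

run = glue.start_job_run(JobName="sustainability-etl-migration")
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="sustainability-etl-migration", RunId=run_id)
    status = state["JobRun"]["JobRunState"]
    if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue job finished with state: {status}")
        break
    time.sleep(30)  # poll every 30 seconds
```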
  • Staff Data Engineer

    Eton Solution (3.7 company rating)

    Data engineer job in Bellevue, WA

    *Immigration sponsorship is not available in this role*

    We are looking for an experienced Data Engineer (8+ years of experience) with deep expertise in Flink SQL to join our engineering team. This role is ideal for someone who thrives on building robust real-time data processing pipelines and has hands-on experience designing and optimizing Flink SQL jobs in a production environment. You'll work closely with data engineers, platform teams, and product stakeholders to create scalable, low-latency data solutions that power intelligent applications and dashboards.

    Key Responsibilities:
    • Design, develop, and maintain real-time streaming data pipelines using Apache Flink SQL.
    • Collaborate with platform engineers to scale and optimize Flink jobs for performance and reliability.
    • Build reusable data transformation logic and deploy to production-grade Flink clusters.
    • Ensure high availability and correctness of real-time data pipelines.
    • Work with product and analytics teams to understand requirements and translate them into Flink SQL jobs.
    • Monitor and troubleshoot job failures, backpressure, and latency issues.
    • Contribute to internal tooling and libraries that improve Flink developer productivity.

    Required Qualifications:
    • Deep hands-on experience with Flink SQL and the Apache Flink ecosystem.
    • Strong understanding of event-time vs. processing-time semantics, watermarks, and state management.
    • 3+ years of experience in data engineering, with a strong focus on real-time/streaming data.
    • Experience writing complex Flink SQL queries, UDFs, and windowing operations.
    • Proficiency in working with streaming data formats such as Avro, Protobuf, or JSON.
    • Experience with messaging systems like Apache Kafka or Pulsar.
    • Familiarity with containerized deployments (Docker, Kubernetes) and CI/CD pipelines.
    • Solid understanding of distributed system design and performance optimization.

    Nice to Have:
    • Experience with other stream processing frameworks (e.g., Spark Structured Streaming, Kafka Streams).
    • Familiarity with cloud-native data stacks (AWS Kinesis, GCP Pub/Sub, Azure Event Hub).
    • Experience in building internal tooling for observability or schema evolution.
    • Prior contributions to the Apache Flink community or similar open-source projects.

    Why Join Us:
    • Work on cutting-edge real-time data infrastructure that powers critical business use cases.
    • Be part of a high-caliber engineering team with a culture of autonomy and excellence.
    • Flexible working arrangements with competitive compensation.
    $95k-134k yearly est. 2d ago
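Several of the qualifications above (event-time semantics, watermarks, windowing) have a compact expression in Flink SQL. A minimal PyFlink sketch, with a hypothetical Kafka topic and schema; this is an illustration of the concepts the posting names, not code from the listing:

```python
# Sketch of a Flink SQL job with event-time semantics: a watermark on the
# event timestamp and a 1-minute tumbling window. Topic, fields, and
# connector settings are hypothetical; PyFlink submits the SQL.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

t_env.execute_sql("""
    CREATE TABLE clicks (
        user_id STRING,
        url     STRING,
        ts      TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND  -- tolerate 5s of lateness
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'clicks',
        'properties.bootstrap.servers' = 'broker:9092',
        'properties.group.id' = 'clicks-agg',
        'format' = 'json'
    )
""")

# Tumbling 1-minute windows over event time (windowing TVF syntax).
result = t_env.sql_query("""
    SELECT window_start, user_id, COUNT(*) AS clicks
    FROM TABLE(TUMBLE(TABLE clicks, DESCRIPTOR(ts), INTERVAL '1' MINUTES))
    GROUP BY window_start, window_end, user_id
""")
# result.execute().print()  # submits the streaming job
```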
  • Azure Data Engineer

    Akkodis

    Data engineer job in Acme, WA

    Akkodis is seeking a Senior Azure Data Engineer for a 6-month contract in Acme, WA (onsite).

    Rate Range: $56.00/hr. - $60.00/hr. The rate may be negotiable based on experience, education, geographic location, and other factors.

    Job Summary: We are seeking an experienced Senior Azure Data Engineer with 6-8+ years of hands-on experience in Azure Data Factory (ADF) to design, build, and optimize scalable data integration and analytics solutions. The ideal candidate will have strong expertise in ETL/ELT pipelines, data modeling, Azure data services, and delivering enterprise-grade solutions that support reporting and analytics platforms such as Power BI.

    Key Responsibilities:
    • Experience in designing and developing solutions using Microsoft technologies.
    • Strong ability to deliver highly impactful solutions to problems using the latest technology.
    • Experience in all phases of the Software Development Life Cycle (SDLC) using Agile Scrum and Waterfall models.
    • Build data models for sales and inventory datasets using dimensional modeling principles, integrated with ADF pipelines and Power BI dashboards.
    • Experience designing and developing enterprise applications using MVC (Model View Controller) architecture.
    • Experienced in Data Extraction, Transformation, and Loading (ETL) between homogeneous and heterogeneous systems using SSIS, Azure Data Lake, Azure Data Factory, Azure Fabric, and Databricks.

    If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at ****************************.

    Equal Opportunity Employer/Veterans/Disabled. Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client. To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ******************************************

    The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
    • The California Fair Chance Act
    • Los Angeles City Fair Chance Ordinance
    • Los Angeles County Fair Chance Ordinance for Employers
    • San Francisco Fair Chance Ordinance
    $56-60 hourly 2d ago
  • Data Engineer

    OSI Engineering (4.6 company rating)

    Data engineer job in Seattle, WA

    A globally leading consumer device company based in Seattle, WA is looking for an ML Data Pipeline Engineer to join their dynamic team!

    Job Responsibilities:
    • Assist the team in building, maintaining, and running essential ML data pipelines.

    Minimum Requirements:
    • 5+ years of experience
    • Experience with cloud platforms (AWS, Azure, GCP)
    • Strong data quality focus: implementing data validation, cleansing, and transformation
    • Strong proficiency in Python and related data science libraries (e.g., pandas, NumPy)
    • Experience with distributed computing
    • Experience dealing with image-type and array-type data
    • Able to automate repetitive tasks and processes
    • Experience with scripting and automation tools
    • Able to execute tasks independently and creatively

    Languages: Fluent in English or Farsi
    Type: Contract
    Duration: 12 months with extension
    Work Location: Seattle, WA (hybrid)
    Pay Rate Range: $40.00 - $55.00 (DOE)
    $40-55 hourly 2d ago
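The posting's emphasis on data quality for image- and array-type data usually reduces to validating tensors before they enter an ML pipeline. A small illustrative sketch (the expected shape, dtypes, and checks are our invented assumptions, not the employer's spec):

```python
# Sketch of the "data quality focus" the posting mentions: validating
# array-type records before they enter an ML pipeline. Shapes, dtypes, and
# checks are hypothetical.
import numpy as np
import pandas as pd

EXPECTED_SHAPE = (224, 224, 3)  # e.g., RGB image tensors

def validate_image_batch(images: list) -> pd.DataFrame:
    """Return a per-record validation report for a batch of image arrays."""
    report = []
    for i, img in enumerate(images):
        report.append({
            "index": i,
            "shape_ok": img.shape == EXPECTED_SHAPE,
            "dtype_ok": img.dtype in (np.uint8, np.float32),
            # NaNs only make sense for float arrays; integer frames pass.
            "has_nan": bool(np.isnan(img).any()) if np.issubdtype(img.dtype, np.floating) else False,
            "all_black": bool((img == 0).all()),  # flag degenerate frames
        })
    return pd.DataFrame(report)

batch = [np.random.randint(0, 255, EXPECTED_SHAPE, dtype=np.uint8) for _ in range(4)]
print(validate_image_batch(batch))
```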
  • AWS Data Engineer

    Tata Consultancy Services (4.3 company rating)

    Data engineer job in Seattle, WA

    Must Have Technical/Functional Skills: We are seeking an experienced AWS Data Engineer to join our data team and play a crucial role in designing, implementing, and maintaining scalable data infrastructure on Amazon Web Services (AWS). The ideal candidate has a strong background in data engineering, with a focus on cloud-based solutions, and is proficient in leveraging AWS services to build and optimize data pipelines, data lakes, and ETL processes. You will work closely with data scientists, analysts, and stakeholders to ensure data availability, reliability, and security for our data-driven applications.

    Roles & Responsibilities:
    • Design and Development: Design, develop, and implement data pipelines using AWS services such as AWS Glue, Lambda, S3, Kinesis, and Redshift to process large-scale data.
    • ETL Processes: Build and maintain robust ETL processes for efficient data extraction, transformation, and loading, ensuring data quality and integrity across systems.
    • Data Warehousing: Design and manage data warehousing solutions on AWS, particularly with Redshift, for optimized storage, querying, and analysis of structured and semi-structured data.
    • Data Lake Management: Implement and manage scalable data lake solutions using AWS S3, Glue, and related services to support structured, unstructured, and streaming data.
    • Data Security: Implement data security best practices on AWS, including access control, encryption, and compliance with data privacy regulations.
    • Optimization and Monitoring: Optimize data workflows and storage solutions for cost and performance. Set up monitoring, logging, and alerting for data pipelines and infrastructure health.
    • Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data needs and deliver data solutions aligned with business goals.
    • Documentation: Create and maintain documentation for data infrastructure, data pipelines, and ETL processes to support internal knowledge sharing and compliance.

    Base Salary Range: $100,000 - $130,000 per annum

    TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
    $100k-130k yearly 3d ago
  • Local to Portland, OR: Data Engineer

    It Motives

    Data engineer job in Portland, OR

    No C2C or Sponsorship

    Data Engineer: Want to work within a local government school system and really make a difference? Our client is looking for an experienced Data Engineer who can design, develop, and support complex data integrations, pipelines, models, and cloud-based data systems within the district's enterprise data environment. You will serve as a technical expert in Snowflake engineering, ELT/ETL orchestration, and data quality automation. Does this sound like you? Then please apply! We value diversity in the workplace and encourage women, minorities, and veterans to apply. Thank you!

    Location: Portland, OR
    Type: 6-month contract

    Position details: A candidate for this position is expected to engineer, maintain, and optimize data systems and integrations that support equity-centered, data-informed decision-making. Develop, manage, and enhance secure, reliable, and scalable data pipelines and Snowflake-based data platforms that empower educators, central office staff, and school leaders. Ensure high-quality data availability, improve data accuracy, and apply modern data engineering practices that strengthen instructional and operational outcomes.

    Knowledge of:
    • Advanced principles of data engineering, Snowflake architecture, data warehousing, and cloud-based data systems.
    • SQL, dbt, ELT/ETL concepts, data modeling techniques, and API-driven integrations.
    • Cloud environments (AWS/Azure) and data tools (Python, Workato, Airflow, Git).
    • Data governance, metadata management, role-based access control, and FERPA requirements.
    • Technical documentation and Agile development practices.
    • Equity-centered data practices and culturally responsive communication.

    Ability to:
    • Design, build, and maintain efficient, reliable data pipelines and Snowflake workloads.
    • Analyze complex data structures, identify system issues, and implement solutions.
    • Collaborate with cross-functional teams and communicate technical concepts to nontechnical audiences.
    • Ensure data accuracy, security, and compliance with district and legal requirements.
    • Support the District's Racial Educational Equity Policy and contribute to an inclusive work environment.
    • Use programming languages and tools (Python, SQL, dbt, Git, etc.) to develop data solutions.
    • Work independently, maintain confidentiality, and deliver high-quality customer service.

    Education: Bachelor's degree in computer science, information science, data engineering, or a closely related field. A master's degree in a related discipline is desirable.

    Experience: Three (3) or more years of professional experience in data engineering, data warehousing, database development, or cloud data platform operations, including experience with Snowflake or a closely related enterprise data warehouse technology. Experience with public-sector or K-12 education environments is preferred. Any combination of education and experience that provides the required knowledge and abilities may be considered.

    Representative duties:
    • Design, build, and maintain scalable ELT/ETL pipelines into Snowflake, sourcing data from SIS, HR, Finance, transportation, assessment, vendor platforms, and other enterprise systems hosted on Microsoft SQL Servers.
    • Develop and maintain Snowflake data models, schemas, tasks, streams, stored procedures, and transformation logic to support analytics, reporting, and regulatory needs.
    • Implement and monitor data quality frameworks, validation rules, automated tests, and observability tools to ensure accuracy, completeness, and reliability of district data.
    • Collaborate with data architects, analysts, software developers, and cross-departmental stakeholders to translate business requirements into scalable data engineering solutions.
    • Optimize Snowflake performance, including warehouse sizing, query tuning, clustering, and cost management, to ensure efficient and cost-effective operations.
    • Manage integrations using tools such as dbt, Workato, SQL, Python scripts, APIs, and cloud-native services; monitor workflows and resolve data pipeline issues.
    • Ensure adherence to data governance policies, privacy requirements, and security standards, including FERPA, role-based access, and metadata documentation.
    • Support the development and implementation of districtwide data strategies, standards, and best practices.
    • Analyze complex data issues, troubleshoot system errors, and perform root-cause analysis to identify long-term solutions.
    • Demonstrate a commitment to the Racial Equity and Social Justice Commitment Framework in daily practices and data governance decisions.
    • Maintain current knowledge of Snowflake capabilities, cloud data engineering trends, emerging technologies, and industry best practices; participate in professional development.
    $84k-118k yearly est. 4d ago
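The "data quality frameworks" and "validation rules" this district role describes are often just SQL checks run against Snowflake on a schedule. A minimal sketch using the Snowflake Python connector, with hypothetical credentials and table names (not from the listing):

```python
# Sketch of a Snowflake data-quality check of the kind the posting describes:
# a referential-integrity rule over district data. Account, credentials, and
# tables are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="etl_service",
    password="********",
    warehouse="TRANSFORM_WH",
    database="DISTRICT",
    schema="STUDENT",
)

# Validation rule: every enrollment row must reference a known student.
check_sql = """
    SELECT COUNT(*) AS orphaned_rows
    FROM enrollments e
    LEFT JOIN students s ON e.student_id = s.student_id
    WHERE s.student_id IS NULL
"""

with conn.cursor() as cur:
    (orphans,) = cur.execute(check_sql).fetchone()
    if orphans:
        raise ValueError(f"Data quality failure: {orphans} orphaned enrollment rows")
```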
  • ERP/MRP Data Architect

    Proliance Consulting

    Data engineer job in Issaquah, WA

    Our client is staffing a Data Architect to define the data strategy and architecture for a new enterprise, proprietary ERP/MRP. This person will be responsible for cross-platform oversight and governance of the holistic data landscape, which will be developed iteratively through many phases of feature and architectural foundation development.

    Location: Issaquah, WA
    Associate Vendors: We are accepting applications from candidates who are currently authorized to work in the US for any employer without sponsorship.

    Role & Responsibilities:
    • Drives the conceptual and logical data architecture strategy in collaboration with enterprise architecture.
    • Defines and implements data migration strategies and solutions.
    • Advises teams on translating business needs into long-term data architecture solutions.
    • Partners with feature and architectural foundation development teams to translate data architectures into physical models and implementations.
    • Builds, optimizes, and maintains logical (curated) data models.
    • Works with Data Engineering and Data Platform teams to conduct ongoing performance optimizations in the data model.
    • Assists in defining data integration, storage, replication, and transformation protocols for the data layer of technical solutions supported by enterprise architecture.
    • Creates data quality rules and validations to monitor critical business metrics/KPIs.
    • Develops a thorough knowledge and understanding of cross-system data flows as well as an enterprise view of Costco's data landscape.
    • Maintains detailed documentation (e.g., data flow diagrams, source-to-target mappings).

    Required Qualifications:
    • Experience in implementing packaged or custom ERP/MRP solutions in the retail industry, focusing on data architecture and migration.
    • 8+ years' experience in data architecture design, implementation, and maintenance of complex datasets.
    • Experience working with data modeling tools (Erwin preferred).
    • Experience architecting and delivering data engineering/integration pipelines.
    • Experience designing and developing performance-optimized data models.
    • Proficient in SQL.

    Desired Qualifications:
    • Exposure to the retail industry.
    • Understanding of CI/CD pipelines, Azure DevOps, and GitHub Actions.
    • Experience with Git for code storage and collaboration.
    • Architectural-level experience in information privacy, data compliance, and risk management.
    • Excellent verbal and written communication skills.
    • Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.

    About Proliance: Proliance Consulting is a Seattle-based staffing firm specializing in IT and digital marketing roles. We connect top-tier talent with leading companies across the United States, offering contract, contract-to-hire, and direct placement opportunities. Our approach emphasizes building authentic relationships, ensuring transparent communication, and delivering tailored solutions that align with both candidate aspirations and client needs. Committed to fostering an inclusive and equitable workplace, we prioritize continuous learning and cultural awareness to support diverse professionals in thriving environments.

    Why Work for Us? At Proliance Consulting, we prioritize the well-being and satisfaction of our consultants, which is reflected in our strong reputation and 5-star reviews on Google. We offer a comprehensive benefits package that includes medical and vision coverage through Regence, with Proliance covering 50% of employee premiums and 25% for dependents. We also provide dental insurance through Delta Dental, a competitive 401(k) with matching options, and direct deposit for convenience. Our team members enjoy paid safe and sick time, paperless pay statements, and the opportunity to earn referral bonuses through our Employee Referral Program. We believe in supporting our people both professionally and personally, because when you thrive, we thrive.

    Proliance Consulting provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, Proliance Consulting complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfers, leaves of absence, compensation, and training.
    $93k-129k yearly est. 1d ago
  • Business Intelligence Engineer

    Intelliswift-An LTTS Company

    Data engineer job in Seattle, WA

    Pay rate range: $55/hr. to $60/hr. on W2
    Onsite role

    Must Have: Expert Python and SQL; visualization and development

    Required Skills:
    - 5-7 years of experience working with large-scale complex datasets
    - Strong analytical mindset; ability to decompose business requirements into an analytical plan and execute the plan to answer those business questions
    - Strong working knowledge of SQL
    - Background (academic or professional) in statistics, programming, and marketing
    - SAS experience a plus
    - Graduate degree in math/statistics, computer science or related field, or marketing is highly desirable
    - Excellent communication skills, equally adept at working with engineers as well as business leaders

    Daily Schedule:
    - Evaluation of the performance of program features and marketing content along measures of customer response, use, conversion, and retention
    - Statistical testing of A/B and multivariate experiments
    - Design, build, and maintain metrics and reports on program health
    - Respond to ad hoc requests from business leaders to investigate critical aspects of customer behavior (e.g., how many customers use a given feature or fit a given profile), deep dive into unusual patterns, and perform exploratory data analysis
    - Employ data mining, model building, segmentation, and other analytical techniques to capture important trends in the customer base
    - Participate in strategic and tactical planning discussions

    About the role: Understanding customer behavior is paramount to our success in providing customers with convenient, fast, free shipping in the US and international markets. As a Senior Business Intelligence Engineer, you will work with our world-class marketing and technology teams to ensure that we continue to delight our customers. You will meet with business owners to formulate key questions, leverage the client's vast data warehouse to extract and analyze relevant data, and present your findings and recommendations to management in a way that is actionable.
    $55-60 hourly 1d ago
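The "statistical testing of A/B and multivariate experiments" duty in the listing has a standard workhorse: a two-proportion z-test on conversion counts. A small sketch with made-up numbers (the counts are illustrative, not real data):

```python
# Sketch of the A/B statistical testing the posting lists: a two-proportion
# z-test on conversion counts. All numbers are invented example data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 476]    # variant A, variant B
exposures   = [10000, 10000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would indicate the conversion-rate difference is
# unlikely to be noise at the 5% significance level.
```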
  • Software Dev Engineer

    Collabera (4.5 company rating)

    Data engineer job in Redmond, WA

    Title: Software Dev Engineer

    Required Skills & Qualifications:
    • 4-10 years of experience in software development.
    • Strong proficiency in Python and backend development (APIs, business logic, integrations).
    • Experience with AWS Lambda, DynamoDB, and serverless architecture.
    • Hands-on experience with React for frontend development.
    • Proficient in scripting (Python, Bash, or similar).
    • Experience working with databases: DynamoDB preferred; SQL-based DBs or MongoDB also accepted.
    • Solid understanding of REST APIs, microservices, and cloud-based application design.

    Nice-to-Have Skills:
    • Experience with CI/CD pipelines (CodePipeline, GitHub Actions, Jenkins, etc.).
    • Knowledge of infrastructure-as-code tools such as CloudFormation, AWS CDK, or other IaC frameworks.
    • Familiarity with containerization (Docker) is a plus.

    Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, paid vacation time, paid parental leave, and paid holidays annually, as applicable.
    $103k-141k yearly est. 2d ago
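The core stack in this listing (AWS Lambda + DynamoDB behind an API) reduces to a short handler. A minimal sketch under assumed names: the `orders` table, the payload shape, and the API Gateway event format are all hypothetical:

```python
# Sketch of a serverless backend handler like the stack this posting names:
# an AWS Lambda function writing to DynamoDB via boto3. Table and field
# names are hypothetical.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # hypothetical table

def lambda_handler(event, context):
    """Handle an API Gateway POST: persist the order, return its id."""
    body = json.loads(event["body"])
    item = {"order_id": body["order_id"], "status": "received"}
    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"order_id": item["order_id"]})}
```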
  • Kettle Engineer

    Soho Square Solutions

    Data engineer job in Seattle, WA

    Job Title: Kettle Engineer
    Type: Contract

    We're seeking a Kettle Engineer to design, operate, and continuously improve our Pentaho Data Integration (PDI/Kettle) platform and the data movement processes that underpin business-critical workflows. You will own end-to-end lifecycle management, from environment build and configuration to orchestration, monitoring, and support, partnering closely with application teams, production operations, and data stakeholders. The ideal candidate combines strong hands-on Kettle expertise with solid SQL, automation, and production support practices in a fast-moving, highly collaborative environment.

    Primary Responsibilities:
    • Platform ownership: Install, configure, harden, and upgrade Kettle/PDI components (e.g., Spoon) across dev/prod.
    • Process engineering: Migrate, re-engineer, and optimize existing jobs and transformations.
    • Reliability & support: Document all workflows, including ownership and escalation protocols. Knowledge transfer with Automation and Application Support teams.
    • Observability: Implement proactive monitoring, logging, and alerting for the Kettle platform, including all dependent processes.
    • Collaboration: Partner with application, data, and infrastructure teams to deliver improvements to existing designs.

    Required Qualifications:
    • Kettle/PDI expertise: Experience installing and configuring a Kettle instance (server and client tools, repositories, parameters, security, and upgrades).
    • Kettle/PDI expertise: Experience creating, maintaining, and supporting Kettle processes (jobs/transformations, error handling, recovery, and performance tuning).
    • 4+ years of hands-on SQL (writing, diagnosing, and optimizing queries).
    • Strong communication skills for both technical and non-technical audiences; effective at documenting and sharing knowledge.

    Preferred Qualifications:
    • Experience integrating Kettle with cloud platforms (AWS and/or Azure); familiarity with containers or Windows/Linux server administration.
    • Exposure to monitoring/observability stacks (e.g., Datadog, CloudWatch, or similar).
    • Scripting/automation for operations (Python, PowerShell, or Bash); experience with REST APIs within Kettle.
    • Background in financial services or other regulated/mission-critical environments.

    Key Outcomes (First 90 Days):
    • Stand up or validate a hardened Kettle environment with baseline monitoring and runbooks.
    • Migrate at least two high-value Kettle workflows using shared templates and standardized error handling.
    $87k-126k yearly est. 2d ago
  • Azure & GCP Engineer

    Amroute

    Data engineer job in Seattle, WA

    Job Title: Azure & GCP Engineer
    Duration: 6 months

    We are seeking a highly skilled and experienced Azure and Google Cloud Engineer to join our team. The ideal candidate will be responsible for troubleshooting and managing Azure as well as Google Cloud solutions that drive business transformation. This role requires a deep understanding of cloud services, strong architectural principles, and the ability to translate business requirements into scalable, reliable, and secure cloud solutions.

    Key Responsibilities:
    • Technical Leadership: Lead and mentor a team of cloud engineers and developers, providing guidance on best practices and technical issues. Act as a subject matter expert for Azure, staying current with the latest services, tools, and best practices.
    • Cloud Architecture: Develop comprehensive cloud architectures leveraging GCP services such as Compute Engine, Kubernetes Engine, BigQuery, Pub/Sub, Cloud Functions, and others. Design scalable, secure, and cost-effective cloud solutions to meet business and technical requirements.
    • Cloud Strategy and Roadmap: Define a long-term cloud strategy for the organization, including migration plans, optimization, and governance frameworks. Assess and recommend best practices for cloud-native and hybrid cloud solutions.
    • Solution Implementation: Implement CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, Cloud Deployment Manager). Oversee the deployment, management, and maintenance of cloud applications. Automate deployment and configuration management using IaC tools such as ARM templates, Terraform, or Azure Bicep. Develop disaster recovery and business continuity plans for cloud services.
    • Collaboration and Leadership: Work closely with development, operations, security, and business teams to understand requirements, ensure seamless integration and operation of cloud systems, and provide technical guidance. Mentor junior team members and foster a culture of continuous learning and innovation. Communicate effectively with stakeholders to understand business requirements and provide cloud solutions that meet their needs.
    • Performance Optimization: Optimize GCP services for performance, scalability, and cost efficiency. Monitor and resolve issues related to cloud infrastructure, applications, and services. Ensure cloud systems meet service level agreements (SLAs), and implement cost management strategies to optimize cloud spending.
    • Security and Compliance: Ensure that all cloud solutions comply with security policies and industry regulations. Implement robust security measures, including identity and access management (IAM), network security, and data protection.
    • Continuous Improvement: Drive continuous improvement initiatives to enhance the performance, reliability, and scalability of cloud solutions. Participate in architecture reviews and provide recommendations for improvements.
    • Documentation and Reporting: Create and maintain technical documentation, architectural diagrams, and operational runbooks. Provide regular updates and reports to stakeholders on project progress, risks, and outcomes.

    Required Skills and Qualifications:
    • Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent experience); Master's degree preferred.
    • Experience: 7+ years of experience in cloud architecture, including 4+ years specifically with GCP and a minimum of 5 years focused on Microsoft Azure. Proven expertise in designing, implementing, and deploying large-scale cloud solutions. Experience with application modernization using containers, microservices, and serverless architectures.
    • GCP Skills: Proficiency in GCP services (e.g., BigQuery, Cloud Spanner, Kubernetes Engine, Cloud Run, Dataflow). Strong experience with Infrastructure as Code (e.g., Terraform, Deployment Manager). Knowledge of DevOps practices, CI/CD pipelines, and container orchestration tools. Familiarity with databases (relational and NoSQL) and data pipelines.
    • Azure Skills: Extensive experience with Azure services, including but not limited to Azure Virtual Machines, Azure App Services, Azure Functions, Azure Kubernetes Service (AKS), and Azure SQL Database. Proficiency in Azure networking, storage, and database services. Strong skills in IaC tools such as ARM templates, Terraform, or Azure Bicep. Experience with CI/CD pipelines using Azure DevOps or similar tools. Deep understanding of cloud security best practices and tools, including Azure Security Center, Azure Key Vault, and Azure Policy.
    • Compliance: Familiarity with compliance frameworks such as GDPR, HIPAA, and SOC 2.
    • Scripting: Proficiency in scripting languages like PowerShell, Python, or Bash for automation tasks. Experience with configuration management tools like Ansible or Chef is a plus.
    • Certifications: GCP certifications such as Professional Cloud Architect or Professional Data Engineer are highly preferred. Relevant Azure certifications such as Microsoft Certified: Azure Solutions Architect Expert, Microsoft Certified: Azure DevOps Engineer Expert, or similar.
    $87k-126k yearly est. 3d ago
  • PAM Platform Engineer

    Info Way Solutions (4.3 company rating)

    Data engineer job in Seattle, WA

    Job Title: BeyondTrust SME - PAM Platform Engineer
    Job Type: Contract (6-12 Months, Extendable)
    Experience: 7-12+ Years

    We are hiring a contract BeyondTrust SME / PAM Platform Engineer to support an enterprise Privileged Access Management program. The consultant will lead implementation, configuration, automation, and operational support for BeyondTrust PAM solutions across hybrid environments.

    Responsibilities:
    BeyondTrust Engineering (Primary):
    • Install, configure, upgrade, and maintain BeyondTrust Password Safe, BeyondInsight, and Endpoint Privilege Management (EPM).
    • Configure credential vaulting, password rotation, SSH key management, and privileged session monitoring.
    • Onboard privileged accounts for servers, applications, DBs, and network devices.
    • Create and enforce Windows/Linux privilege elevation policies.
    PAM Operations & Automation:
    • Build automated onboarding workflows using PowerShell / Python / REST APIs.
    • Manage policy updates, access approvals, JIT access workflows, and audit reporting.
    • Troubleshoot PAM failures, authentication issues, session proxy errors, and integration problems.
    Integration & Governance:
    • Integrate BeyondTrust with Active Directory, Azure AD, LDAP, SIEM, and ServiceNow.
    • Ensure compliance with SOX, PCI, NIST, and ISO requirements.
    • Provide documentation, architecture diagrams, SOPs, and operational runbooks.

    Required Skills:
    • 3-5+ years hands-on with BeyondTrust (Password Safe, BeyondInsight, EPM).
    • Strong understanding of PAM concepts: vaulting, rotation, session recording, privileged access workflows.
    • Scripting for automation: PowerShell / Python.
    • Strong Linux & Windows administration background.
    • Experience with IAM, RBAC, MFA, Kerberos, SAML, OAuth.
    • Integration experience with AD / Azure AD / LDAP.

    Nice to Have:
    • Experience with CyberArk, HashiCorp Vault, Thycotic, or Delinea.
    • Cloud platform knowledge (AWS, Azure, GCP).
    • Security certifications (Security+, CISSP, etc.).
    $97k-123k yearly est. 1d ago
  • C++ Software Engineer w/ radio frequency and signal processing

    Request Technology

    Data engineer job in Everett, WA

    NO SPONSORSHIP

    Sr. C++ Software Systems Engineer - radio frequency and signal processing
    SALARY: $165k - $205k plus 20% bonus
    LOCATION: EVERETT, WA 98204 - Must live within a one-hour drive to come into the office a couple of times a month

    Core background sought: strong radio frequency and signal processing experience; C/C++; an extensive digital signal processing (DSP) and math background; radio frequency (RF); Windows networking and socket programming; and embedded software.

    Through solutions that ensure the efficient use of frequencies, long-distance communications, monitoring and security, and communications intelligence applications, we improve communications and protect military forces and infrastructure around the world. This person will apply their strong radio frequency and signal processing background, and their software development skills, to meeting the signal detection, identification, processing, geolocation, and analysis challenges facing spectrum regulators, intelligence organizations, and defense agencies around the globe. They will also perform QA testing and analysis of new hardware and software performance up to the system level, and develop automated QA test software and systems.

    Required Experience:
    • US Person or Permanent Resident
    • Extensive experience in design, implementation, and testing of complex real-time multithreaded software applications
    • Extensive C/C++ software development experience (6+ years)
    • Extensive Digital Signal Processing (DSP) and math background
    • Radio Frequency (RF) theory and practice (propagation, antennas, receivers, signals, systems, etc.)
    • RF signals expertise, including signal modulation, demodulation, decoding, and signal analysis techniques and tools
    • Programming for Windows operating systems
    • Networking and socket-level programming
    • Databases and database programming
    • Ability to quickly learn and support a large existing C++ code base
    • System QA testing, including developing and executing test plans and writing automated QA test programs
    • Excellent communications skills
    • Ability to write technical product documentation

    Preferred Knowledge, Skills, and Abilities:
    • SIGINT/COMINT/EW experience
    • RF Direction Finding and Geolocation concepts, including AOA and TDOA
    • Mapping concepts, standards, and programming
    • Audio signal processing, including analog and digital demodulation
    • Drone signals and protocols (uplink and downlink, including video)
    • Experience operating commercial drones
    • Full Motion Video (FMV) systems, including STANAG 4609, KLV Metadata, MPEG-2 Transport Stream, H.264/265 encoding
    • Programming expertise: highly proficient in C/C++; multithreaded real-time processing; programming with Qt; programming in Python; embedded programming; real-time hardware control and data acquisition; high-performance graphics; GUI design and programming; networking and socket-level programming; databases and database programming (incl. SQL); XML and JSON programming; API programming (developing and using); software licensing; AI concepts and programming
    • Tools: RF measurement equipment (VSA/spectrum analyzers, signal generators, and other electronic test equipment); Windows OS, including desktop, server, and embedded variants; Microsoft Visual Studio and TFS; Qt; Python; Intel IPP; InstallShield; Postgres and Microsoft database packages; experience with Visual Basic, MFC, C#, WPF/XAML, and other Windows development tools/APIs; Linux OS
    • 6+ years of relevant work experience
    • MSEE (or BSEE with extended relevant work experience) with emphasis on RF communication systems, Digital Signal Processing, and software
    $165k-205k yearly 1d ago
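Under the hood, the "signal detection" work this posting describes starts with spectral analysis. A toy NumPy sketch, entirely illustrative (the sample rate, tone, and noise level are invented): recover a weak tone buried in noise via an FFT.

```python
# Toy DSP sketch: detect a weak tone in noise with an FFT. All parameters
# are illustrative, not from the listing.
import numpy as np

fs = 48_000          # sample rate (Hz)
t = np.arange(fs) / fs                             # 1 second of samples
signal = 0.1 * np.sin(2 * np.pi * 1_250 * t)       # weak 1.25 kHz tone
noise = np.random.normal(0, 1.0, fs)               # much stronger noise
x = signal + noise

# Power spectrum via the real FFT; coherent integration over ~48k samples
# lifts the tone well above the per-bin noise floor.
spectrum = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

peak_hz = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
print(f"Strongest component near {peak_hz:.0f} Hz") # ~1250 Hz despite the noise
```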
  • Android OS Software Engineer

    Insight Global

    Data engineer job in Redmond, WA

    Client: Fortune 500 Company
    Title: Android OS Software Engineer
    Duration: 12 months + 12-month extensions
    Pay Rate: $80-$85/hr

    We're seeking an Android OS Software Engineer to build and evolve the Android stack across multiple internal platforms. You'll help platformize research efforts, developing core OS components, services, and display/graphics pathways that scale across devices.

    What you'll do:
    • Develop in the Android stack (frameworks, services, HALs, kernel where needed) across multiple platforms.
    • Contribute to platformization: unify, harden, and standardize OS components for research and prototyping devices.
    • Implement and optimize C++ code paths for performance, reliability, and maintainability.
    • Bring up features on Qualcomm-based hardware; debug system issues spanning OS, drivers, and services.
    • Collaborate with cross-functional teams to land changes, reviews, and releases.

    Must-have qualifications:
    • C++ expertise for Android development (production-level; debugging & performance tuning).
    • Hands-on experience with Android 14 (UpsideDownCake) or higher.
    • Qualcomm platform experience (SoCs, BSPs, display/graphics, drivers, perf tools).
    • Solid knowledge of AOSP/Android internals (frameworks, system services, HALs; Linux fundamentals).

    Nice-to-have:
    • Java experience within Android (beneficial but not required).
    • Scripting (e.g., Python/Bash) for tooling, build, or automation.

    Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
    $80-85 hourly 2d ago
  • Software Engineer

    Talent Software Services (3.6 company rating)

    Data engineer job in Redmond, WA

    Are you an experienced Software Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Software Engineer to work at their company in Redmond, WA.

    The main function of a Lab/Test Engineer at this level is to apply configuration skills at an intermediate to high level. The Test Engineer will analyze, design, and develop test plans and should be familiar with at least one programming language. We're on the lookout for a contract Engineer with extensive experience in configuring and testing hardware devices across Windows Server and Ubuntu Server platforms. The ideal candidate will not only be technically adept but also possess strong analytical skills, capable of producing comprehensive and detailed reports. Proficiency in scripting languages is essential. The role involves deploying and managing test machines, refining test plans, executing test cases, performing hardware diagnostics, troubleshooting issues, and collaborating closely with the development team to advance the functionality of hardware systems. Experience with CI/CD pipelines and C++ and Rust development will be considered a significant asset.

    Primary Responsibilities/Accountabilities:
    • Perform repeatable testing procedures and processes.
    • Verify triggers, stored procedures, referential integrity, and hardware product or system specifications.
    • Interpret and modify code as required, which may include C/C++, C#, batch files, make files, Perl scripts, queries, stored procedures, and/or triggers.
    • Identify and define project team quality and risk metrics.
    • Provide assistance to other testers.
    • Design and develop robust automated test harnesses with a focus on application/system/inter-system level issues.
    • Perform job functions within the scope of application/system performance, threading issues, bottleneck identification, writing small-footprint and less intrusive code for critical code testing, tackling system/application intermittent failures, etc.

    Purpose of the Team: The purpose of this team is to focus on security hardware and intellectual property. Their work is primarily open source, with some potential for internal code review.
    Key projects: This role will contribute to supporting development and testing for technologies deployed in the Azure fleet.
    Typical task breakdown and operating rhythm: The role will consist of 10% meetings, 10% reporting, and 80% heads-down developing and testing.

    Qualifications:
    • Years of experience required: 8-10+ overall years of experience in the field.
    • Degrees or certifications required: Bachelor's degree in Computer Science; some business/functional knowledge and/or industry experience preferred.
    • Best vs. average: The ideal resume would contain Rust experience and experience with open-source projects.
    • Performance indicators: Performance will be assessed based on quality of work, meeting deadlines, and flexibility.
    • Minimum 8+ years of test experience with data center/server hardware.
    • Minimum 8+ years of development experience with C++ (and Python).
    • Minimum 2+ years of experience with an understanding of CI/CD and ADO pipelines.
    • Software testing experience in Azure Cloud/Windows/Linux server environments required.
    • Ability to read and write at least one programming language, such as C#, C/C++, SQL, etc.; Rust is a plus!
    • Knowledge of software quality assurance practices, with strong testing aptitude.
    • Knowledge of personal computer hardware is required, as is knowledge of deploying and managing hosts and virtual test machines.
    • Knowledge of internet protocols and networking fundamentals preferred.
    • Must have a solid understanding of the software development cycle.
    • Demonstrated project management ability required.

    Preferred:
    • Database programming experience (e.g., SQL Server, Sybase, Oracle, Informix, and/or DB2).
    • Software testing experience in a Web-based or Windows client/server environment.
    • Experience in development and/or database administration using a product.
    $109k-145k yearly est. 1d ago
  • BE Software Engineer (Block Storage)

    Bayside Solutions (4.5 company rating)

    Data engineer job in Seattle, WA

    Backend Software Engineer (Block Storage)
    W2 Contract
    Salary Range: $114,400 - $135,200 per year

    We are looking for collaborative, curious, and pragmatic Software Engineers to be part of this innovative team. You will be able to shape the product's features and architecture as it scales by orders of magnitude. Being part of our Cloud Infrastructure organization opens the door to exerting cross-functional influence and making a more significant organizational impact.

    Requirements and Qualifications:
    • Proficient with UNIX/Linux
    • Coding skills in one or more of these programming languages: Rust, C++, Java, or C#
    • Experience with scripting languages (Bash, Python, Perl)
    • Excellent knowledge of software testing methodologies and practices
    • 2 years of professional software development experience
    • Strong ownership and a track record of delivering results
    • Excellent verbal and written communication skills
    • Bachelor's Degree in Computer Science, an engineering-related field, or equivalent related experience

    Preferred Qualifications:
    • Proficiency in Rust
    • Experience with high-performance asynchronous IO systems programming
    • Knowledge of distributed systems

    Bayside Solutions, Inc. is not able to sponsor any candidates at this time. Additionally, candidates for this position must qualify as a W2 candidate. Bayside Solutions, Inc. may collect your personal information during the position application process. Please reference Bayside Solutions, Inc.'s CCPA Privacy Policy at *************************
    $114.4k-135.2k yearly 2d ago
  • Senior Software Engineer (Azure Databricks, DLT Pipelines, Terraform Dev, CI/CD, Data Platform) Contract at Bellevue, WA

    Red Oak Technologies (4.0 company rating)

    Data engineer job in Bellevue, WA

    Senior Software Engineer (Azure Databricks, DLT Pipelines, Coding, CI/CD, Data Platform & Data Integration) Contract at Bellevue, WA

    Must-Have Experience:
    • Hands-on experience with Azure Databricks / DLT Pipelines (Delta Live Tables)
    • Good programming skills: C#, Java, or Python
    • CI/CD experience
    • Data platform / data integration experience

    The Role / Responsibilities: The Senior Software Engineer is a hands-on engineer who works from design through implementation of large-scale, data-centric systems for the MA Platform. This is a thought leadership role in the Data Domain across all of the client's analytics, with the expectation that the candidate will demonstrate and propagate best practices and processes in software development. The candidate is expected to drive things on their own with minimal supervision.
    • Design, code, test, and develop features to support large-scale data processing pipelines for our multi-cloud SaaS platform with good quality, maintainability, and end-to-end ownership.
    • Define and leverage data models to understand cost drivers and create concrete action plans that address platform concerns on data.

    Qualifications:
    • 5+ years of experience in building and shipping production-grade software systems or services, with one or more of the following: distributed systems, large-scale data processing, data storage, information retrieval and/or data mining, machine learning fundamentals.
    • BS/MS in Computer Science or equivalent industry experience.
    • Experience building and operating online services and fault-tolerant distributed systems at internet scale.
    • Demonstrable experience shipping software and internet-scale services using GraphQL/REST APIs on Microsoft Azure and/or Amazon Web Services (AWS) cloud.
    • Experience writing code in C++/C#/Java using agile and test-driven development (TDD).
    • 3+ years in cloud service development (Azure or AWS services).

    Preferred Qualifications:
    • Excellent verbal and written communication skills (to engage with both technical and non-technical stakeholders at all levels).
    • Familiarity with Extract, Transform, Load (ETL) pipelines, data modeling, and data engineering; past ML experience is a plus.
    • Experience with Databricks and/or Microsoft Fabric is an added plus.
    • Hands-on experience using distributed computing platforms like Apache Spark, Apache Flink, Apache Kafka, or Azure Event Hubs.
    $125k-176k yearly est. 3d ago
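Delta Live Tables, the centerpiece of this listing's must-haves, is declarative: tables are Python functions annotated with expectations. A minimal sketch for orientation; it only runs inside a Databricks DLT pipeline, where the `dlt` module and the `spark` session are provided by the runtime, and the source path is hypothetical:

```python
# Sketch of a Delta Live Tables (DLT) pipeline of the kind the posting asks
# about. Runs only inside a Databricks DLT pipeline; path and expectation
# names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from cloud storage.")
def events_raw():
    # `spark` is injected by the Databricks runtime.
    return spark.read.json("s3://example-bucket/events/")

@dlt.table(comment="Cleaned events with basic quality gates.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # drop bad rows
def events_clean():
    return dlt.read("events_raw").withColumn("event_date", F.to_date("event_ts"))
```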
  • Senior UX Developer

    Centific

    Data engineer job in Bellevue, WA

    Role: UX Developer - Frontend Systems & Design
    Stack: TypeScript, Vite, React
    Employment Type: Full-Time

    About Your Role: We're looking for a UX-focused Frontend Developer who can bring clean, scalable, and intuitive interfaces to life with minimal design oversight. Our mission is deeply impactful, and your work will shape how operators interact with AI and simulation systems designed to safeguard our country. You'll help define reusable component systems and build UX patterns that withstand complexity. This role is perfect for someone who codes with craft and cares about every user interaction, from micro-interactions to performance. Build the interface to the future. Your code will not only enable national security but drive how humans collaborate with AI in critical missions.

    Key Responsibilities:
    • Build and maintain reusable, high-quality UI components
    • Translate complex agent behaviors and simulation results into usable interfaces
    • Own frontend architecture decisions and interaction paradigms
    • Collaborate closely with backend and AI teams to integrate data-rich UIs

    Ideal Candidate:
    • Deep expertise in TypeScript and modern frontend tooling (Vite, React)
    • Passion for interaction design and performance at scale
    • Self-directed with strong product instincts and attention to UX details
    • Eager to work on purpose-driven, impactful technology

    Benefits:
    • Comprehensive healthcare, dental, and vision coverage
    • 401k plan
    • Paid time off (PTO)
    • And more!

    Company Overview: Centific is a frontier AI data foundry that curates diverse, high-quality data, using our purpose-built technology platforms to empower the Magnificent Seven and our enterprise clients with safe, scalable AI deployment. Our team includes more than 150 PhDs and data scientists, along with more than 4,000 AI practitioners and engineers. We harness the power of an integrated solution ecosystem, comprising industry-leading partnerships and 1.8 million vertical domain experts in more than 230 markets, to create contextual, multilingual, pre-trained datasets; fine-tuned, industry-specific LLMs; and RAG pipelines supported by vector databases. Our zero-distance innovation™ solutions for GenAI can reduce GenAI costs by up to 80% and bring solutions to market 50% faster. Our mission is to bridge the gap between AI creators and industry leaders by bringing best practices in GenAI to unicorn innovators and enterprise customers. We aim to help these organizations unlock significant business value by deploying GenAI at scale, helping to ensure they stay at the forefront of technological advancement and maintain a competitive edge in their respective markets. Learn more about us at www.centific.com.

    Centific is an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, ancestry, citizenship status, age, mental or physical disability, medical condition, sex (including pregnancy), gender identity or expression, sexual orientation, marital status, familial status, veteran status, or any other characteristic protected by applicable law. We consider qualified applicants regardless of criminal histories, consistent with legal requirements.
    $110k-152k yearly est. 1d ago
  • Software Engineer Qualtrics

    Mainz Brady Group

    Data engineer job in Beaverton, OR

    HYBRID ONSITE IN BEAVERTON, OR! MUST HAVE QUALTRICS EXPERIENCE.

    We're seeking a skilled and experienced Software Engineer who specializes in Qualtrics. This role will be part of a high-visibility, high-impact initiative to optimize and expand our Qualtrics environment. You'll play a key role in designing, developing, and maintaining scalable solutions that enhance user experience, streamline data collection, and improve reporting accuracy. The ideal candidate has a strong background in Qualtrics architecture, API integrations, and automation, plus a passion for creating efficient, user-friendly tools that empower teams to make data-driven decisions.

    What we're looking for:
    • 3+ years of hands-on Qualtrics engineering or development experience
    • Strong understanding of survey logic, workflows, APIs, and automation
    • Experience with data visualization and analytics tools (Tableau, Power BI, etc.)
    • Background in software engineering (JavaScript, Python, or similar)
    • Ability to partner cross-functionally with researchers, analysts, and product teams
    $77k-108k yearly est. 4d ago

Learn more about data engineer jobs

How much does a data engineer earn in Pasco, WA?

The average data engineer in Pasco, WA earns between $78,000 and $147,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Pasco, WA

$107,000
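A quick check on how these figures relate (our observation, not something the page states): the $107,000 average matches the geometric mean of the $78,000-$147,000 range endpoints rather than the arithmetic midpoint, which would be $112,500.

```python
# The quoted $107,000 equals the geometric mean of the range endpoints,
# rounded to the nearest $1,000. That the site derives it this way is an
# assumption; the arithmetic midpoint is shown for comparison.
low, high = 78_000, 147_000
print(round((low * high) ** 0.5))  # 107079 -> ~$107,000
print((low + high) / 2)            # 112500.0 -> $112,500
```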