
Data engineer jobs in Puyallup, WA

- 3,457 jobs
  • Product Data Scientist

    Dynpro Inc.

    Data engineer job in Seattle, WA

    About the Role
    We are looking for an experienced Product Data Scientist to support data-driven product development and decision-making. In this role, you will partner closely with Product, Engineering, UX, and business stakeholders to deliver actionable insights that improve product experience, engagement, and adoption. You will be embedded within specific product areas, owning the product data strategy and helping teams move from intuition to evidence through metrics, experimentation, and analytics.

    Key Responsibilities
    • Partner with Product, Engineering, UX, and business teams to identify opportunities for product improvements, new features, and enhanced customer experience
    • Serve as the subject matter expert for product data and analytics within assigned product areas
    • Define product success metrics and key performance indicators (KPIs), and ensure alignment with business goals
    • Design, analyze, and interpret A/B tests and experiments to evaluate product changes
    • Analyze large and complex datasets to uncover insights related to product usage, engagement, and retention
    • Build and maintain automated dashboards and reports that enable self-service analytics for stakeholders
    • Collaborate with data engineering and analytics teams to ensure reliable data pipelines and high data quality
    • Translate analytical findings into clear, actionable recommendations for senior leadership
    • Evangelize data-driven insights, frameworks, and best practices across teams
    • Act with urgency, proactively identify roadblocks, and work cross-functionally to drive impact

    Basic Qualifications
    • Bachelor's degree in a quantitative field (Statistics, Mathematics, Computer Science, Economics, Finance) or equivalent practical experience
    • 7+ years of experience in data science, product analytics, or business analytics within a SaaS or technology environment
    • Strong experience using SQL for data analysis and validation
    • Proven ability to analyze data from multiple sources and distill insights into clear business recommendations
    • Hands-on experience with data visualization tools such as Tableau, Power BI, Looker, or similar
    • Solid foundation in statistics and experimental design
    • Experience working closely with Product and Engineering teams
    • Strong communication skills with the ability to explain technical concepts to non-technical audiences
    • Familiarity with agile or scrum product development methodologies

    Preferred Qualifications
    • Experience with A/B testing, cohort analysis, user segmentation, and product experimentation
    • Experience building or refining machine learning models
    • Proficiency in Python or R
    • Familiarity with data engineering concepts and analytics pipelines
    • Strong stakeholder management skills and the ability to lead through influence
    • Highly motivated self-starter who thrives in a fast-paced, collaborative environment
    $94k-133k yearly est. 3d ago
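The A/B-testing responsibility above (designing and interpreting experiments to evaluate product changes) typically comes down to a standard two-proportion z-test. A minimal stdlib-Python sketch; the traffic and conversion numbers are made up for illustration:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: is variant B's conversion
    rate significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# Hypothetical experiment: 1,000 users per arm,
# control converts at 10%, variant at 13%.
z, p = ab_test_z(100, 1000, 130, 1000)
print(f"z = {z:.2f}, significant at 0.05: {p < 0.05}")
```

With these sample sizes the 3-point lift clears the 0.05 threshold; halve the traffic and it would not, which is why the posting pairs experimentation with a "solid foundation in statistics."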
  • Data Engineer II: 25-07190

    Akraya, Inc. 4.0 company rating

    Data engineer job in Seattle, WA

    Process Skills: Data Pipeline (Proficient), SQL (Expert), Python (Proficient), ETL (Expert), QuickSight (Intermediate)
    Contract Type: W2 Only
    Duration: 5+ months (high possibility of conversion)
    Pay Range: $65.00 - $70.00 per hour

    Summary: Join the Sustain.AI team as a Data Engineer to lead the pivotal migration of sustainability data to modern AWS infrastructure, fostering a net-zero future. This role involves transforming and automating sustainability workflows into data-driven processes, significantly impacting environmental goals. Your work will directly contribute to reducing the carbon footprint by creating an automated, scalable data infrastructure and ensuring precise data quality and consistency.

    Key Responsibilities:
    • Migrate ETL jobs from legacy systems to AWS, ensuring no operational disruptions.
    • Set up and manage AWS data services (Redshift, Glue) and orchestrate workflows.
    • Transform existing workflows into scalable, efficient AWS pipelines with robust validation.
    • Collaborate with various teams to understand and fulfill data requirements for sustainability metrics.
    • Document the new architecture, implement data quality checks, and communicate with stakeholders on progress and challenges.

    Must-Have Skills:
    • Advanced proficiency in ETL pipelines and AWS data services (Redshift, S3, Glue).
    • Expertise in SQL and experience with Python or Spark for data transformation.
    • Proven experience overseeing data migrations with minimal disruption.

    Industry Experience Required:
    • Experience in sustainability data, carbon accounting, or environmental metrics is highly preferred.
    • Familiarity with large-scale data infrastructure projects in complex enterprise environments is essential.

    About Akraya
    Akraya is an award-winning IT staffing firm consistently recognized for our commitment to excellence and a thriving work environment. Most recently, we were recognized in Inc.'s Best Workplaces 2024, Silicon Valley's Best Places to Work by the San Francisco Business Journal (2024), and Glassdoor's Best Places to Work (2023 & 2022). As staffing solutions providers for Fortune 100 companies, Akraya's industry recognitions solidify our leadership position in the IT staffing space. We don't just connect you with great jobs; we connect you with a workplace that inspires! Let us lead you to your dream career and experience the Akraya difference. Browse our open positions and join our team!
    $65-70 hourly 2d ago
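The "no operational disruptions" requirement above usually implies validating that each migrated table matches its legacy source. A minimal illustrative Python sketch (the sample rows and the fingerprinting scheme are hypothetical, not Akraya's actual process): an order-independent fingerprint lets you compare extracts even when the two systems return rows in different orders.

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table extract:
    (row count, XOR of per-row SHA-256 hashes folded to 64 bits)."""
    digest = 0
    for row in rows:
        h = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        digest ^= int(h[:16], 16)                 # fold hex digest to 64 bits
    return len(rows), digest

# Hypothetical sustainability-metric rows: same data, different row order.
legacy   = [(1, "solar", 12.5), (2, "wind", 7.3)]
migrated = [(2, "wind", 7.3), (1, "solar", 12.5)]

assert table_fingerprint(legacy) == table_fingerprint(migrated)
print("row counts and content match")
```

In practice the same idea is run as SQL aggregates on both sides (e.g., counts and hashed checksums per partition) rather than pulling full extracts into Python.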
  • Staff Data Engineer

    Eton Solution 3.7 company rating

    Data engineer job in Bellevue, WA

    *Immigration sponsorship is not available for this role*

    We are looking for an experienced Data Engineer (8+ years of experience) with deep expertise in Flink SQL to join our engineering team. This role is ideal for someone who thrives on building robust real-time data processing pipelines and has hands-on experience designing and optimizing Flink SQL jobs in a production environment. You'll work closely with data engineers, platform teams, and product stakeholders to create scalable, low-latency data solutions that power intelligent applications and dashboards.

    Key Responsibilities:
    • Design, develop, and maintain real-time streaming data pipelines using Apache Flink SQL.
    • Collaborate with platform engineers to scale and optimize Flink jobs for performance and reliability.
    • Build reusable data transformation logic and deploy it to production-grade Flink clusters.
    • Ensure high availability and correctness of real-time data pipelines.
    • Work with product and analytics teams to understand requirements and translate them into Flink SQL jobs.
    • Monitor and troubleshoot job failures, backpressure, and latency issues.
    • Contribute to internal tooling and libraries that improve Flink developer productivity.

    Required Qualifications:
    • Deep hands-on experience with Flink SQL and the Apache Flink ecosystem.
    • Strong understanding of event-time vs. processing-time semantics, watermarks, and state management.
    • 3+ years of experience in data engineering, with a strong focus on real-time/streaming data.
    • Experience writing complex Flink SQL queries, UDFs, and windowing operations.
    • Proficiency in working with streaming data formats such as Avro, Protobuf, or JSON.
    • Experience with messaging systems like Apache Kafka or Pulsar.
    • Familiarity with containerized deployments (Docker, Kubernetes) and CI/CD pipelines.
    • Solid understanding of distributed system design and performance optimization.

    Nice to Have:
    • Experience with other stream processing frameworks (e.g., Spark Structured Streaming, Kafka Streams).
    • Familiarity with cloud-native data stacks (AWS Kinesis, GCP Pub/Sub, Azure Event Hubs).
    • Experience building internal tooling for observability or schema evolution.
    • Prior contributions to the Apache Flink community or similar open-source projects.

    Why Join Us:
    • Work on cutting-edge real-time data infrastructure that powers critical business use cases.
    • Be part of a high-caliber engineering team with a culture of autonomy and excellence.
    • Flexible working arrangements with competitive compensation.
    $95k-134k yearly est. 5d ago
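The event-time vs. processing-time distinction this posting emphasizes can be illustrated without Flink: a tumbling window buckets each event by its event timestamp, and a watermark (here approximated as "max timestamp seen minus allowed lateness") decides when a window is closed to late arrivals. A toy pure-Python sketch; the window size, lateness bound, and events are invented:

```python
def tumbling_window_counts(events, size_ms, allowed_lateness_ms):
    """Count events per (key, window) using event time; drop events whose
    window has already closed under the watermark."""
    counts, watermark = {}, 0
    for key, event_ts in events:                        # events in arrival order
        watermark = max(watermark, event_ts - allowed_lateness_ms)
        window_start = event_ts - event_ts % size_ms    # tumbling-window bucket
        if window_start + size_ms <= watermark:
            continue                                    # too late: window closed
        counts[(key, window_start)] = counts.get((key, window_start), 0) + 1
    return counts

# Fourth event arrives out of order (ts=2000 after ts=12000) but is
# still within the allowed lateness, so it is counted.
events = [("user1", 1_000), ("user1", 4_000), ("user2", 12_000),
          ("user1", 2_000)]
print(tumbling_window_counts(events, size_ms=10_000, allowed_lateness_ms=5_000))
```

Flink's actual watermarking is driven by the source and `WATERMARK` declarations in the table DDL, but the accept/drop decision it makes per window is the same shape as this loop.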
  • ONLY W2 & LOCAL CANDIDATES for Data Engineer in Seattle, WA (Hybrid Role)

    Infotree Global Solutions 4.1 company rating

    Data engineer job in Seattle, WA

    Minimum Requirements:
    • 5+ years of experience in data engineering
    • Experience with cloud platforms (AWS, Azure, GCP)
    • Strong data quality focus: implementing data validation, cleansing, and transformation
    • Experience with data versioning
    • Strong proficiency in Python and related data science libraries (e.g., pandas, NumPy)
    • Experience with distributed computing
    • Experience dealing with image-type and array-type data
    • Able to automate repetitive tasks and processes; experience with scripting and automation tools
    $92k-125k yearly est. 3d ago
  • Data Engineer

    OSI Engineering 4.6 company rating

    Data engineer job in Seattle, WA

    A globally leading consumer device company based in Seattle, WA is looking for an ML Data Pipeline Engineer to join their dynamic team!

    Job Responsibilities:
    • Assist the team in building, maintaining, and running essential ML data pipelines.

    Minimum Requirements:
    • 5+ years of experience
    • Experience with cloud platforms (AWS, Azure, GCP)
    • Strong data quality focus: implementing data validation, cleansing, and transformation
    • Strong proficiency in Python and related data science libraries (e.g., pandas, NumPy)
    • Experience with distributed computing
    • Experience dealing with image-type and array-type data
    • Able to automate repetitive tasks and processes
    • Experience with scripting and automation tools
    • Able to execute tasks independently and creatively

    Languages: Fluent in English or Farsi
    Type: Contract
    Duration: 12 months with extension
    Work Location: Seattle, WA (hybrid)
    Pay Rate Range: $40.00 - $55.00 per hour (DOE)
    $40-55 hourly 5d ago
  • Data Architect

    Fusion Plus Solutions, Inc.

    Data engineer job in Redmond, WA

    We are seeking a Data Platform Architect to design and implement a scalable, metadata-driven data platform using Microsoft Fabric and the Azure ecosystem. This role focuses on architecting the foundation for data ingestion, storage, governance, and consumption across multiple domains, enabling advanced analytics and AI-driven insights.

    Key Responsibilities:
    • Define and implement the end-to-end architecture for a metadata-driven Lakehouse platform.
    • Establish standards for data ingestion, transformation, and orchestration using Microsoft Fabric and Azure services.
    • Design multi-tenant, domain-specific data lakes and an orchestration layer for cross-agent queries.
    • Ensure data governance, lineage, and security across the platform.
    • Collaborate with engineering teams to implement best practices for scalability, performance, and cost optimization.
    • Integrate analytics and AI capabilities (Power BI, Azure OpenAI) into the platform.

    Required Qualifications:
    • 10+ years of experience in data architecture and platform design.
    • Expertise in Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks); Azure Data Factory, Azure SQL, and Data Lake Storage Gen2; and data modeling, governance, and security frameworks.
    • Strong understanding of metadata-driven architectures and reusable components.
    • Experience with structured and semi-structured data formats.
    $93k-129k yearly est. 2d ago
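A "metadata-driven" platform of the kind described above means source definitions live in configuration and a single generic engine interprets them, rather than each source getting hand-written pipeline code. A hypothetical, minimal Python sketch of the idea (source names, fields, and the PII-masking rule are invented for illustration):

```python
# Each source is described by metadata; a generic planner turns
# that metadata into pipeline steps at run time.
SOURCES = [
    {"name": "sales", "format": "csv",  "domain": "finance",  "pii": False},
    {"name": "users", "format": "json", "domain": "customer", "pii": True},
]

def plan_ingestion(sources):
    """Turn source metadata into an ordered list of pipeline steps."""
    plan = []
    for src in sources:
        steps = [f"ingest:{src['name']}({src['format']})",
                 f"land:lakehouse/{src['domain']}/{src['name']}"]
        if src["pii"]:
            steps.append(f"mask:{src['name']}")   # governance rule from metadata
        plan.extend(steps)
    return plan

for step in plan_ingestion(SOURCES):
    print(step)
```

In a Fabric/Azure implementation the same pattern would typically drive parameterized Dataflows Gen2 or Data Factory pipelines from a control table, so onboarding a new source is a metadata row, not new code.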
  • PAM Platform Engineer

    Tailored Management 4.2 company rating

    Data engineer job in Seattle, WA

    Job Title: Privileged Access Management - BeyondTrust Engineer
    Duration: 6+ month contract (with possible extension)
    Pay Rate: $97.31/hr on W2
    Benefits: Medical, Dental, Vision

    Summary: As a PAM Platform Engineer on the Client's Identity & Access Management team, you'll be a key technical specialist responsible for designing, implementing, and maintaining our enterprise-wide Privileged Access Management infrastructure using BeyondTrust. You'll lead the rollout of BeyondTrust and support ongoing management of our privileged access solutions, including password management, endpoint privilege management, and session management capabilities across our retail technology ecosystem. Join our cybersecurity team to drive enterprise-level PAM adoption while maintaining the Client's commitment to innovation, security excellence, and work-life balance.

    A day in the life...
    • PAM Platform Leadership: Serve as the primary technical expert for privileged access management solutions, including architecture, deployment, configuration, and optimization of password vaults and endpoint privilege management systems
    • Enterprise PAM Implementation: Design and execute large-scale PAM deployments across Windows, macOS, and Linux environments, ensuring seamless integration with existing infrastructure
    • Policy Development & Management: Create and maintain privilege elevation policies, credential rotation schedules, access request workflows, and governance rules aligned with security and compliance requirements
    • Integration & Automation: Integrate PAM solutions with ITSM platforms, SIEM tools, vulnerability scanners, directory services, and other security infrastructure to create comprehensive privileged access workflows
    • Troubleshooting & Support: Provide expert-level technical support for PAM platform issues, performance optimization, privileged account onboarding, and user access requests
    • Security & Compliance: Ensure PAM implementations meet PCI DSS and other requirements through proper audit trails, session recording and monitoring, and privileged account governance
    • Documentation & Training: Develop technical documentation, procedures, and training materials for internal teams and end users
    • Continuous Improvement: Monitor platform performance, evaluate new features, and implement best practices to enhance security posture and operational efficiency

    You own this if you have...
    Required Qualifications:
    • 4-6+ years of hands-on experience implementing and managing BeyondTrust PAM at the enterprise level; BeyondTrust certifications are preferred
    • Deep expertise in privileged account discovery, credential management, password rotation, session management, and access request workflows using BeyondTrust
    • Strong understanding of Windows Server administration, Active Directory, Group Policy, and PowerShell scripting
    • Experience with Linux/Unix system administration and shell scripting for cross-platform BeyondTrust PAM deployments
    • Knowledge of networking fundamentals including protocols, ports, certificates, load balancing, and security hardening
    • Experience with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes)
    • Understanding of identity and access protocols (SAML, OIDC, OAuth, SCIM, LDAP) and their integration with PAM solutions

    Preferred Qualifications:
    • Knowledge of DevOps practices, CI/CD pipelines, and Infrastructure as Code (Terraform, Ansible)
    • Familiarity with ITSM integration (ServiceNow, Jira) for ticket-driven privileged access workflows
    • Experience with SIEM integration and security monitoring platforms (Splunk, QRadar, etc.)
    • Understanding of zero trust architecture and least-privilege access principles
    • Experience with secrets management platforms (HashiCorp Vault, AWS Secrets Manager, Azure Key Vault)
    • Previous experience in retail technology environments or large-scale enterprise deployments
    • Industry certifications such as CISSP, CISM, or relevant cloud security certifications

    Technical Skills:
    • PAM Platforms: BeyondTrust
    • Operating Systems: Windows Server (2016/2019/2022), Windows 10/11, macOS, RHEL, Ubuntu, SUSE
    • Databases: SQL Server, MySQL, PostgreSQL, Oracle for PAM backend configuration
    • Virtualization: VMware vSphere, Hyper-V, cloud-based virtual machines
    • Scripting: PowerShell, Bash, Python for automation and integration tasks
    • Security Tools: Integration experience with vulnerability scanners, endpoint detection tools, and identity governance platforms
    $97.3 hourly 4d ago
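One concrete slice of the credential-rotation work described above is auditing which privileged accounts have drifted past the rotation policy. A small illustrative Python sketch (account names, dates, and the 30-day policy are hypothetical; a real deployment would pull last-rotation timestamps from the PAM platform's API):

```python
from datetime import date, timedelta

def accounts_due_for_rotation(accounts, max_age_days, today):
    """Return privileged accounts whose credentials are older than the
    rotation policy allows, sorted for stable reporting."""
    cutoff = today - timedelta(days=max_age_days)
    return sorted(name for name, last_rotated in accounts.items()
                  if last_rotated < cutoff)

# Hypothetical service accounts and their last rotation dates.
accounts = {
    "svc-backup": date(2024, 1, 5),
    "admin-db":   date(2024, 3, 1),
    "svc-deploy": date(2024, 3, 28),
}
print(accounts_due_for_rotation(accounts, max_age_days=30, today=date(2024, 4, 1)))
```

The same check is what a rotation schedule automates: anything returned here would be queued for forced rotation and flagged in the audit trail.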
  • Software Dev Engineer

    Collabera 4.5 company rating

    Data engineer job in Redmond, WA

    Title: Software Dev Engineer

    Required Skills & Qualifications:
    • 4-10 years of experience in software development
    • Strong proficiency in Python and backend development (APIs, business logic, integrations)
    • Experience with AWS Lambda, DynamoDB, and serverless architecture
    • Hands-on experience with React for frontend development
    • Proficient in scripting (Python, Bash, or similar)
    • Experience working with databases: DynamoDB preferred; SQL-based DBs or MongoDB also accepted
    • Solid understanding of REST APIs, microservices, and cloud-based application design

    Nice-to-Have Skills:
    • Experience with CI/CD pipelines (CodePipeline, GitHub Actions, Jenkins, etc.)
    • Knowledge of infrastructure-as-code tools such as CloudFormation, AWS CDK, or other IaC frameworks
    • Familiarity with containerization (Docker) is a plus

    Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, paid vacation time, paid parental leave, and paid holidays annually (as applicable).
    $103k-141k yearly est. 5d ago
  • Databricks Engineer

    Tata Consultancy Services 4.3 company rating

    Data engineer job in Seattle, WA

    Required Skills:
    • 5+ years of experience in data engineering or similar roles
    • Strong expertise in Databricks, Apache Spark, and PySpark
    • Proficiency in SQL, Python, and data modeling concepts
    • Experience with cloud platforms (Azure preferred; AWS/GCP is a plus)
    • Knowledge of Delta Lake, Lakehouse architecture, and partitioning strategies
    • Familiarity with data governance, security best practices, and performance tuning
    • Hands-on experience with version control (Git) and CI/CD pipelines

    Roles & Responsibilities:
    • Design and develop ETL/ELT pipelines using Azure Databricks and Apache Spark.
    • Integrate data from multiple sources into data lake and data warehouse environments.
    • Optimize data workflows for performance and cost efficiency in cloud environments (Azure/AWS/GCP).
    • Implement data quality checks, monitoring, and alerting for pipelines.
    • Collaborate with data scientists and analysts to provide clean, curated datasets.
    • Ensure compliance with data governance, security, and privacy standards.
    • Automate workflows using CI/CD pipelines and orchestration tools (e.g., Airflow, Azure Data Factory).
    • Troubleshoot and resolve issues in data pipelines and platform components.

    TCS Employee Benefits Summary: Discretionary annual incentive; comprehensive medical coverage (medical & health, dental & vision, disability planning & insurance, pet insurance plans); family support (maternal & parental leave); insurance options (auto & home insurance, identity theft protection); commuter benefits and certification & training reimbursement; time off (vacation, sick leave & holidays); legal assistance, 401(k) plan, performance bonus, college fund, and student loan refinancing.

    Salary Range: $100,000 - $140,000 a year
    $100k-140k yearly 5d ago
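The "partitioning strategies" called out above pay off through partition pruning: when a Delta/Lakehouse table's files are laid out by a partition column, a filter on that column lets the engine skip whole directories instead of scanning every file. A toy Python model of the mechanism (partition and file names are invented; Spark and Delta Lake do this automatically from the query predicate):

```python
# Hypothetical file layout of a table partitioned by event_date.
files = {
    "event_date=2024-01-01": ["part-000.parquet", "part-001.parquet"],
    "event_date=2024-01-02": ["part-002.parquet"],
    "event_date=2024-01-03": ["part-003.parquet", "part-004.parquet"],
}

def prune(files, column, value):
    """Keep only the files in partitions matching column=value."""
    wanted = f"{column}={value}"
    return [f for part, fs in files.items() if part == wanted for f in fs]

# A query filtered to one day touches one partition, not five files.
print(prune(files, "event_date", "2024-01-02"))
```

The design consequence is the one interviewers usually probe: partition on a low-cardinality column your queries actually filter by, or pruning never triggers and you pay a small-files penalty instead.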
  • Platform Engineer

    Info Way Solutions 4.3 company rating

    Data engineer job in Seattle, WA

    Key Responsibilities
    • PAM Architecture & Implementation: Lead architecture, deployment, configuration, and optimization of enterprise PAM solutions (e.g., BeyondTrust, CyberArk). Design and execute PAM implementations including password vaulting, endpoint privilege management, session recording, and credential rotation.
    • Policy & Governance: Develop and maintain privilege elevation policies, governance workflows, and access request processes. Enforce least-privilege access controls and compliance standards (PCI DSS, internal policies).
    • Integration & Automation: Integrate PAM with ITSM (e.g., ServiceNow), SIEM tools (e.g., Splunk), vulnerability scanners, directory services (e.g., Active Directory/LDAP), and other security systems. Build automation for onboarding, credential rotation, and auditing.
    • Support & Troubleshooting: Provide expert-level technical support for PAM incidents, performance issues, and privileged account onboarding. Analyze log data and implement improvements to performance and security posture.
    • Documentation & Training: Develop technical documentation, runbooks, procedures, and training materials for internal teams and end users. Educate stakeholders on PAM best practices.
    $97k-123k yearly est. 3d ago
  • Azure & GCP Engineer

    Amroute

    Data engineer job in Seattle, WA

    Job Title: Azure & GCP Engineer
    Duration: 6 months

    We are seeking a highly skilled and experienced Azure and Google Cloud Engineer to join our team. The ideal candidate will be responsible for troubleshooting and managing Azure as well as Google Cloud solutions that drive business transformation. This role requires a deep understanding of cloud services, strong architectural principles, and the ability to translate business requirements into scalable, reliable, and secure cloud solutions.

    Key Responsibilities:
    • Technical Leadership: Lead and mentor a team of cloud engineers and developers, providing guidance on best practices and technical issues. Act as a subject matter expert for Azure, staying current with the latest services, tools, and best practices. Mentor junior team members and foster a culture of continuous learning and innovation.
    • Cloud Architecture: Develop comprehensive cloud architectures leveraging GCP services such as Compute Engine, Kubernetes Engine, BigQuery, Pub/Sub, Cloud Functions, and others. Design scalable, secure, and cost-effective cloud solutions to meet business and technical requirements.
    • Cloud Strategy and Roadmap: Define a long-term cloud strategy for the organization, including migration plans, optimization, and governance frameworks. Assess and recommend best practices for cloud-native and hybrid cloud solutions.
    • Solution Implementation: Implement CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, Cloud Deployment Manager). Oversee the deployment, management, and maintenance of cloud applications. Automate deployment and configuration management using IaC tools such as ARM templates, Terraform, or Azure Bicep. Develop disaster recovery and business continuity plans for cloud services.
    • Collaboration and Communication: Work closely with development, operations, security, and business teams to understand requirements, ensure seamless integration and operation of cloud systems, and provide cloud solutions that meet stakeholder needs.
    • Performance Optimization: Optimize cloud services for performance, scalability, and cost efficiency. Monitor and resolve issues related to cloud infrastructure, applications, and services, ensuring they meet service level agreements (SLAs). Implement cost management strategies to optimize cloud spending.
    • Security and Compliance: Ensure that all cloud solutions comply with security policies and industry regulations. Implement robust security measures, including identity and access management (IAM), network security, and data protection.
    • Continuous Improvement: Drive continuous improvement initiatives to enhance the performance, reliability, and scalability of cloud solutions. Participate in architecture reviews and provide recommendations for improvements.
    • Documentation and Reporting: Create and maintain technical documentation, architectural diagrams, and operational runbooks. Provide regular updates and reports to stakeholders on project progress, risks, and outcomes.

    Required Skills and Qualifications:
    • Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience); Master's degree preferred.
    • Experience: 7+ years of experience in cloud architecture, including 4+ years with GCP and a minimum of 5 years focused on Microsoft Azure. Proven expertise in designing and deploying large-scale cloud solutions. Experience with application modernization using containers, microservices, and serverless architectures.
    • GCP Skills: Proficiency in GCP services (e.g., BigQuery, Cloud Spanner, Kubernetes Engine, Cloud Run, Dataflow); strong experience with Infrastructure as Code (e.g., Terraform, Deployment Manager); knowledge of DevOps practices, CI/CD pipelines, and container orchestration tools; familiarity with relational and NoSQL databases and data pipelines.
    • Azure Skills: Extensive experience with Azure services including Azure Virtual Machines, Azure App Services, Azure Functions, Azure Kubernetes Service (AKS), and Azure SQL Database; proficiency in Azure networking, storage, and database services; strong skills in IaC tools such as ARM templates, Terraform, or Azure Bicep; experience with CI/CD pipelines using Azure DevOps or similar tools; deep understanding of cloud security best practices and tools including Azure Security Center, Azure Key Vault, and Azure Policy.
    • Compliance: Familiarity with compliance frameworks such as GDPR, HIPAA, and SOC 2.
    • Scripting: Proficiency in scripting languages like PowerShell, Python, or Bash for automation tasks; experience with configuration management tools like Ansible or Chef is a plus.
    • Certifications: GCP certifications such as Professional Cloud Architect or Professional Data Engineer are highly preferred; relevant Azure certifications such as Microsoft Certified: Azure Solutions Architect Expert or Azure DevOps Engineer Expert.
    $87k-126k yearly est. 1d ago
  • Founding Applied AI Engineer

    Prime Team Partners

    Data engineer job in Seattle, WA

    Founding Engineer - Early-Stage AI Startup

    Join a venture-backed, seed-stage, fast-growing team led by repeat founders with a track record of successful AI company exits. They are building foundational AI infrastructure to solve one of the most pressing challenges in modern workflows: fragmented context. They help teams work smarter and deliver faster by building continuity between humans and AI.

    Why You'll Want to Join
    • Be among the first engineering hires, shaping both the product and company culture alongside experienced founders and a world-class technical team.
    • Work on cutting-edge retrieval and memory systems (embeddings, vector search, graph-based architectures) powering context-aware AI applications.
    • Enjoy significant ownership and autonomy: drive projects from ideation to production, experiment with new approaches, and influence technical direction.
    • Competitive compensation and equity, with flexibility to tailor your package based on what matters most to you.
    • Comprehensive benefits, including generous healthcare coverage for you and your dependents, and flexible PTO.
    • Hybrid work model: collaborate in person with the founding team in Seattle, with flexibility for remote days.
    • Opportunity to earn a "Founding Engineer" title for the right candidate, reflecting your impact and early contributions.
    • Backed by top-tier investors and fresh off a successful fundraise, you'll have the resources and support to build something meaningful.

    The Role
    • Architect and implement core AI systems for knowledge extraction, retrieval, and orchestration.
    • Design and optimize retrieval and memory infrastructure (e.g., GraphRAG, vector search) for low-latency, high-precision context delivery.
    • Build agentic systems for autonomous reasoning, synthesis, and evaluation.
    • Develop prompt and orchestration frameworks connecting multiple models, tools, and data sources.
    • Prototype and validate new product directions, collaborating closely with founders and cross-functional teammates.

    The Experience We Seek
    • Strong Python engineering skills and experience building scalable, maintainable systems.
    • Deep knowledge of retrieval architectures (embeddings, vector DBs, hybrid/graph search).
    • Hands-on experience with LLM orchestration frameworks (LangChain, LlamaIndex, or similar).
    • Proven ability to build and tune LLM-based agents for complex reasoning tasks.
    • Startup DNA: thrive in ambiguity, take ownership, and move fast.
    • Passion for building in the AI space and excitement about joining an early-stage team.

    If you're ready to have a massive impact, learn from proven founders, and help define the future of human-AI collaboration, we'd love to connect.

    Prime Team Partners is an equal opportunity employer. Prime Team Partners does not discriminate on the basis of race, color, religion, national origin, pregnancy status, gender, age, marital status, disability, medical condition, sexual orientation, or any other characteristic protected by applicable state or federal civil rights laws. For contract positions, hired candidates will be employed by Prime Team for the duration of the contract period and are eligible for our company benefits. Benefits include medical, dental, and vision; employees are covered at 75%. We offer a 401(k) after 6 months; we do not provide paid holidays or PTO; sick time is offered in accordance with local laws.
    $87k-126k yearly est. 5d ago
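The retrieval stack this posting describes (embeddings plus vector search) reduces, at its core, to ranking documents by cosine similarity between embedding vectors. A minimal pure-Python sketch with tiny made-up 3-dimensional "embeddings" (real systems use model-generated vectors of hundreds of dimensions and an index such as FAISS or a vector DB instead of a linear scan):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, docs, k=2):
    """Rank documents by similarity of their embedding to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:k]]

# Hypothetical document embeddings.
docs = [
    {"id": "meeting-notes", "vec": [0.9, 0.1, 0.0]},
    {"id": "design-doc",    "vec": [0.7, 0.6, 0.2]},
    {"id": "changelog",     "vec": [0.0, 0.2, 0.9]},
]
print(top_k([1.0, 0.2, 0.0], docs))
```

Hybrid and graph retrieval (the GraphRAG mentioned above) layer keyword scores or graph traversals on top of this same similarity ranking.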
  • Kettle Engineer

    Soho Square Solutions

    Data engineer job in Seattle, WA

    Job Title: Kettle Engineer
    Type: Contract

    We're seeking a Kettle Engineer to design, operate, and continuously improve our Pentaho Data Integration (PDI/Kettle) platform and the data movement processes that underpin business-critical workflows. You will own end-to-end lifecycle management, from environment build and configuration to orchestration, monitoring, and support, partnering closely with application teams, production operations, and data stakeholders. The ideal candidate combines strong hands-on Kettle expertise with solid SQL, automation, and production support practices in a fast-moving, highly collaborative environment.

    Primary Responsibilities
    • Platform ownership: Install, configure, harden, and upgrade Kettle/PDI components (e.g., Spoon) across dev/prod.
    • Process engineering: Migrate, re-engineer, and optimize existing jobs and transformations.
    • Reliability & support: Document all workflows, including ownership and escalation protocols; conduct knowledge transfer with the Automation and Application Support teams.
    • Observability: Implement proactive monitoring, logging, and alerting for the Kettle platform, including all dependent processes.
    • Collaboration: Partner with application, data, and infrastructure teams to deliver improvements to existing designs.

    Required Qualifications
    • Kettle/PDI expertise: Experience installing and configuring a Kettle instance (server and client tools, repositories, parameters, security, and upgrades).
    • Kettle/PDI expertise: Experience creating, maintaining, and supporting Kettle processes (jobs/transformations, error handling, recovery, and performance tuning).
    • 4+ years of hands-on SQL (writing, diagnosing, and optimizing queries).
    • Strong communication skills for both technical and non-technical audiences; effective at documenting and sharing knowledge.

    Preferred Qualifications
    • Experience integrating Kettle with cloud platforms (AWS and/or Azure); familiarity with containers or Windows/Linux server administration.
    • Exposure to monitoring/observability stacks (e.g., Datadog, CloudWatch, or similar).
    • Scripting/automation for operations (Python, PowerShell, or Bash); experience with REST APIs within Kettle.
    • Background in financial services or other regulated/mission-critical environments.

    Key Outcomes (First 90 Days)
    • Stand up or validate a hardened Kettle environment with baseline monitoring and runbooks.
    • Migrate at least two high-value Kettle workflows using shared templates and standardized error handling.
    $87k-126k yearly est. 5d ago
  • AI Prompt Engineer

    Ascendion

    Data engineer job in Redmond, WA

    Ascendion is a full-service digital engineering solutions company. We make and manage software platforms and products that power growth and deliver captivating experiences to consumers and employees. Our engineering, cloud, data, experience design, and talent solution capabilities accelerate transformation and impact for enterprise clients. Headquartered in New Jersey, our workforce of 6,000+ Ascenders delivers solutions from around the globe. Ascendion is built differently to engineer the next. Ascendion | Engineering to elevate life We have a culture built on opportunity, inclusion, and a spirit of partnership. Come, change the world with us: Build the coolest tech for the world's leading brands Solve complex problems - and learn new skills Experience the power of transforming digital engineering for Fortune 500 clients Master your craft with leading training programs and hands-on experience Experience a community of change makers! Join a culture of high-performing innovators with endless ideas and a passion for tech. Our culture is the fabric of our company, and it is what makes us unique and diverse. The way we share ideas, learning, experiences, successes, and joy allows everyone to be their best at Ascendion. About the Role: Title: Prompt 2 Location: Onsite Summary: The main function of an AI Engineer - Prompt is to design and craft conversational prompts, messages, and responses for AI chatbots, virtual assistants, or customer service applications. The role revolves around creating prompts that align with organizational objectives, provide a seamless user experience, and ensure effective interactions between users and AI systems. This entails developing a library of pre-defined prompts tailored to various scenarios, industries, and user needs. 
Job Responsibilities: Fine-tune and improve a variety of sophisticated software implementation projects Gather and analyze system requirements, document specifications, and develop software solutions to meet client needs Analyze and review enhancement requests and specifications Implement system software and customize to client requirements Prepare detailed software specifications and test plans Code new programs to client's specifications and create test data for testing Modify existing programs to new standards and conduct unit testing of developed programs Create migration packages for system testing, user testing, and implementation Provide quality assurance reviews Perform post-implementation validation of software and resolve any bugs found during testing. Typical Day in the Role Purpose of the Team: The purpose of this team is to build and operate the middle services that power our AI-powered assistant experiences in Word, Excel, and PowerPoint, with a focus on prompt evaluation and related automation for those experiences. Key projects: This role will contribute to the AI-powered assistant within Office apps, including evaluations for prompts, manual/automated testing, test environment setup, and coding activities needed to support these evaluations. Candidate Requirements Best vs. Average: The ideal resume would contain prior LLM evaluation experience, plus a data science/experimentation background; Python experience is called out as an A+. 
Qualifications: Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required 2-4 years' experience required A solid foundation in computer science, with strong competencies in data structures, algorithms, and software design Large systems software design and development experience Experience performing in-depth troubleshooting and unit testing with both new and legacy production systems Experience in programming and experience with problem diagnosis and resolution Top 3 Must-Have HARD Skills Ability to set up synthetic tenant data and data ingestion - test accounts, generate grounding data, configuration as code Maintain, validate, and automate the creation of test datasets for an LLM evaluation system Integrate evaluation quality checks in our build and deployment pipeline, ensuring performance and scalability Salary Range: $156,000 - $166,000 Annually - Factors that may affect pay within this range may include geography/market, skills, education, experience and other qualifications of the successful candidate. Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: [medical insurance] [dental insurance] [vision insurance] [9-10 days of paid time off] Want to change the world? Let us know. Tell us about your experiences, education, and ambitions. Bring your knowledge, unique viewpoint, and creativity to the table. Let's talk!
    $87k-126k yearly est. 2d ago
  • Software Engineer

    Insight Global

    Data engineer job in Redmond, WA

    Client: Fortune 500 Company Job Title: Software Engineer IV: AR/VR Prototyping and Engineering Pay Rate: $75/hr - $80/hr Duration: 1-2 years W2 Job Responsibilities: Build, test, and refine AR/VR research prototypes (e.g., VR experiences that test a new interaction technique or interface) based on the ideas of researchers Build and maintain software prototypes that use inputs from different device and server sources and outputs. Collaborate with researchers and engineers to build novel AR/VR prototypes and algorithms. Collaborate with researchers to run experiments on prototype interactions with end-users. Must Haves: Bachelor's or Master's degree in computer science or related fields. 3+ years of experience building in Unity and C# (ideally for AR/VR) 3+ years of experience utilizing general software engineering skills, including debugging, version control, logging, documentation, code reviewing, etc. Plusses: Experience working with real-time signal processing, sensor fusion, and/or machine learning solutions using real-time continuous data streams (e.g., eye tracking, hand/body tracking, EMG, etc.), integrating them into interactive systems Familiarity with AR/VR/MR technologies. 2+ years of experience programming in Python Experience working with sensors, proficient in machine learning models, wearable devices, input device signals/real-time sensor data, and/or computer vision models Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
    $75 hourly 2d ago
  • BE Software Engineer (Block Storage)

    Bayside Solutions 4.5company rating

    Data engineer job in Seattle, WA

    Backend Software Engineer (Block Storage) W2 Contract Salary Range: $114,400 - $135,200 per year We are looking for collaborative, curious, and pragmatic Software Engineers to be part of this innovative team. You will be able to shape the product's features and architecture as it scales by orders of magnitude. Being part of our Cloud Infrastructure organization opens the door to exerting cross-functional influence and making a more significant organizational impact. Requirements and Qualifications: Proficient with UNIX/Linux Coding skills in one or more of these programming languages: Rust, C++, Java, or C# Experience with scripting languages (Bash, Python, Perl) Excellent knowledge of software testing methodologies & practices 2 years of professional software development experience Strong ownership and track record of delivering results Excellent verbal and written communication skills Bachelor's Degree in Computer Science, an engineering-related field, or equivalent related experience. Preferred Qualifications: Proficiency in Rust Experience with high-performance asynchronous IO systems programming Knowledge of distributed systems Desired Skills and Experience Proficient with UNIX/Linux, Rust, C++, Java, C#, Bash, Python, Perl, software testing methodologies, professional software development, strong ownership, results-driven delivery, excellent communication skills, computer science or engineering degree, high-performance asynchronous IO systems programming, distributed systems Bayside Solutions, Inc. is not able to sponsor any candidates at this time. Additionally, candidates for this position must qualify as a W2 candidate. Bayside Solutions, Inc. may collect your personal information during the position application process. Please reference Bayside Solutions, Inc.'s CCPA Privacy Policy at *************************
    $114.4k-135.2k yearly 5d ago
  • Software Engineer

    Talent Software Services 3.6company rating

    Data engineer job in Redmond, WA

    Are you an experienced Software Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Software Engineer to work at their company in Redmond, WA. The main function of a Lab/Test Engineer at this level is to apply configuration skills at an intermediate to high level. The Test Engineer will analyze, design, and develop test plans and should be familiar with at least one programming language. We're on the lookout for a contract Engineer with extensive experience in configuring and testing hardware devices across Windows Server and Ubuntu Server platforms. The ideal candidate will not only be technically adept but also possess strong analytical skills, capable of producing comprehensive and detailed reports. Proficiency in scripting languages is essential. The role involves deploying and managing test machines, refining test plans, executing test cases, performing hardware diagnostics, troubleshooting issues, and collaborating closely with the development team to advance the functionality of hardware systems. Experience with CI/CD pipelines and C++ and Rust development will be considered a significant asset. Primary Responsibilities/Accountabilities: Perform repeatable testing procedures and processes. Verify triggers, stored procedures, referential integrity, and hardware product or system specifications. Interpret and modify code as required, which may include C/C++, C#, batch files, make files, Perl scripts, queries, stored procedures, and/or triggers. Identify and define project team quality and risk metrics. Provide assistance to other testers. 
Design and develop robust automated test harnesses with a focus on Application/System/Inter-System level issues. Perform job functions within the scope of application/system performance, threading issues, bottleneck identification, writing small-footprint and less intrusive code for critical code testing, tackling system/application intermittent failures, etc. Purpose of the Team: The purpose of this team is to focus on security hardware and intellectual property. Their work is primarily open source, with some potential for internal code review. Key projects: This role will contribute to supporting development and testing for technologies deployed in the Azure fleet. Typical task breakdown and operating rhythm: The role will consist of 10% meetings, 10% reporting, and 80% heads-down work (developing and testing). Qualifications: Years of Experience Required: 8-10+ overall years of experience in the field. Degrees or certifications required: N/A Best vs. Average: The ideal resume would contain Rust experience and experience with open-source projects. Performance Indicators: Performance will be assessed based on quality of work, meeting deadlines, and flexibility. Minimum 8+ years of test experience with data center/server hardware. Minimum 8+ years of development experience with C++ (and Python). Minimum 2+ years of experience with an understanding of CI/CD and ADO pipelines. Software testing experience in Azure Cloud/Windows/Linux server environments required. Ability to read and write at least one programming language such as C#, C/C++, SQL, etc.; Rust is a plus! Knowledge of software quality assurance practices, with strong testing aptitude. Knowledge of personal computer hardware is required, as is knowledge of deploying and managing hosts and virtual test machines. Knowledge of internet protocols and networking fundamentals preferred. Must have a solid understanding of the software development cycle. 
Demonstrated project management ability required. Experience with CI/CD pipelines. Bachelor's degree in Computer Science required; some business/functional knowledge and/or industry experience preferred. Preferred: Database programming experience (e.g., SQL Server, Sybase, Oracle, Informix, and/or DB2). Software testing experience in a Web-based or Windows client/server environment. Development and/or database administration experience using a database product.
    $109k-145k yearly est. 4d ago
  • Senior Software Engineer (Azure Databricks, DLT Pipelines, Terraform Dev, CD/CI, Data Platform) Contract at Bellevue, WA

    Red Oak Technologies 4.0company rating

    Data engineer job in Bellevue, WA

    Senior Software Engineer (Azure Databricks, DLT Pipelines, Coding, CI/CD, Data Platform & Data Integration) Contract at Bellevue, WA Must Have Experience: Hands-on experience with Azure Databricks/DLT Pipelines (Delta Live Tables) Good programming skills - C#, Java, or Python CI/CD experience Data platform/data integration experience The Role / Responsibilities The Senior Software Engineer is a hands-on engineer who works from design through implementation of large-scale, data-centric systems for the MA Platform. This is a thought leadership role in the Data Domain across all of the Client's Analytics, with the expectation that the candidate will demonstrate and propagate best practices and processes in software development. The candidate is expected to drive work independently with minimal supervision. • Design, code, test, and develop features to support large-scale data processing pipelines for our multi-cloud SaaS platform, with good quality, maintainability, and end-to-end ownership. • Define and leverage data models to understand cost drivers and create concrete action plans that address platform concerns on data. Qualifications • 5+ years of experience in building and shipping production-grade software systems or services, with one or more of the following: Distributed Systems, large-scale data processing, data storage, Information Retrieval and/or Data Mining, Machine Learning fundamentals. • BS/MS in Computer Science or equivalent industry experience. • Experience building and operating online services and fault-tolerant distributed systems at internet scale. • Demonstrable experience shipping software and internet-scale services using GraphQL/REST API(s) on Microsoft Azure and/or Amazon Web Services (AWS) cloud. • Experience writing code in C++/C#/Java using agile and test-driven development (TDD). • 3+ years in cloud service development - Azure or AWS services. 
Preferred Qualifications • Excellent verbal and written communication skills (to engage with both technical and non-technical stakeholders at all levels). • Familiarity with Extract Transform Load (ETL) pipelines, data modeling, and data engineering; past ML experience is a plus. • Experience in Databricks and/or Microsoft Fabric will be an added plus. • Hands-on experience using distributed computing platforms like Apache Spark, Apache Flink, Apache Kafka, or Azure EventHub.
    $125k-176k yearly est. 1d ago
  • Senior UX Developer

    Centific

    Data engineer job in Bellevue, WA

    Role: UX Developer - Frontend Systems & Design Stack: TypeScript, Vite, React Employment Type: Full-Time About Your Role We're looking for a UX-focused Frontend Developer who can bring clean, scalable, and intuitive interfaces to life with minimal design oversight. Our mission is deeply impactful, and your work will shape how operators interact with AI and simulation systems designed to safeguard our country. You'll help define reusable component systems and build UX patterns that withstand complexity. This role is perfect for someone who codes with craft and cares about every user interaction, from micro-interactions to performance. Build the interface to the future. Your code will not only enable national security but drive how humans collaborate with AI in critical missions. Key Responsibilities Build and maintain reusable, high-quality UI components Translate complex agent behaviors and simulation results into usable interfaces Own frontend architecture decisions and interaction paradigms Collaborate closely with backend and AI teams to integrate data-rich UIs Ideal Candidate: Deep expertise in TypeScript and modern frontend tooling (Vite, React) Passion for interaction design and performance at scale Self-directed with strong product instincts and attention to UX details Eager to work on purpose-driven, impactful technology Benefits: Comprehensive healthcare, dental, and vision coverage 401k plan Paid time off (PTO) And more! Company Overview: Centific is a frontier AI data foundry that curates diverse, high-quality data, using our purpose-built technology platforms to empower the Magnificent Seven and our enterprise clients with safe, scalable AI deployment. Our team includes more than 150 PhDs and data scientists, along with more than 4,000 AI practitioners and engineers. 
We harness the power of an integrated solution ecosystem, comprising industry-leading partnerships and 1.8 million vertical domain experts in more than 230 markets, to create contextual, multilingual, pre-trained datasets; fine-tuned, industry-specific LLMs; and RAG pipelines supported by vector databases. Our zero-distance innovation™ solutions for GenAI can reduce GenAI costs by up to 80% and bring solutions to market 50% faster. Our mission is to bridge the gap between AI creators and industry leaders by bringing best practices in GenAI to unicorn innovators and enterprise customers. We aim to help these organizations unlock significant business value by deploying GenAI at scale, helping to ensure they stay at the forefront of technological advancement and maintain a competitive edge in their respective markets. Learn more about us at www.centific.com Centific is an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, ancestry, citizenship status, age, mental or physical disability, medical condition, sex (including pregnancy), gender identity or expression, sexual orientation, marital status, familial status, veteran status, or any other characteristic protected by applicable law. We consider qualified applicants regardless of criminal histories, consistent with legal requirements.
    $110k-152k yearly est. 4d ago
  • Senior Frontend Developer

    Luxoft

    Data engineer job in Seattle, WA

    We are supporting a long-term strategic initiative to modernize and scale the frontend technology stack for an industrial platform - a set of enterprise web and mobile applications in the electric utility space. This permanent, full-time role focuses on designing and evolving a composable user interface architecture with a strong emphasis on reusability, performance, and future-ready web standards. Responsibilities: Lead the definition and evolution of UI architecture for web and mobile applications. Provide technical and architectural guidance to development teams in an agile environment. Build and maintain core UI frameworks and component libraries (Angular-centric). Collaborate with backend and data teams to optimize API usage and performance. Establish UI patterns, coding standards, and governance models. Mentor engineers on architectural best practices and performance optimization. Work closely with UX designers to ensure design-to-implementation alignment. Drive adoption of modern UI technologies, including micro-frontends and low-code/no-code configuration paradigms. Mandatory Skills Description: Extensive experience with UI architecture and component library development (Angular and modern frontend frameworks). Strong mastery of JavaScript/TypeScript and responsive, scalable frontend design. Hands-on experience with micro-frontend patterns and composable UIs. Familiarity with GraphQL integration and performance optimization techniques. 10+ years of relevant experience in UI/software engineering and architecture. Nice-to-Have Skills Description: Experience with other frontend ecosystems (React, React Native). Knowledge of SSR/SSG techniques and advanced bundling strategies (e.g., Module Federation). Familiarity with CI/CD, containerization, and cloud-based delivery models. 
Prior involvement with design systems and scalable UI component governance
    $110k-152k yearly est. 4d ago

Learn more about data engineer jobs

How much does a data engineer earn in Puyallup, WA?

The average data engineer in Puyallup, WA earns between $80,000 and $151,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Puyallup, WA

$110,000
Job type you want
Full Time
Part Time
Internship
Temporary