
Data engineer jobs in Washington

- 3,822 jobs
  • Data Engineer II: 25-07190

    Akraya, Inc. 4.0 company rating

    Data engineer job in Seattle, WA

    Process Skills: Data Pipeline (Proficient), SQL (Expert), Python (Proficient), ETL (Expert), QuickSight (Intermediate)
    Contract Type: W2 Only
    Duration: 5+ months (high possibility of conversion)
    Pay Range: $65.00 - $70.00 per hour

    Summary: Join the Sustain.AI team as a Data Engineer to lead the pivotal migration of sustainability data to modern AWS infrastructure, fostering a net-zero future. This role involves transforming and automating sustainability workflows into data-driven processes, significantly impacting our environmental goals. Your work will directly contribute to reducing the carbon footprint by creating an automated, scalable data infrastructure and ensuring precise data quality and consistency.

    Key Responsibilities:
    • Migrate ETL jobs from legacy systems to AWS, ensuring no operational disruptions.
    • Set up and manage AWS data services (Redshift, Glue) and orchestrate workflows.
    • Transform existing workflows into scalable, efficient AWS pipelines with robust validation.
    • Collaborate with various teams to understand and fulfill data requirements for sustainability metrics.
    • Document the new architecture, implement data quality checks, and communicate with stakeholders on progress and challenges.

    Must-Have Skills:
    • Advanced proficiency in ETL pipelines and AWS data services (Redshift, S3, Glue).
    • Expertise in SQL and experience with Python or Spark for data transformation.
    • Proven experience overseeing data migrations with minimal disruption.

    Industry Experience Required: Experience in sustainability data, carbon accounting, or environmental metrics is highly preferred. Familiarity with large-scale data infrastructure projects in complex enterprise environments is essential.

    About Akraya: Akraya is an award-winning IT staffing firm consistently recognized for our commitment to excellence and a thriving work environment. Most recently, we were recognized in Inc.'s Best Workplaces 2024, Silicon Valley's Best Places to Work by the San Francisco Business Journal (2024), and Glassdoor's Best Places to Work (2023 & 2022)! As staffing solutions providers for Fortune 100 companies, Akraya's industry recognitions solidify our leadership position in the IT staffing space. We don't just connect you with great jobs, we connect you with a workplace that inspires! Join Akraya today: browse our open positions, let us lead you to your dream career, and experience the Akraya difference.
    $65-70 hourly 5d ago
  • Flight Data Translation Engineer

    Boeing 4.6 company rating

    Data engineer job in Seattle, WA

    At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful, and inclusive, with great opportunity for professional growth. Find your future with us. We are the Boeing Global Services (BGS) Engineering team, creating and implementing innovative technologies that make the impossible possible and enable the future of aerospace. We provide engineering design and support, including aftermarket modifications, and are innovating to make product and services safety even stronger. Join us and put your passion, determination, and skill to work building the future! #TheFutureIsBuiltHere #ChangeTheWorld

    Boeing Seattle is seeking a Flight Data Engineer in Seattle, Washington to automate flight data translations within Boeing Global Services, reporting to the Manager of Prognostics Development out of the Seattle, WA office. The Flight Data Translation Engineer will be responsible for the delivery of cutting-edge flight data analytics products to commercial aviation customers. As part of an integrated product team, the successful candidate will support product feature development and life support. In this role, the Flight Data Translation Engineer will work with a team of highly motivated aviation SMEs, software architects, developers, and data scientists, and will work with portfolio leadership, development teams, and stakeholders to establish and drive the implementation of flight data management for the success of each product and initiative.

    Position Responsibilities:
    • Leverage the power of data and engagement insights to drive advancements in flight safety, fleet reliability, and sustainability, while executing on the Boeing Flight Data Analytics value and vision
    • Collaborate closely with engineering, DevOps, product management, and airline customers in an agile environment
    • Ensure high-quality aviation domain data is available for advanced aviation data insights that benefit both our internal teams and customers
    • Gain a deep understanding of our customers, their flight operations, and their analytics needs; you will play a pivotal role in finding innovative solutions to meet their evolving needs
    • Configure, test, and validate aircraft safety, maintenance, sustainability, and reliability data models for use in product development, and create high-quality documentation required to support these tasks
    • Define and validate experiments using machine learning, as assigned by the Lead Data Architect or Product Management
    • Ensure data integrity across the entire data pipeline by leveraging expert-level knowledge of ARINC 717, 767, and other protocols, as applied to different aircraft types, including Boeing and Airbus
    • Proactively identify and troubleshoot data integrity issues to ensure products are providing an exceptional customer experience
    • Employ expertise in database and schema design, anticipating the data questions and data narratives that stakeholders may require
    • Perform hands-on technical work, including working with data, building utilities and tooling, and providing flight data and integrated dataset support to both the product team and customers
    • Work closely with the lead data architect and product managers to strategically shape the future direction of our analytics product
    • Undertake other tasks, projects, and initiatives as needed

    This position must meet export control compliance requirements. To meet export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.15 is required. "U.S. Person" includes U.S. citizen, lawful permanent resident, refugee, or asylee.

    Basic Qualifications (Required Skills/Experience):
    • Bachelor's degree in engineering, computer science, mathematics, physics, or chemistry
    • 1+ years of progressive experience in software development and design
    • Experience working with aerospace systems ARINC standards (ARINC 717, 767)
    • 5+ years of related work experience or an equivalent combination of technical education and experience

    Preferred Qualifications (Education/Experience):
    • 9+ years of related work experience or an equivalent combination of technical education and experience
    • Working experience with cloud solutions
    • Experience with programming (e.g., Java or Python) and SQL
    • Experience in data science and/or data analytics
    • Advanced experience in business intelligence tools, statistics, and data modeling
    • Experience working in a global technology organization and managing many stakeholders
    • Experience with airline safety, airline maintenance engineering, avionics, or aircraft flight data recording
    • Experience with Airbus and Boeing aircraft operations

    Drug Free Workplace: Boeing is a Drug Free Workplace where post-offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.

    Total Rewards and Pay Transparency: At Boeing, we strive to deliver a Total Rewards package that will attract, engage, and retain top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities. The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work. The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements. Pay is based upon candidate experience and qualifications, as well as market and business considerations.

    Summary pay range: Career $114,750 - $155,250; Expert $138,550 - $187,450. Applications for this position will be accepted until Dec. 15, 2025.

    Export Control Requirements: This position must meet export control compliance requirements. To meet export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.15 is required. "U.S. Person" includes U.S. citizen, lawful permanent resident, refugee, or asylee. Export Control Details: US based job, US Person required.

    Education: Bachelor's Degree or Equivalent Required
    Relocation: This position offers relocation based on candidate eligibility.
    Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
    Shift: This position is for 1st shift.

    Equal Opportunity Employer: Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status, or other characteristics protected by law.
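    Flight data translation of the kind described above typically means unpacking fixed-width words from recorder frames before applying a data-frame map. A minimal sketch of pulling 12-bit, MSB-first words from a packed ARINC-717-style subframe; the packing layout shown and the altitude scale/offset are illustrative assumptions, not taken from any Boeing data-frame map:

```python
def word_at(frame: bytes, index: int) -> int:
    """Return the 12-bit word at `index` from a subframe where words are
    packed back-to-back, most significant bit first."""
    bit = index * 12
    byte, offset = divmod(bit, 8)
    # Pull three bytes so any 12-bit window is covered, then shift it out.
    chunk = int.from_bytes(frame[byte:byte + 3].ljust(3, b"\x00"), "big")
    return (chunk >> (24 - 12 - offset)) & 0xFFF

def decode_altitude(raw: int) -> float:
    """Apply a (hypothetical) scale/offset from a data-frame map: ft = raw * 4 - 1000."""
    return raw * 4 - 1000

# Two words, 0xABC and 0x123, packed into three bytes.
frame = bytes([0xAB, 0xC1, 0x23])
```

A real translator layers aircraft-type-specific frame maps on top of exactly this kind of bit arithmetic, which is why the posting stresses protocol knowledge per aircraft type.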
    $138.6k-187.5k yearly 1d ago
  • Staff Data Engineer

    Eton Solution 3.7 company rating

    Data engineer job in Bellevue, WA

    *Immigration sponsorship is not available for this role*

    We are looking for an experienced Data Engineer (8+ years of experience) with deep expertise in Flink SQL to join our engineering team. This role is ideal for someone who thrives on building robust real-time data processing pipelines and has hands-on experience designing and optimizing Flink SQL jobs in a production environment. You'll work closely with data engineers, platform teams, and product stakeholders to create scalable, low-latency data solutions that power intelligent applications and dashboards.

    Key Responsibilities:
    • Design, develop, and maintain real-time streaming data pipelines using Apache Flink SQL.
    • Collaborate with platform engineers to scale and optimize Flink jobs for performance and reliability.
    • Build reusable data transformation logic and deploy it to production-grade Flink clusters.
    • Ensure high availability and correctness of real-time data pipelines.
    • Work with product and analytics teams to understand requirements and translate them into Flink SQL jobs.
    • Monitor and troubleshoot job failures, backpressure, and latency issues.
    • Contribute to internal tooling and libraries that improve Flink developer productivity.

    Required Qualifications:
    • Deep hands-on experience with Flink SQL and the Apache Flink ecosystem.
    • Strong understanding of event-time vs. processing-time semantics, watermarks, and state management.
    • 3+ years of experience in data engineering, with a strong focus on real-time/streaming data.
    • Experience writing complex Flink SQL queries, UDFs, and windowing operations.
    • Proficiency in working with streaming data formats such as Avro, Protobuf, or JSON.
    • Experience with messaging systems like Apache Kafka or Pulsar.
    • Familiarity with containerized deployments (Docker, Kubernetes) and CI/CD pipelines.
    • Solid understanding of distributed system design and performance optimization.

    Nice to Have:
    • Experience with other stream processing frameworks (e.g., Spark Structured Streaming, Kafka Streams).
    • Familiarity with cloud-native data stacks (AWS Kinesis, GCP Pub/Sub, Azure Event Hubs).
    • Experience building internal tooling for observability or schema evolution.
    • Prior contributions to the Apache Flink community or similar open-source projects.

    Why Join Us:
    • Work on cutting-edge real-time data infrastructure that powers critical business use cases.
    • Be part of a high-caliber engineering team with a culture of autonomy and excellence.
    • Flexible working arrangements with competitive compensation.
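    The event-time concepts the qualifications call out (watermarks, allowed lateness, tumbling windows) can be sketched in plain Python. This is not Flink; the window size, lateness bound, and event stream are all illustrative:

```python
from collections import defaultdict

WINDOW_SIZE = 10      # seconds per tumbling window
ALLOWED_LATENESS = 5  # watermark lags the max observed event time by this much

def window_start(event_time):
    """Assign an event time to the start of its tumbling window."""
    return (event_time // WINDOW_SIZE) * WINDOW_SIZE

def run(events):
    """Consume (event_time, key, value) tuples in arrival order; emit
    (window_start, key, sum) once the watermark passes a window's end."""
    open_windows = defaultdict(float)  # (window_start, key) -> running sum
    watermark = float("-inf")
    results = []
    for event_time, key, value in events:
        # Drop events that arrive behind the watermark (too late).
        if event_time < watermark:
            continue
        open_windows[(window_start(event_time), key)] += value
        # Advance the watermark; it never moves backwards.
        watermark = max(watermark, event_time - ALLOWED_LATENESS)
        # Fire every window whose end the watermark has passed.
        for start, k in sorted(w for w in open_windows if w[0] + WINDOW_SIZE <= watermark):
            results.append((start, k, open_windows.pop((start, k))))
    return results
```

Note that the out-of-order event at time 2 below is dropped once the watermark has advanced past it, which is exactly the trade-off Flink's allowed-lateness setting controls:

```python
run([(1, "a", 1.0), (3, "a", 2.0), (12, "a", 1.0), (2, "a", 1.0), (30, "a", 5.0)])
```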
    $95k-134k yearly est. 3d ago
  • AWS Data Engineer

    Tata Consultancy Services 4.3 company rating

    Data engineer job in Seattle, WA

    Must-Have Technical/Functional Skills: We are seeking an experienced AWS Data Engineer to join our data team and play a crucial role in designing, implementing, and maintaining scalable data infrastructure on Amazon Web Services (AWS). The ideal candidate has a strong background in data engineering, with a focus on cloud-based solutions, and is proficient in leveraging AWS services to build and optimize data pipelines, data lakes, and ETL processes. You will work closely with data scientists, analysts, and stakeholders to ensure data availability, reliability, and security for our data-driven applications.

    Roles & Responsibilities:
    • Design and Development: Design, develop, and implement data pipelines using AWS services such as AWS Glue, Lambda, S3, Kinesis, and Redshift to process large-scale data.
    • ETL Processes: Build and maintain robust ETL processes for efficient data extraction, transformation, and loading, ensuring data quality and integrity across systems.
    • Data Warehousing: Design and manage data warehousing solutions on AWS, particularly with Redshift, for optimized storage, querying, and analysis of structured and semi-structured data.
    • Data Lake Management: Implement and manage scalable data lake solutions using AWS S3, Glue, and related services to support structured, unstructured, and streaming data.
    • Data Security: Implement data security best practices on AWS, including access control, encryption, and compliance with data privacy regulations.
    • Optimization and Monitoring: Optimize data workflows and storage solutions for cost and performance. Set up monitoring, logging, and alerting for data pipelines and infrastructure health.
    • Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data needs and deliver data solutions aligned with business goals.
    • Documentation: Create and maintain documentation for data infrastructure, data pipelines, and ETL processes to support internal knowledge sharing and compliance.

    Base Salary Range: $100,000 - $130,000 per annum

    TCS Employee Benefits Summary: Discretionary annual incentive. Comprehensive medical coverage: medical & health, dental & vision, disability planning & insurance, pet insurance plans. Family support: maternal & parental leaves. Insurance options: auto & home insurance, identity theft protection. Convenience & professional growth: commuter benefits, certification & training reimbursement. Time off: vacation, sick leave & holidays. Legal & financial assistance: legal assistance, 401(k) plan, performance bonus, college fund, student loan refinancing.
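    The Glue/Redshift pipelines described above follow a plain extract-transform-load cycle. A minimal sketch in standard Python, with sqlite3 standing in for Redshift; the table, columns, and data-quality rule are invented for the example, not taken from the posting:

```python
import sqlite3

def extract(rows):
    """Extract: a real pipeline would read from S3/Kinesis; here it's a list."""
    return rows

def transform(rows):
    """Transform: normalize fields and enforce a simple data-quality rule
    (drop records with a missing site or a missing/negative reading)."""
    cleaned = []
    for site, kwh in rows:
        if site and kwh is not None and kwh >= 0:
            cleaned.append((site.strip().lower(), float(kwh)))
    return cleaned

def load(conn, rows):
    """Load: write the cleaned records into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS energy_usage (site TEXT, kwh REAL)")
    conn.executemany("INSERT INTO energy_usage VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for Redshift
raw = [("Seattle ", 120.5), ("", 33.0), ("Bellevue", None), ("Tacoma", 88.0)]
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(kwh) FROM energy_usage").fetchone()
```

The "ensuring data quality and integrity" bullet is the `transform` stage here: bad records are rejected before load rather than discovered downstream.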
    $100k-130k yearly 4d ago
  • GCP Data Engineer

    Insight Global

    Data engineer job in Seattle, WA

    Type: Contract-to-Perm | Duration: 6 Months

    We're seeking two experienced Senior Data Engineers to join our analytics reporting team and help build a robust data infrastructure. This role focuses on developing scalable ETL pipelines, designing data models, and supporting the migration of reporting tools. You'll work closely with analysts and cross-functional teams to transform business requirements into actionable data solutions within a Google Cloud Platform environment.

    Technical Environment:
    • Primary Cloud Platform: Google Cloud Platform (GCP)
    • Data Warehouse & Staging: BigQuery
    • Orchestration Tools: Airflow, Cloud Functions
    • Reporting Tools: Looker (migration from Tableau)

    Day-to-Day Responsibilities:
    • Design, develop, and maintain ETL pipelines in BigQuery
    • Build and optimize data models to support analytics and reporting
    • Collaborate with analysts to translate business requirements into technical solutions
    • Support migration of existing Tableau reports to Looker
    • Develop data transformations and pipelines using Python and SQL
    • Ensure data quality, scalability, and performance across all solutions

    Required Qualifications:
    • 4-5+ years of professional experience in data engineering
    • Strong expertise in Google Cloud Platform (GCP), BigQuery, Python, and SQL (complex queries, stored procedures, joins, pivots)
    • Proven experience in ETL development and dimensional modeling

    Preferred Qualifications:
    • Healthcare industry experience
    • Familiarity with Airflow, Cloud Run, and Looker or Tableau reporting tools

    What Makes This Role Stand Out?
    • Opportunity to work with cutting-edge cloud technologies in a healthcare setting
    • Collaborative environment focused on innovation and data-driven decision-making
    • Contract-to-perm pathway for long-term career growth

    $50.00 - $60.00 per hour. Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
    $50-60 hourly 2d ago
  • ERP/MRP Data Architect

    Proliance Consulting

    Data engineer job in Issaquah, WA

    Our client is staffing a Data Architect to define the data strategy and architecture for a new enterprise, proprietary ERP/MRP. This person will be responsible for cross-platform oversight and governance of the holistic data landscape, which will be developed iteratively through many phases of feature and architectural foundation development.

    Location: Issaquah, WA
    Associate Vendors: We are accepting applications from candidates who are currently authorized to work in the US for any employer without sponsorship.

    Role & Responsibilities:
    • Drives the conceptual and logical data architecture strategy in collaboration with enterprise architecture.
    • Defines and implements data migration strategies and solutions.
    • Advises teams on translating business needs into long-term data architecture solutions.
    • Partners with feature and architectural foundation development teams to translate data architectures into physical models and implementations.
    • Builds, optimizes, and maintains logical (curated) data models.
    • Works with Data Engineering and Data Platform teams to conduct ongoing performance optimizations in the data model.
    • Assists in defining data integration, storage, replication, and transformation protocols for the data layer of technical solutions supported by enterprise architecture.
    • Creates data quality rules and validations to monitor critical business metrics/KPIs.
    • Develops a thorough knowledge and understanding of cross-system data flows as well as an enterprise view of Costco's data landscape.
    • Maintains detailed documentation (e.g., data flow diagrams, source-to-target mappings).

    Required Qualifications:
    • Experience implementing packaged or custom ERP/MRP solutions in the retail industry, focusing on data architecture and migration.
    • 8+ years' experience in data architecture design, implementation, and maintenance of complex datasets.
    • Experience working with data modeling tools (Erwin preferred).
    • Experience architecting and delivering data engineering/integration pipelines.
    • Experience designing and developing performance-optimized data models.
    • Proficient in SQL.

    Desired Qualifications:
    • Exposure to the retail industry.
    • Understanding of CI/CD pipelines, Azure DevOps, and GitHub Actions.
    • Experience with Git for code storage and collaboration.
    • Architectural-level experience in information privacy, data compliance, and risk management.
    • Excellent verbal and written communication skills.
    • Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.

    About Proliance: Proliance Consulting is a Seattle-based staffing firm specializing in IT and digital marketing roles. We connect top-tier talent with leading companies across the United States, offering contract, contract-to-hire, and direct placement opportunities. Our approach emphasizes building authentic relationships, ensuring transparent communication, and delivering tailored solutions that align with both candidate aspirations and client needs. Committed to fostering an inclusive and equitable workplace, we prioritize continuous learning and cultural awareness to support diverse professionals in thriving environments.

    Why Work for Us? At Proliance Consulting, we prioritize the well-being and satisfaction of our consultants, which is reflected in our strong reputation and 5-star reviews on Google. We offer a comprehensive benefits package that includes medical and vision coverage through Regence, with Proliance covering 50% of employee premiums and 25% for dependents. We also provide dental insurance through Delta Dental, a competitive 401(k) with matching options, and direct deposit for convenience. Our team members enjoy paid safe and sick time, paperless pay statements, and the opportunity to earn referral bonuses through our Employee Referral Program. We believe in supporting our people both professionally and personally, because when you thrive, we thrive.
Proliance Consulting provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Proliance Consulting complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfers, leaves of absence, compensation, and training.
    $93k-129k yearly est. 2d ago
  • Business Intelligence Engineer

    Intelliswift-An LTTS Company

    Data engineer job in Seattle, WA

    Pay rate range: $55/hr. to $60/hr. on W2. Onsite role.

    Must Have: Expert Python and SQL; visualization and development.

    Required Skills:
    - 5-7 years of experience working with large-scale complex datasets
    - Strong analytical mindset, with the ability to decompose business requirements into an analytical plan and execute that plan to answer business questions
    - Strong working knowledge of SQL
    - Background (academic or professional) in statistics, programming, and marketing
    - SAS experience a plus
    - Graduate degree in math/statistics, computer science, or a related field, or in marketing, is highly desirable
    - Excellent communication skills, equally adept at working with engineers as well as business leaders

    Daily Schedule:
    - Evaluate the performance of program features and marketing content along measures of customer response, use, conversion, and retention
    - Statistical testing of A/B and multivariate experiments
    - Design, build, and maintain metrics and reports on program health
    - Respond to ad hoc requests from business leaders to investigate critical aspects of customer behavior (e.g., how many customers use a given feature or fit a given profile), deep dive into unusual patterns, and perform exploratory data analysis
    - Employ data mining, model building, segmentation, and other analytical techniques to capture important trends in the customer base
    - Participate in strategic and tactical planning discussions

    About the role: Understanding customer behavior is paramount to our success in providing customers with convenient, fast, free shipping in the US and international markets. As a Senior Business Intelligence Engineer, you will work with our world-class marketing and technology teams to ensure that we continue to delight our customers. You will meet with business owners to formulate key questions, leverage the client's vast data warehouse to extract and analyze relevant data, and present your findings and recommendations to management in a way that is actionable.
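    The A/B statistical testing mentioned above commonly reduces to a two-proportion z-test on conversion counts. A stdlib-only sketch; the function name and the sample counts in the test call are illustrative, not from the posting:

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment: returns the z statistic
    and two-sided p-value for H0: both variants convert at the same rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 156/2400 vs. A's 120/2400.
z, p = ab_test_z(120, 2400, 156, 2400)
```

Multivariate experiments generalize this to chi-squared or regression-based tests, but the single-metric A/B case above is the everyday workhorse.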
    $55-60 hourly 2d ago
  • PAM Platform Engineer

    Tailored Management 4.2 company rating

    Data engineer job in Seattle, WA

    Job Title: Privileged Access Management - BeyondTrust Engineer
    Duration: 6+ month contract (with possible extension)
    Pay Rate: $97.31/hr on W2
    Benefits: Medical, Dental, Vision

    Summary: As a PAM Platform Engineer on the client's Identity & Access Management team, you'll be a key technical specialist responsible for designing, implementing, and maintaining our enterprise-wide Privileged Access Management infrastructure using BeyondTrust. You'll lead the rollout of BeyondTrust and support ongoing management of our privileged access solutions, including password management, endpoint privilege management, and session management capabilities across our retail technology ecosystem. Join our cybersecurity team to drive enterprise-level PAM adoption while maintaining the client's commitment to innovation, security excellence, and work-life balance.

    A day in the life...
    • PAM Platform Leadership: Serve as the primary technical expert for privileged access management solutions, including architecture, deployment, configuration, and optimization of password vaults and endpoint privilege management systems
    • Enterprise PAM Implementation: Design and execute large-scale PAM deployments across Windows, macOS, and Linux environments, ensuring seamless integration with existing infrastructure
    • Policy Development & Management: Create and maintain privilege elevation policies, credential rotation schedules, access request workflows, and governance rules aligned with security and compliance requirements
    • Integration & Automation: Integrate PAM solutions with ITSM platforms, SIEM tools, vulnerability scanners, directory services, and other security infrastructure to create comprehensive privileged access workflows
    • Troubleshooting & Support: Provide expert-level technical support for PAM platform issues, performance optimization, privileged account onboarding, and user access requests
    • Security & Compliance: Ensure PAM implementations meet PCI DSS and other requirements through proper audit trails, session recording and monitoring, and privileged account governance
    • Documentation & Training: Develop technical documentation, procedures, and training materials for internal teams and end users
    • Continuous Improvement: Monitor platform performance, evaluate new features, and implement best practices to enhance security posture and operational efficiency

    You own this if you have...

    Required Qualifications:
    • 4-6+ years of hands-on experience implementing and managing BeyondTrust PAM at the enterprise level; BeyondTrust certifications are preferred
    • Deep expertise in privileged account discovery, credential management, password rotation, session management, and access request workflows using BeyondTrust
    • Strong understanding of Windows Server administration, Active Directory, Group Policy, and PowerShell scripting
    • Experience with Linux/Unix system administration and shell scripting for cross-platform BeyondTrust PAM deployments
    • Knowledge of networking fundamentals, including protocols, ports, certificates, load balancing, and security hardening
    • Experience with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes)
    • Understanding of identity and access protocols (SAML, OIDC, OAuth, SCIM, LDAP) and their integration with PAM solutions

    Preferred Qualifications:
    • Knowledge of DevOps practices, CI/CD pipelines, and Infrastructure as Code (Terraform, Ansible)
    • Familiarity with ITSM integration (ServiceNow, Jira) for ticket-driven privileged access workflows
    • Experience with SIEM integration and security monitoring platforms (Splunk, QRadar, etc.)
    • Understanding of zero trust architecture and least-privilege access principles
    • Experience with secrets management platforms (HashiCorp Vault, AWS Secrets Manager, Azure Key Vault)
    • Previous experience in retail technology environments or large-scale enterprise deployments
    • Industry certifications such as CISSP, CISM, or relevant cloud security certifications

    Technical Skills:
    • PAM Platforms: BeyondTrust
    • Operating Systems: Windows Server (2016/2019/2022), Windows 10/11, macOS, RHEL, Ubuntu, SUSE
    • Databases: SQL Server, MySQL, PostgreSQL, Oracle for PAM backend configuration
    • Virtualization: VMware vSphere, Hyper-V, cloud-based virtual machines
    • Scripting: PowerShell, Bash, Python for automation and integration tasks
    • Security Tools: Integration experience with vulnerability scanners, endpoint detection tools, and identity governance platforms

    Hiring a PAM Platform Engineer (BeyondTrust) to lead enterprise-level privileged access security and drive our next-gen IAM transformation!
    $97.3 hourly 2d ago
  • Software Dev Engineer

    Collabera 4.5 company rating

    Data engineer job in Redmond, WA

    Title: Software Dev Engineer

    Required Skills & Qualifications:
    • 4-10 years of experience in software development
    • Strong proficiency in Python and backend development (APIs, business logic, integrations)
    • Experience with AWS Lambda, DynamoDB, and serverless architecture
    • Hands-on experience with React for frontend development
    • Proficient in scripting (Python, Bash, or similar)
    • Experience working with databases: DynamoDB preferred; SQL-based DBs or MongoDB also accepted
    • Solid understanding of REST APIs, microservices, and cloud-based application design

    Nice-to-Have Skills:
    • Experience with CI/CD pipelines (CodePipeline, GitHub Actions, Jenkins, etc.)
    • Knowledge of infrastructure-as-code tools such as CloudFormation, AWS CDK, or other IaC frameworks
    • Familiarity with containerization (Docker) is a plus

    Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, paid vacation time, paid parental leave, and paid holidays annually (as applicable).
    $103k-141k yearly est. 3d ago
  • SAP GRC Engineer

    Summit Group Solutions, LLC 4.4 company rating

    Data engineer job in Issaquah, WA

    The SAP GRC Engineer supports the company's values and business goals as they relate to legal, ethical, and regulatory obligations; protects privacy; and maintains a secure technology environment. SAP GRC Engineers develop and execute security controls, defenses, and countermeasures to intercept and prevent internal/external attacks, infiltration of company data, and compromise of systems and accounts. They research attempted and successful efforts to compromise systems security; design countermeasures; implement and maintain physical, technical, and administrative security controls; and keep management informed of negative impacts on the business. The SAP GRC Engineer is responsible for the creation and maintenance of General IT control objectives in the area of SAP GRC, and for ensuring that all SAP GRC IT control objectives are in compliance and operating at full efficiency. In addition, this role assists with the daily and monthly reporting of SOD (Segregation of Duties) activities from SAP GRC in support of applicable compliance objectives. This is a cross-functional role, working closely with the SAP Security team and other functional teams to ensure security requirements and solutions meet compliance objectives.

    ROLE: Provides GRC, security, and technical expertise to support the development of GRC objects that satisfy business requirements. Analyzes and administers GRC policies to control physical and virtual system access. Identifies and investigates GRC issues and develops solutions that address compliance requirements that can or do impact GRC and security. Identifies, develops, and implements mechanisms to detect incidents in order to enhance compliance and support the standards and procedures. Assesses business role requirements, reviews authorization roles, and supports authorizations. Demonstrates a comprehensive skill set in testing authorizations across multiple environments and coordinates testing with business/technical users. Validates system configurations to ensure the safety of information systems assets and protects information systems from intentional or inadvertent access or destruction. Implements best practices when applying knowledge of information systems security standards and practices (e.g., access control and system hardening, system audit and log file monitoring, security policies, and incident handling). Identifies GRC gaps that expose Costco to potential exploits and develops short- and long-term prioritized remediation to address those gaps. Determines strategy and protocol for network behavior, analysis techniques, and tool implementation. Creates dashboards, configures alerts, implements and supports security software platforms, and monitors tools/apps. Identifies opportunities for streamlining and increasing effectiveness through continuous process improvement. Implements practices, processes, and procedures consistent with Costco's information security policy and IT standards. Develops and documents GRC events and incident handling procedures into Playbooks. Ensures that incident documentation is comprehensive, accurate, and complete. Triages, prioritizes, investigates, and coordinates security events and incident handling activities. Creates and/or remediates GITC (General IT Controls) in support of meeting audit objectives for all SAP modules and their supporting databases within the company SAP landscape (i.e., Finance, Retail, Warehouse Management, Payroll, HANA, etc.). Designs IT testing procedures to identify and evaluate risk exposures and determine the effectiveness and efficiency of controls. Assists with the creation of effective remediation solutions and/or exception documentation where applicable. Serves as the subject matter expert and point of contact for internal and external auditors. Assists project teams with the creation and implementation of IT control objectives and their integration into SAP GRC. Assists with the successful completion of the quarterly UAR (User Access Review) audit process. Collaborates with Internal Audit in developing, testing, and devising solutions to effectively meet applicable IT control objectives. Takes responsibility for continued personal growth in the areas of technology, business knowledge, Costco policies, and platforms. Participates in team activities and team planning with regard to improving team skills, awareness, and quality of work.

    REQUIRED: Minimum of 12 years of experience with SAP GRC Access Control 10.0 and/or 12.0, with expertise in the following modules: Access Request Management (ARM), Access Risk Analysis (ARA), Emergency Access Management (EAM), User Access Review (UAR), Process Control (PC), SAP ETD. Minimum of 7 years of work experience in IT Risk Management, SOX compliance, and/or auditing with a strong background in IT controls. Minimum of 7 years of experience with SAP Security across various applications, including but not limited to S/4 HANA, ECC, BW, MDG, Fiori, PI/PO, eWM, and Solution Manager. Minimum of 7 years of experience with SOD conflict resolution. Direct hands-on experience in IT audits and functional experience using SAP GRC. Understanding of SAP cloud security. Strong understanding of Sarbanes-Oxley (SOX) and other compliance requirements that may impact controls. Expertise in working with internal and external auditors. Experience developing SAP GRC solutions that address Sarbanes-Oxley requirements. Effective communication and technical leadership; ability to speak both technical and business language interchangeably. Ability to effectively mentor other team members on SAP compliance. Experience in successful project implementation and follow-up; strong time management skills. Strong conceptual, analytical, problem-solving, troubleshooting, and resolution skills. Ability to monitor and manage the progress of tasks and work independently. Ability to design, develop, and maintain SAP user management and security architecture across SAP environments, including hands-on role design and build across a number of complex SAP applications and databases. Scheduling flexibility to meet the needs of the business, including 24x7 on-call rotational support.

    Recommended: Bachelor's degree in Accounting, Business, Information Technology, or Computer Science preferred. Documentation and presentation skills catered to a diverse technical and business audience. Technical knowledge of SAP landscapes and roadmaps. Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.

    Required Documents: Cover Letter, Resume. Pay Range: $150,000 - $180,000 DOE plus Bonus and Restricted Stock Units (RSU). Location: Hybrid, onsite 3 days per week in Issaquah, WA
    $150k-180k yearly 3d ago
  • DevOps Engineer

    OSI Engineering 4.6company rating

    Data engineer job in Seattle, WA

    A globally leading consumer device company based in Seattle, WA is looking for a DevOps Engineer, Cloud Infrastructure to join their dynamic team!

    Job Responsibilities: • Manage Kubernetes clusters by performing improvements and regular maintenance, and perform cloud infrastructure operational tasks. • Perform database administration tasks, including migration, instrumenting of telemetry, performance monitoring, cost monitoring, and consolidation. • Deliver CI/CD-focused projects that include implementing features for GitOps and software lifecycle tooling and processes.

    Required Skills: 5 years of relevant experience. 3-5 years of experience with Kubernetes (configuration, operations, deployment) and related technologies: Helm, ArgoCD, GitOps. 3-5 years of experience with AWS and database administration: Postgres, RDS (e.g., Aurora). 3 years of experience in Python and Bash scripting. Operational experience with cloud-based service infrastructure (DNS, load balancing, ingress, telemetry, and logging).

    Type: Contract Duration: 9 months with extension Work Location: Seattle, WA (Remote) Pay range: $74.00 - $89.00 (DOE)
    $74-89 hourly 5d ago
  • Kettle Engineer

    Soho Square Solutions

    Data engineer job in Seattle, WA

    Job Title: Kettle Engineer Type: Contract We're seeking a Kettle Engineer to design, operate, and continuously improve our Pentaho Data Integration (PDI/Kettle) platform and the data movement processes that underpin business-critical workflows. You will own end-to-end lifecycle management, from environment build and configuration to orchestration, monitoring, and support, partnering closely with application teams, production operations, and data stakeholders. The ideal candidate combines strong hands-on Kettle expertise with solid SQL, automation, and production support practices in a fast-moving, highly collaborative environment.

    Primary Responsibilities: Platform ownership: Install, configure, harden, and upgrade Kettle/PDI components (e.g., Spoon) across dev/prod. Process engineering: Migrate, re-engineer, and optimize existing jobs and transformations. Reliability & support: Document all workflows, including ownership and escalation protocols; conduct knowledge transfer with the Automation and Application Support teams. Observability: Implement proactive monitoring, logging, and alerting for the Kettle platform, including all dependent processes. Collaboration: Partner with application, data, and infrastructure teams to deliver improvements to existing designs.

    Required Qualifications: Kettle/PDI expertise: Experience installing and configuring a Kettle instance (server and client tools, repositories, parameters, security, and upgrades), and experience creating, maintaining, and supporting Kettle processes (jobs/transformations, error handling, recovery, and performance tuning). 4+ years of hands-on SQL (writing, diagnosing, and optimizing queries). Strong communication skills for both technical and non-technical audiences; effective at documenting and sharing knowledge.

    Preferred Qualifications: Experience integrating Kettle with cloud platforms (AWS and/or Azure); familiarity with containers or Windows/Linux server administration. Exposure to monitoring/observability stacks (e.g., DataDog, CloudWatch, or similar). Scripting/automation for operations (Python, PowerShell, or Bash); experience with REST APIs within Kettle. Background in financial services or other regulated, mission-critical environments.

    Key Outcomes (First 90 Days): Stand up or validate a hardened Kettle environment with baseline monitoring and runbooks. Migrate at least two high-value Kettle workflows using shared templates and standardized error handling.
    $87k-126k yearly est. 3d ago
  • C++ Software Engineer w/ radio frequency and signal processing

    Request Technology

    Data engineer job in Everett, WA

    NO SPONSORSHIP Sr. C++ Software Systems Engineer - radio frequency and signal processing SALARY: $165k - $205k plus 20% bonus LOCATION: EVERETT, WA 98204 - Must live within a one-hour drive to come into the office a couple of times a month. Strong radio frequency and signal processing background required. Key skills: C/C++; extensive digital signal processing (DSP) and math background; radio frequency (RF); Windows networking and socket programming; embedded software.

    Through solutions that ensure the efficient use of frequencies, long-distance communications, monitoring and security, and communications intelligence applications, we improve communications and protect military forces and infrastructure around the world. This person will apply their strong radio frequency and signal processing background and software development skills to the signal detection, identification, processing, geolocation, and analysis challenges facing spectrum regulators, intelligence organizations, and defense agencies around the globe. Perform QA testing and analysis of new hardware and software performance up to the system level. Develop automated QA test software and systems.

    Required Experience: US Person or Permanent Resident. Extensive experience in the design, implementation, and testing of complex real-time multithreaded software applications. Extensive C/C++ software development experience (6+ years). Extensive digital signal processing (DSP) and math background. Radio frequency (RF) theory and practice (propagation, antennas, receivers, signals, systems, etc.). RF signals expertise, including signal modulation, demodulation, decoding, and signal analysis techniques and tools. Programming for Windows operating systems. Networking and socket-level programming. Databases and database programming. Ability to quickly learn and support a large existing C++ code base. System QA testing, including developing and executing test plans and writing automated QA test programs. Excellent communication skills. Ability to write technical product documentation.

    Preferred Knowledge, Skills, and Abilities: SIGINT/COMINT/EW experience. RF direction finding and geolocation concepts, including AOA and TDOA. Mapping concepts, standards, and programming. Audio signal processing, including analog and digital demodulation. Drone signals and protocols (uplink and downlink, including video). Experience operating commercial drones. Full Motion Video (FMV) systems, including STANAG 4609, KLV metadata, MPEG-2 Transport Stream, and H.264/265 encoding.

    Programming expertise: Highly proficient in C/C++. Multithreaded real-time processing. Programming with Qt. Programming in Python. Embedded programming. Real-time hardware control and data acquisition. High-performance graphics. GUI design and programming. Networking and socket-level programming. Databases and database programming (incl. SQL). XML and XML programming. JSON and JSON programming. API programming (developing and using). Software licensing. AI concepts and programming.

    Tools: RF measurement equipment (VSA/spectrum analyzers, signal generators, and other electronic test equipment). Windows OS, including desktop, server, and embedded variants. Microsoft Visual Studio and TFS. Qt. Python. Intel IPP. InstallShield. Postgres and Microsoft database packages. Experience with Visual Basic, MFC, C#, WPF/XAML, and other Windows development tools/APIs. Linux OS.

    6+ years of relevant work experience. MSEE (or BSEE with extended relevant work experience) with emphasis on RF communication systems, digital signal processing, and software.
    $165k-205k yearly 2d ago
  • Software Engineer, AOSP IV

    Epitec 4.4company rating

    Data engineer job in Redmond, WA

    Job Title: Software Engineer, AOSP - Reality Labs Research Contract Duration: 12 months Work Arrangement: Onsite Our team aims to define the worldwide standard for extended reality (XR) interaction with unparalleled software that accelerates research and creates novel devices. Responsibilities You'll partner with researchers, firmware engineers and other software engineers to define and develop Android OS implementations for research devices The work will include driver, system services, and integrations with firmware from additional SoCs Develop software stacks to provide access and control of novel sensor streams both on- and off-device Serve as a link between the team and the researchers to help accelerate research progress and to inform our strategic plans to better align with future research needs Minimum Qualifications 5+ years work experience with C/C++/C# 5+ years work experience in AOSP development 5+ years experience with Java (or Kotlin) Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience Preferred Qualifications 2+ years experience developing software for games, autonomous vehicles, robotics or other high performance real-time environments Experience with high-bandwidth communication Experience with camera integration Experience with low level firmware and RTOS
    $124k-169k yearly est. 3d ago
  • Firmware Software Engineer IV

    Pyramid Consulting, Inc. 4.1company rating

    Data engineer job in Redmond, WA

    Immediate need for a talented Firmware Software Engineer IV. This is a 12-month opportunity with long-term potential and is located in Redmond, WA (Onsite). Please review the job description below and contact me ASAP if you are interested. Job Diva ID: 25-94264 Pay Range: $85 - $90/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

    Key Requirements and Technology Experience: Key skills: "MIPI", "Firmware", "C", "Camera". Develop firmware to integrate custom image sensors with an MCU. 8 years' experience in firmware or embedded software development in C/C++. Familiarity with MIPI C-PHY and image sensors. Experience with Zephyr OS, Embedded Linux, or another RTOS.

    Our client is a leading industry player, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration. Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. By applying to our jobs, you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
    $85-90 hourly 1d ago
  • Firmware Software Engineer

    Vaco By Highspring

    Data engineer job in Redmond, WA

    Title: Firmware Software Engineer IV Employment Type: 12-Month Contract Client: Meta (previously Facebook) Pay Rate: $80/hr to $90/hr Our team aims to define the worldwide standard for extended reality (XR) interaction with unparalleled software that accelerates research and creates novel devices.

    Responsibilities: You'll partner with researchers, firmware engineers, and other software engineers to define and develop Android OS implementations for research devices. The work will include drivers, system services, and integrations with firmware from additional SoCs. Develop software stacks to provide access to and control of novel sensor streams both on- and off-device. Serve as a link between the team and the researchers to help accelerate research progress and to inform our strategic plans to better align with future research needs.

    Minimum Qualifications: 3+ years of work experience with C/C++/C#. 3+ years of work experience in AOSP development. 2+ years of experience with Java (or Kotlin). Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent practical experience.

    Preferred Qualifications: 2+ years of experience developing software for games, autonomous vehicles, robotics, or other high-performance real-time environments. Experience with high-bandwidth communication. Experience with camera integration. Experience with low-level firmware and RTOS.
    $80 hourly 1d ago
  • Software Engineer

    Robert Half 4.5company rating

    Data engineer job in Vancouver, WA

    Are you a skilled Software Developer ready to join our dynamic team in Vancouver, Washington? In this role, you will focus on building and enhancing e-commerce experiences for leading fitness brands. You will collaborate across departments to deliver high-quality software solutions that optimize user experience and ensure robust system architecture.

    Job Posting Overview: We are seeking a highly skilled full-stack Shopify Plus Developer to join our team in creating world-class e-commerce solutions for a suite of renowned fitness brands. As part of our expanding IT team, you'll play a pivotal role in theme development, managing third-party integrations, and optimizing store operations. Collaborating with design, marketing, and operations, you will ensure a frictionless user journey while contributing to technical innovations that drive brand growth.

    Responsibilities: • Develop and maintain themes for Shopify Plus platforms, ensuring seamless integration and high performance. • Manage third-party app integrations and optimize store operations through innovative tools and solutions. • Collaborate with design, marketing, and operations teams to create a frictionless e-commerce experience. • Implement and maintain Shopify APIs, including Storefront and Admin, utilizing both REST and GraphQL. • Optimize site performance to improve speed, search engine rankings, and conversion rates. • Apply version control practices using Git and oversee deployment workflows. • Troubleshoot and manage Shopify store configurations, including products, collections, and settings. • Integrate analytics platforms and third-party tools to enhance functionality and data insights. • Stay updated on industry trends, including headless commerce frameworks, to drive innovation in development.

    "Build the future of fitness e-commerce! Are you ready to flex your Shopify Plus development skills in an innovative, fast-paced environment driving the digital transformation of global fitness brands? Join a talented IT team revolutionizing the online experience for industry-leading direct-to-consumer brands, where your expertise in Liquid, JavaScript, and automation will shape seamless, scalable, and high-quality user experiences."

    Why Join Us? Be part of a forward-thinking team transforming the e-commerce space for fitness brands. Enjoy a dynamic and collaborative work environment focused on innovation. Work on meaningful projects that improve user experiences globally.

    Requirements: • Bachelor's degree in computer science, web development, or a related field, or equivalent practical experience. • Minimum of 5 years of experience in Shopify Plus development. • Proficiency in Liquid, JavaScript (ES6+), and Shopify's ecosystem. • Knowledge of Shopify architecture, including APIs, Flow, Webhooks, and theme structures. • Experience with AWS microservices, CI/CD builds, SDLC process management, and Agile. • Experience with Git version control and deployment workflows. • Strong expertise in performance optimization for e-commerce platforms. • Familiarity with third-party app integrations and analytics tools. • Preferred: experience with headless commerce frameworks such as Next.js, Hydrogen, or Remix.
    $121k-167k yearly est. 5d ago
  • BE Software Engineer (Block Storage)

    Bayside Solutions 4.5company rating

    Data engineer job in Seattle, WA

    Backend Software Engineer (Block Storage) W2 Contract Salary Range: $114,400 - $135,200 per year We are looking for collaborative, curious, and pragmatic Software Engineers to be part of this innovative team. You will be able to shape the product's features and architecture as it scales by orders of magnitude. Being part of our Cloud Infrastructure organization opens the door to exerting cross-functional influence and making a more significant organizational impact.

    Requirements and Qualifications: Proficiency with UNIX/Linux. Coding skills in one or more of these programming languages: Rust, C++, Java, or C#. Experience with scripting languages (Bash, Python, Perl). Excellent knowledge of software testing methodologies and practices. 2 years of professional software development experience. Strong ownership and a track record of delivering results. Excellent verbal and written communication skills. Bachelor's Degree in Computer Science, an engineering-related field, or equivalent related experience.

    Preferred Qualifications: Proficiency in Rust. Experience with high-performance asynchronous IO systems programming. Knowledge of distributed systems.

    Bayside Solutions, Inc. is not able to sponsor any candidates at this time. Additionally, candidates for this position must qualify as a W2 candidate. Bayside Solutions, Inc. may collect your personal information during the position application process. Please reference Bayside Solutions, Inc.'s CCPA Privacy Policy at *************************
    $114.4k-135.2k yearly 3d ago
  • Azure and Google Cloud Engineer

    Net2Source (N2S)

    Data engineer job in Issaquah, WA

    Role: Azure and Google Cloud Engineer Duration: 12 Months We are seeking a highly skilled and experienced Azure and Google Cloud Engineer to join our team. The ideal candidate will be responsible for troubleshooting and managing Azure as well as Google Cloud solutions that drive business transformation. This role requires a deep understanding of cloud services, strong architectural principles, and the ability to translate business requirements into scalable, reliable, and secure cloud solutions.

    Key Responsibilities: Technical Leadership: Lead and mentor a team of cloud engineers and developers, providing guidance on best practices and technical issues. Act as a subject matter expert for Azure, staying current with the latest services, tools, and best practices. Develop comprehensive cloud architectures leveraging GCP services such as Compute Engine, Kubernetes Engine, BigQuery, Pub/Sub, Cloud Functions, and others. Design scalable, secure, and cost-effective cloud solutions to meet business and technical requirements. Cloud Strategy and Roadmap: Define a long-term cloud strategy for the organization, including migration plans, optimization, and governance frameworks. Assess and recommend best practices for cloud-native and hybrid cloud solutions. Solution Implementation: Implement CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, Cloud Deployment Manager). Collaboration and Leadership: Work closely with development, operations, and business teams to understand requirements and provide technical guidance. Mentor junior team members and foster a culture of continuous learning and innovation. Performance Optimization: Optimize GCP services for performance, scalability, and cost efficiency. Monitor and resolve issues related to cloud infrastructure, applications, and services. Documentation and Reporting: Create and maintain technical documentation, architectural diagrams, and operational runbooks. Provide regular updates and reports to stakeholders on project progress, risks, and outcomes.

    Required Skills and Qualifications: Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). Experience: 7+ years of experience in cloud architecture and 4+ years specifically with GCP. Proven expertise in designing and implementing large-scale cloud solutions. Experience with application modernization using containers, microservices, and serverless architectures. Technical Skills: Proficiency in GCP services (e.g., BigQuery, Cloud Spanner, Kubernetes Engine, Cloud Run, Dataflow). Strong experience with Infrastructure as Code (e.g., Terraform, Deployment Manager). Knowledge of DevOps practices, CI/CD pipelines, and container orchestration tools. Familiarity with databases (relational and NoSQL) and data pipelines. Certifications: GCP certifications such as Professional Cloud Architect or Professional Data Engineer are highly preferred.

    Azure: Solution Development and Deployment: Oversee the deployment, management, and maintenance of cloud applications. Automate deployment and configuration management processes using Infrastructure as Code (IaC) tools such as ARM templates, Terraform, or Azure Bicep. Develop disaster recovery and business continuity plans for cloud services. Collaboration and Communication: Collaborate with cross-functional teams, including development, operations, and security, to ensure seamless integration and operation of cloud systems. Communicate effectively with stakeholders to understand business requirements and provide cloud solutions that meet their needs. Performance and Optimization: Monitor and optimize the performance of cloud systems to ensure they meet service level agreements (SLAs). Implement cost management strategies to optimize cloud spending. Security and Compliance: Ensure that all cloud solutions comply with security policies and industry regulations. Implement robust security measures, including identity and access management (IAM), network security, and data protection. Continuous Improvement: Drive continuous improvement initiatives to enhance the performance, reliability, and scalability of cloud solutions. Participate in architecture reviews and provide recommendations for improvements.

    Technical Skills: Extensive experience with Google and Microsoft Azure services, including but not limited to Azure Virtual Machines, Azure App Services, Azure Functions, Azure Kubernetes Service (AKS), and Azure SQL Database. Proficiency in Azure networking, storage, and database services. Strong skills in Infrastructure as Code (IaC) tools such as ARM templates, Terraform, or Azure Bicep. Experience with continuous integration/continuous deployment (CI/CD) pipelines using Azure DevOps or similar tools. Deep understanding of cloud security best practices and tools, including Azure Security Center, Azure Key Vault, and Azure Policy. Familiarity with compliance frameworks such as GDPR, HIPAA, and SOC 2. Proficiency in scripting languages like PowerShell, Python, or Bash for automation tasks. Experience with configuration management tools like Ansible or Chef is a plus.

    Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred. Experience: Minimum of 5 years of experience in cloud architecture with a focus on Microsoft Azure. Proven track record of designing and deploying large-scale cloud solutions. Certifications: Relevant Azure certifications such as Microsoft Certified: Azure Solutions Architect Expert, Microsoft Certified: Azure DevOps Engineer Expert, or similar.
    $87k-125k yearly est. 1d ago
  • Senior Software Engineer (Azure Databricks, DLT Pipelines, Terraform Dev, CD/CI, Data Platform) Contract at Bellevue, WA

    Red Oak Technologies 4.0company rating

    Data engineer job in Bellevue, WA

    Senior Software Engineer (Azure Databricks, DLT Pipelines, Coding, CD/CI, Data Platform & Data Integration) Contract at Bellevue, WA

    Must-Have Experience: Hands-on experience with Azure Databricks/DLT Pipelines (Delta Live Tables). Good programming skills in C#, Java, or Python. CI/CD experience. Data platform/data integration experience.

    The Role / Responsibilities: The Senior Software Engineer is a hands-on engineer who works from design through implementation of large-scale, data-centric systems for the MA Platform. This is a thought leadership role in the data domain across all of the client's analytics, with the expectation that the candidate will demonstrate and propagate best practices and processes in software development. The candidate is expected to drive work independently with minimal supervision. • Design, code, test, and develop features to support large-scale data processing pipelines for our multi-cloud SaaS platform with good quality, maintainability, and end-to-end ownership. • Define and leverage data models to understand cost drivers and create concrete action plans that address platform concerns around data.

    Qualifications: • 5+ years of experience building and shipping production-grade software systems or services, with one or more of the following: distributed systems, large-scale data processing, data storage, information retrieval and/or data mining, machine learning fundamentals. • BS/MS in Computer Science or equivalent industry experience. • Experience building and operating online services and fault-tolerant distributed systems at internet scale. • Demonstrable experience shipping software and internet-scale services using GraphQL/REST APIs on Microsoft Azure and/or Amazon Web Services (AWS). • Experience writing code in C++/C#/Java using agile and test-driven development (TDD). • 3+ years in cloud service development on Azure or AWS.

    Preferred Qualifications: • Excellent verbal and written communication skills (to engage with both technical and non-technical stakeholders at all levels). • Familiarity with Extract, Transform, Load (ETL) pipelines, data modeling, and data engineering; past ML experience is a plus. • Experience with Databricks and/or Microsoft Fabric is an added plus. • Hands-on experience using distributed computing platforms like Apache Spark, Apache Flink, Apache Kafka, or Azure EventHub.
    $125k-176k yearly est. 4d ago


Top 10 Data Engineer companies in WA

  1. Amazon

  2. Meta

  3. Microsoft

  4. Ernst & Young

  5. Oracle

  6. Tata Group

  7. CVS Health

  8. SpaceX

  9. Nordstrom

  10. The Walt Disney Company

