
Data Engineer jobs at Rackspace

- 9705 jobs
  • Sr Big Data Engineer - Oozie and Pig (GCP)

    Rackspace (4.8 company rating)

    About the Role

    We are seeking a Senior Big Data Engineer with deep expertise in distributed systems, batch data processing, and large-scale data pipelines. The ideal candidate has strong hands-on experience with Oozie, Pig, and the Apache Hadoop ecosystem, and programming proficiency in Java (preferred) or Python. This role requires a deep understanding of data structures and algorithms, along with a proven track record of writing production-grade code and building robust data workflows. This is a fully remote position and requires an independent, self-driven engineer who thrives in complex technical environments and communicates effectively across teams.

    Work Location: US-Remote, Canada-Remote

    Key Responsibilities:
    • Design and develop scalable batch processing systems using technologies like Hadoop, Oozie, Pig, Hive, MapReduce, and HBase, with hands-on coding in Java or Python (Java is a must).
    • Lead Jira Epics.
    • Write clean, efficient, and production-ready code with a strong focus on data structures and algorithmic problem-solving applied to real-world data engineering tasks.
    • Develop, manage, and optimize complex data workflows within the Apache Hadoop ecosystem, with a strong focus on Oozie orchestration and job scheduling.
    • Leverage Google Cloud Platform (GCP) tools such as Dataproc, GCS, and Composer to build scalable and cloud-native big data solutions.
    • Implement DevOps and automation best practices, including CI/CD pipelines, infrastructure as code (IaC), and performance tuning across distributed systems.
    • Collaborate with cross-functional teams to ensure data pipeline reliability, code quality, and operational excellence in a remote-first environment.

    Qualifications:
    • Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
    • Experience with managed cloud services and an understanding of cloud-based batch processing systems are critical.
    Must-Have Skills:
    • Ability to lead Jira Epics.
    • Proficiency in Oozie, Airflow, MapReduce, and Java.
    • Strong programming skills in Java (specifically Spark), Python, Pig, and SQL.
    • Expertise in public cloud services, particularly GCP.
    • Proficiency in the Apache Hadoop ecosystem: Oozie, Pig, Hive, MapReduce.
    • Familiarity with BigTable and Redis.
    • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
    • Proven experience in engineering batch processing systems at scale.
    • 5+ years of experience in customer-facing software/technology or consulting.
    • 5+ years of experience with "on-premises to cloud" migrations or IT transformations.
    • 5+ years of experience building and operating solutions built on GCP.
    • Proficiency in Java or Python.

    The following information is required by pay transparency legislation in the following states: CA, CO, HI, NY, and WA. This information applies only to individuals working in these states.
    · The anticipated starting pay range for Colorado is $116,100 - $170,280.
    · The anticipated starting pay range for Hawaii and New York (not including NYC) is $123,600 - $181,280.
    · The anticipated starting pay range for California, New York City, and Washington is $135,300 - $198,440.

    Unless already included in the posted pay range and based on eligibility, the role may include variable compensation in the form of bonus, commissions, or other discretionary payments. These discretionary payments are based on company and/or individual performance and may change at any time. Actual compensation is influenced by a wide array of factors including but not limited to skill set, level of experience, licenses and certifications, and specific work location.
    Information on benefits offered is here.

    About Rackspace Technology

    We are the multicloud solutions experts. We combine our expertise with the world's leading technologies - across applications, data and security - to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

    More on Rackspace Technology

    Though we're all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know. #LI-VM1
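    The Oozie-centric work described in this posting boils down to running batch actions in dependency order. As a minimal, library-free sketch of that orchestration idea (the action names are hypothetical, and real Oozie workflows are defined in XML and executed on a Hadoop cluster, not in Python):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each action maps to the set of upstream actions it
# depends on, mirroring how an Oozie workflow chains Pig/Hive/MapReduce steps.
workflow = {
    "ingest": set(),
    "pig_transform": {"ingest"},
    "hive_load": {"pig_transform"},
    "export": {"hive_load"},
}

def run_order(dag):
    """Return an order in which actions can run, honoring dependencies."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(workflow))  # ingest runs first, export last
```

    An orchestrator like Oozie or Composer layers scheduling, retries, and cluster submission on top of exactly this dependency-ordering core.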
    $135.3k-198.4k yearly 28d ago
  • Software Engineer Staff - Electronic Warfare

    Lockheed Martin (4.8 company rating)

    Owego, NY

    What We're Doing

    At Lockheed Martin Rotary and Mission Systems (RMS), we're leading the development of cutting-edge Airborne Electronic Warfare (EW) solutions. With over 45 years of experience and more than 3,000 systems fielded, we utilize the electromagnetic spectrum to sense, protect, and communicate, while disrupting adversary capabilities. Our advanced EW systems, including electronic support, electronic attack, and intelligence, serve as the "ears" of tactical aircraft, detecting and locating threats. We continue to innovate and advance EW technologies for various aircraft applications, ensuring our warfighters stay ahead of emerging threats. Electronic Warfare | Lockheed Martin. Lockheed Martin Converged Cyber & Electronic Warfare.

    The Work

    We are seeking an experienced Electronic Warfare (EW) Software Engineer to join our team. In this exciting role, you will have the opportunity to collaborate with industry-leading design professionals and contribute to groundbreaking projects. As an experienced engineering professional, your responsibilities will include:
    • Collaborate with other engineering disciplines and program personnel to develop and support Electronic Warfare systems.
    • Ensure program-specific software standards are met.
    • Engage program management, customers, and suppliers as needed.
    • Develop software for Electronic Attack/Electronic Warfare applications.

    Who We Are

    Lockheed Martin is a global leader in aerospace, defense, and technology solutions. Our Owego campus is a thriving center of engineering expertise, fostering a culture that encourages creativity, excellence, and the creation of exceptional products. The Owego area offers a unique blend of natural beauty, cultural attractions, and a low cost of living, making it an ideal place to live. Finger Lakes. Village of Owego.
    Who You Are

    To excel in this role, you should possess the following qualifications and attributes:
    • Strong leadership skills, with the ability to lead and inspire teams of engineers.
    • An innovative mindset, capable of finding solutions to complex engineering challenges.
    • A commitment to excellence, attention to detail, and a dedication to delivering high-quality results.

    A Software Engineer Staff position is typically a middle to senior career professional role.

    Why Join Us: Your Health, Your Wealth, Your Life

    Joining Lockheed Martin means becoming part of a team that makes a significant impact in the field of engineering. When you choose to work with us, you'll enjoy:
    • An excellent working environment equipped with state-of-the-art design tools.
    • The opportunity to work alongside industry leaders and top-notch design professionals.
    • A chance to be part of solving some of the world's most challenging engineering problems.
    • A culture that encourages creativity, excellence, and the development of remarkable products.
    • A 4/10 flex schedule, with every Friday off, which provides a great balance between work and personal life.

    If you're ready to take your engineering career to the next level, work on groundbreaking projects, and be part of a team that thrives on innovation, we encourage you to apply and join our mission. Our flexible schedules, competitive pay, and comprehensive benefits enable you to live a healthy, fulfilling life both at and outside of work.

    Basic Qualifications:
    • Bachelor's degree or higher in Computer Engineering, Computer Science, or a related discipline.
    • Electronic Warfare, Communication (such as satellite communications systems, radio frequency systems, and other related technologies), and/or RADAR subsystem/system software development experience.
    • Extensive experience using Java, C++, C#, Python, or similar programming languages.
    • Experience with the Agile methodology.
    • Experience with debugging complex issues, as well as unit testing and integration testing.
    • Ability to obtain a US government security clearance.

    Desired Skills:
    • Experience with Electronic Support (ES) and/or Electronic Attack (EA) software development.
    • Knowledge of Electronic Warfare techniques and algorithms.
    • Experience with modern DevOps tools and principles.
    • Experience leading a team of engineers.
    • Proven experience in managing program financials.
    • Working knowledge of Git and GitLab.
    • Experience with the Ada programming language.
    • Skilled in developing and writing contract proposals.
    • Secret investigation within 5 years or Top Secret clearance.

    Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration. Clearance Level: Secret.

    Other Important Information You Should Know

    Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.

    Ability to Work Remotely: Onsite Full-time. The work associated with this position will be performed onsite at a designated Lockheed Martin facility.

    Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five-day work week, while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time Off benefits.
    Schedule for this Position: 4x10 hour day, 3 days off per week.

    Pay Rate: The annual base salary range for this position in California, Massachusetts, and New York (excluding most major metropolitan areas), Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, Vermont, Washington, or Washington DC is $113,900 - $200,905. For states not referenced above, the salary range for this position will reflect the candidate's final work location. Please note that the salary information is a general guideline only. Lockheed Martin considers factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience, education/training, key skills, as well as market and business considerations when extending an offer.

    Benefits offered: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k) match, Flexible Spending Accounts, EAP, Education Assistance, Parental Leave, Paid Time Off, and Holidays.

    (Washington state applicants only) Non-represented full-time employees accrue at least 10 hours per month of Paid Time Off (PTO) to be used for incidental absences and other reasons, and receive at least 90 hours for holidays. Represented full-time employees accrue 6.67 hours of Vacation per month, accrue up to 52 hours of sick leave annually, and receive at least 96 hours for holidays. PTO, Vacation, sick leave, and holiday hours are prorated based on start date during the calendar year. This position is incentive plan eligible.

    Pay Rate (most major metropolitan areas): The annual base salary range for this position in most major metropolitan areas in California, Massachusetts, and New York is $131,000 - $227,125. For states not referenced above, the salary range for this position will reflect the candidate's final work location. Please note that the salary information is a general guideline only.
    Lockheed Martin considers factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience, education/training, key skills, as well as market and business considerations when extending an offer. Benefits offered: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k) match, Flexible Spending Accounts, EAP, Education Assistance, Parental Leave, Paid Time Off, and Holidays. This position is incentive plan eligible.

    Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics. The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.

    At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work. With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility. If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs and apply for roles that align with your qualifications.

    Experience Level: Experienced Professional. Business Unit: RMS. Relocation Available: Possible. Career Area: Software Engineering. Type: Full-Time. Shift: First.
    $131k-227.1k yearly 3d ago
  • Software Engineer Senior

    Lockheed Martin (4.8 company rating)

    Orlando, FL

    What We're Doing

    Rotary and Mission Systems' Training, Logistics and Simulation (TLS) business is Lockheed Martin's center of excellence for training and logistics products and services, serving the U.S. military and more than 65 international customers around the world. Based in Orlando, TLS develops programs that teach service men and women skills to accomplish their most challenging missions - flying the world's most advanced fighter aircraft, navigating ships and driving armored vehicles. TLS is the corporation's hub for simulation, X reality, live-virtual-constructive capabilities, advanced training devices and full-service training programs. TLS also provides sustainment services such as supply chain and logistics IT solutions, spares and repairs, as well as automated test and support equipment.

    The Work

    This is a position for a Software Engineering Senior on our F-35 Pilot Training Devices (PTD) Team. As a key member of our Software Engineering team, you will:
    • Design, develop, and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, and Azure DevOps.
    • Containerize applications using Docker and Podman.
    • Develop and maintain scripts using languages such as Python, Bash, and PowerShell.
    • Collaborate with development teams to ensure smooth integration of code changes into the CI/CD pipeline.
    • Troubleshoot and resolve issues with the CI/CD pipeline, including debugging and optimizing pipeline performance.
    • Ensure compliance with security and regulatory requirements, including implementing security scanning and vulnerability management tools.
    • Develop and maintain documentation for CI/CD pipelines, including pipeline architecture, configuration, and troubleshooting guides.

    This position will require the selected candidate to have or obtain an Interim Secret level U.S. government security clearance before starting with Lockheed Martin. U.S. citizenship is a requirement for consideration.
    Why Join Us

    Join us if you are passionate about saving lives through mission readiness. Be a part of a team that values speed, agility, affordability, and disruptive technologies. If you are excited about transforming sustainment and training solutions and working with a talented team to reimagine the future, we invite you to contribute your skills and technical expertise to our mission.

    Basic Qualifications:
    • Bachelor's degree.
    • 5 or more years of experience in software development, with a focus on full-stack web development and DevOps.
    • Experience developing, debugging, and maintaining GitLab CI/CD pipelines.
    • Experience with containerization and tools such as Docker or Podman.
    • Experience with scripting in languages such as Bash, PowerShell, and Python.
    • Experience with Infrastructure as Code (IaC) and writing Ansible playbooks.
    • Experience with container orchestration via Kubernetes or OpenShift.
    • Strong experience with object-oriented programming languages (such as C++, C#, Python, Ruby, Objective-C).

    Desired Skills:
    • Master's degree.
    • Advanced expertise in GitLab CI/CD, including advanced pipeline configuration, job artifacts, and dependency management.
    • Advanced expertise with GitLab Runner, including installation, configuration, and management of runners.
    • Advanced expertise with Python and Bash scripting.

    Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration. Clearance Level: Secret with Investigation or CV date within 5 years.

    Other Important Information You Should Know

    Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.
    Ability to Work Remotely: Part-time Remote Telework. The employee selected for this position will work part of their work schedule remotely and part of their work schedule at a designated Lockheed Martin facility. The specific weekly schedule will be discussed during the hiring process.

    Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five-day work week, while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time Off benefits.

    Schedule for this Position: 4x10 hour day, 3 days off per week.

    Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics. The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.

    At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work. With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility. If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs and apply for roles that align with your qualifications.
    Experience Level: Experienced Professional. Business Unit: RMS. Relocation Available: Possible. Career Area: Software Engineering. Type: Full-Time. Shift: First.
    $81k-102k yearly est. 3d ago
  • Snowflake Data Engineer (DBT SQL)

    Maganti It Resources, LLC (3.9 company rating)

    San Jose, CA

    Job Description - Snowflake Data Engineer (DBT SQL)
    Duration: 6 months

    Key Responsibilities
    • Design, develop, and optimize data pipelines using Snowflake and DBT SQL.
    • Implement and manage data warehousing concepts, metadata management, and data modeling.
    • Work with data lakes, multi-dimensional models, and data dictionaries.
    • Utilize Snowflake features such as Time Travel and Zero-Copy Cloning.
    • Perform query performance tuning and cost optimization in cloud environments.
    • Administer Snowflake architecture, warehousing, and processing.
    • Develop and maintain PL/SQL Snowflake solutions.
    • Apply design patterns for scalable and maintainable data solutions.
    • Collaborate with cross-functional teams and tech leads across multiple tracks.
    • Provide technical and functional guidance to team members.

    Required Skills & Experience
    • Hands-on Snowflake development experience (mandatory).
    • Strong proficiency in SQL and DBT SQL.
    • Knowledge of data warehousing concepts, metadata management, and data modeling.
    • Experience with data lakes, multi-dimensional models, and data dictionaries.
    • Expertise in Snowflake features (Time Travel, Zero-Copy Cloning).
    • Strong background in query optimization and cost management.
    • Familiarity with Snowflake administration and pipeline development.
    • Knowledge of PL/SQL and SQL databases (an additional plus).
    • Excellent communication, leadership, and organizational skills.
    • Strong team player with a positive attitude.
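    Since the role centers on DBT-style SQL development against Snowflake, here is a toy sketch of the core idea behind dbt model compilation: templated `ref()` calls are resolved into fully qualified table names before the SQL runs. The database/schema names and the regex-based resolver below are illustrative assumptions, not dbt's actual implementation:

```python
import re

# Resolve {{ ref('model_name') }} placeholders into fully qualified
# Snowflake table names (database and schema here are hypothetical).
def compile_model(sql: str, database: str = "ANALYTICS", schema: str = "MARTS") -> str:
    return re.sub(
        r"\{\{\s*ref\('(\w+)'\)\s*\}\}",
        lambda m: f"{database}.{schema}.{m.group(1).upper()}",
        sql,
    )

model = "select order_id, sum(amount) from {{ ref('stg_orders') }} group by order_id"
print(compile_model(model))
```

    Resolving references at compile time is what lets dbt build the model dependency graph and materialize models in the right order.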
    $119k-167k yearly est. 1d ago
  • Optical Sensing, Hardware Data Analysis Engineer for a Global Consumer Device Company

    OSI Engineering (4.6 company rating)

    Cupertino, CA

    Our optical sensing team develops optical sensors for next-generation products. The team is seeking a self-driven go-getter with strong Python skills and strong experience in optical instruments, data analysis, and data visualization.

    Responsibilities:
    • Manage and report the engineering build process using Python and JMP to analyze large sets of data and track key figures of merit.
    • Validate the ambient light sensors' color and lux sensing performance using Python and spectrometers.
    • Assist with miscellaneous lab work to conduct failure analysis or research, such as display light leakage, cover glass properties, and effects from the thermal environment.
    • Support the creation of a performance simulation model using MATLAB.
    • Lead end-to-end lab validation to support new optical sensor development.
    • Develop and implement validation plans for hardware/software designs.
    • Benchmark optical sensor performance from early prototype to product launch.
    • Provide guidance and recommendations on production line testing requirements.
    • Analyze data to draw conclusions and provide feedback to product design.
    • Convert data to visual plots and/or charts.
    • Collaborate with cross-functional teams including Optical Engineering, Mechanical Engineering, Electrical Engineering, and Process Engineering to deliver state-of-the-art sensing solutions.
    • Deliver presentations of results in regular reviews with cross-functional teams.

    Requirements:
    • Degree in Optics, Physics, Electrical Engineering, or equivalent: B.S./M.S. with industry experience, or Ph.D.
    • Strong background in optical measurements and data analysis.
    • Experience using Python or other coding languages for lab equipment control, data acquisition, and instrument automation; able to write, revise, customize, and automate scripts.
    • Hands-on experience with optical lab equipment (light sources, spectrometers, detectors, oscilloscopes, free-space optics on an optical bench, etc.).
    • Excellent written and verbal communication skills.
    • Solid teamwork and self-motivation for technical challenges.

    Preferred Skillset: Both hardware and software background.

    Type: Contract (12+ months)
    Location: Cupertino, CA (100% onsite)
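    As a rough illustration of the Python-driven sensor validation described in this posting, the sketch below compares hypothetical ambient-light-sensor readings against reference lux levels and reports simple figures of merit. All values are invented for illustration; a real validation run would pull readings from a spectrometer over an instrument-control interface:

```python
from statistics import mean, stdev

# Hypothetical validation data: sensor lux readings vs. reference lux levels.
reference = [100.0, 250.0, 500.0, 1000.0]   # reference lux from spectrometer
measured  = [ 98.5, 247.0, 510.0, 1020.0]   # ambient light sensor readings

# Percent error per level is a typical figure of merit to track across builds.
errors_pct = [100.0 * (m - r) / r for m, r in zip(measured, reference)]
print(f"mean error: {mean(errors_pct):+.2f}%  spread: {stdev(errors_pct):.2f}%")
```

    In practice these per-level errors would be tracked per unit and per build stage, then plotted to spot drift between prototype and production.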
    $123k-175k yearly est. 5d ago
  • Data Engineer

    DL Software Inc. (3.3 company rating)

    New York, NY

    DL Software produces Godel, a financial information and trading terminal.

    Role Description
    This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.

    Qualifications
    • Strong proficiency in Data Engineering and Data Modeling.
    • Mandatory: strong experience in global financial instruments, including equities, fixed income, options, and exotic asset classes.
    • Strong Python background.
    • Expertise in Extract, Transform, Load (ETL) processes and tools.
    • Experience in designing, managing, and optimizing Data Warehousing solutions.
    $91k-123k yearly est. 1d ago
  • Lead Data Engineer

    APN Consulting, Inc. (4.5 company rating)

    New York, NY

    Job title: Lead Software Engineer
    Duration: Full-time/Contract to Hire

    Role Description
    The successful candidate will be a key member of the HR Technology team, responsible for developing and maintaining global HR applications with a primary focus on the HR Analytics ecosystem. This role combines technical expertise with HR domain knowledge to deliver robust data solutions that enable advanced analytics and data science initiatives.

    Key Responsibilities:
    • Manage and support HR business applications, including problem resolution and issue ownership.
    • Design and develop the ETL/ELT layer for HR data integration and ensure data quality and consistency.
    • Provide architecture solutions for Data Modeling, Data Warehousing, and Data Governance.
    • Develop and maintain data ingestion processes using Informatica, Python, and related technologies.
    • Support data analytics and data science initiatives with optimized data structures and AI/ML tools.
    • Manage vendor products and their integrations with internal/external applications.
    • Gather requirements and translate functional needs into technical specifications.
    • Perform QA testing and impact analysis across the BI ecosystem.
    • Maintain system documentation and knowledge repositories.
    • Provide technical guidance and manage stakeholder communications.

    Required Skills & Experience:
    • Bachelor's degree in computer science or engineering with 4+ years of delivery and maintenance work experience in the Data and Analytics space.
    • Strong hands-on experience with data management, data warehouse/data lake design, data modeling, ETL tools, advanced SQL, and Python programming.
    • Exposure to AI & ML technologies and experience tuning models and building LLM integrations.
    • Experience conducting Exploratory Data Analysis (EDA) to identify trends and patterns and report key metrics.
    • Extensive database development experience in MS SQL Server/Oracle and SQL scripting.
    • Demonstrable working knowledge of CI/CD pipeline tools, primarily GitLab and Jenkins.
    • Proficiency in using collaboration tools like Confluence, SharePoint, and JIRA.
    • Analytical skills to model business functions, processes, and dataflow within or between systems.
    • Strong problem-solving skills to debug complex, time-critical production incidents.
    • Good interpersonal skills to engage with senior stakeholders in functional business units and IT teams.
    • Experience with Cloud Data Lake technologies such as Snowflake and knowledge of the HR data model would be a plus.
    $93k-133k yearly est. 3d ago
  • Data Engineer

    Company (3.0 company rating)

    Fort Lee, NJ

    The Senior Data Analyst will be responsible for developing MS SQL queries and procedures, building custom reports, and modifying ERP user forms to support and enhance organizational productivity. This role will also design and maintain databases, ensuring high levels of stability, reliability, and performance.

    Responsibilities
    • Analyze, structure, and interpret raw data.
    • Build and maintain datasets for business use.
    • Design and optimize database tables, schemas, and data structures.
    • Enhance data accuracy, consistency, and overall efficiency.
    • Develop views, functions, and stored procedures.
    • Write efficient SQL queries to support application integration.
    • Create database triggers to support automation processes.
    • Oversee data quality, integrity, and database security.
    • Translate complex data into clear, actionable insights.
    • Collaborate with cross-functional teams on multiple projects.
    • Present data through graphs, infographics, dashboards, and other visualization methods.
    • Define and track KPIs to measure the impact of business decisions.
    • Prepare reports and presentations for management based on analytical findings.
    • Conduct daily system maintenance and troubleshoot issues across all platforms.
    • Perform additional ad hoc analysis and tasks as needed.

    Qualifications
    • Bachelor's Degree in Information Technology or a relevant field.
    • 4+ years of experience as a Data Analyst or Data Engineer, including database design experience.
    • Strong ability to extract, manipulate, analyze, and report on data, as well as develop clear and effective presentations.
    • Proficiency in writing complex SQL queries, including table joins, data aggregation (SUM, AVG, COUNT), and creating, retrieving, and updating views.
    • Excellent written, verbal, and interpersonal communication skills.
    • Ability to manage multiple tasks in a fast-paced and evolving environment.
    • Strong work ethic, professionalism, and integrity.
    • Advanced proficiency in Microsoft Office applications.
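    The SQL patterns this posting names (joins, SUM/AVG/COUNT aggregation, views) can be shown in a self-contained sketch using an in-memory SQLite database. Table names and data are hypothetical, and production work would target MS SQL Server rather than SQLite:

```python
import sqlite3

# In-memory database demonstrating a join, aggregates, and a view.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 50.0);
    -- A view packaging the join + aggregation for reuse by reports.
    CREATE VIEW customer_totals AS
        SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.amount) AS total
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name;
""")
for row in con.execute("SELECT * FROM customer_totals ORDER BY total DESC"):
    print(row)
```

    The view keeps the join and aggregation logic in one place, so report queries stay simple and stay consistent with each other.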
    $93k-132k yearly est. 3d ago
  • Azure Data Engineer

    Programmers.Io (3.8 company rating)

    Weehawken, NJ

    Requirements:
    • Expert-level skills writing and optimizing complex SQL.
    • Experience with complex data modelling, ETL design, and using large databases in a business environment.
    • Experience with building data pipelines and applications to stream and process datasets at low latencies.
    • Fluent with Big Data technologies like Spark, Kafka, and Hive.
    • Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
    • Designing and building data pipelines using API ingestion and streaming ingestion methods.
    • Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
    • Experience in developing NoSQL solutions using Azure Cosmos DB is essential.
    • Thorough understanding of Azure and AWS cloud infrastructure offerings.
    • Working knowledge of Python is desirable.

    Responsibilities:
    • Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services.
    • Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB.
    • Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance.
    • Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information.
    • Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks.
    • Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making.
    • Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards.
    • Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging.

    Best Regards,
    Dipendra Gupta, Technical Recruiter
    $92k-132k yearly est. 5d ago
  • Sr Data Platform Engineer

    The Judge Group 4.7company rating

    Elk Grove, CA jobs

    Hybrid role, 3x a week in office in Elk Grove, CA; no remote capabilities. This is a direct hire opportunity.
    We're seeking a seasoned Senior Data Platform Engineer to design, build, and optimize scalable data solutions that power analytics, reporting, and AI/ML initiatives. This full‑time role is hands‑on, working with architects, analysts, and business stakeholders to ensure data systems are reliable, secure, and high‑performing.
    Responsibilities:
    · Build and maintain robust data pipelines (structured, semi‑structured, unstructured).
    · Implement ETL workflows with Spark, Delta Lake, and cloud‑native tools.
    · Support big data platforms (Databricks, Snowflake, GCP) in production.
    · Troubleshoot and optimize SQL queries, Spark jobs, and workloads.
    · Ensure governance, security, and compliance across data systems.
    · Integrate workflows into CI/CD pipelines with Git, Jenkins, Terraform.
    · Collaborate cross‑functionally to translate business needs into technical solutions.
    Qualifications:
    · 7+ years in data engineering with production pipeline experience.
    · Expertise in the Spark ecosystem, Databricks, Snowflake, GCP.
    · Strong skills in PySpark, Python, SQL.
    · Experience with RAG systems, semantic search, and LLM integration.
    · Familiarity with Kafka, Pub/Sub, vector databases.
    · Proven ability to optimize ETL jobs and troubleshoot production issues.
    · Agile team experience and excellent communication skills.
    · Certifications in Databricks, Snowflake, GCP, or Azure.
    · Exposure to Airflow and BI tools (Power BI, Looker Studio).
    $108k-153k yearly est. 3d ago
  • Data Engineer

    Gotham Technology Group 4.5company rating

    New York, NY jobs

    Our client is seeking a Data Engineer with hands-on experience in Web Scraping technologies to help build and scale a new scraping capability within their Data Engineering team. This role will work directly with Technology, Operations, and Compliance to source, structure, and deliver alternative data from websites, APIs, files, and internal systems. This is a unique opportunity to shape a new service offering and grow into a senior engineering role as the platform evolves.
    Responsibilities:
    · Develop scalable Web Scraping solutions using AI-assisted tools, Python frameworks, and modern scraping libraries.
    · Manage the full lifecycle of scraping requests, including intake, feasibility assessment, site access evaluation, extraction approach, data storage, validation, entitlement, and ongoing monitoring.
    · Coordinate with Compliance to review Terms of Use, secure approvals, and ensure all scrapes adhere to regulatory and internal policy guidelines.
    · Build and support AWS-based data pipelines using tools such as Cron, Glue, EventBridge, Lambda, Python ETL, and Redshift.
    · Normalize and standardize raw, vendor, and internal datasets for consistent consumption across the firm.
    · Implement data quality checks and monitoring to ensure the reliability, historical continuity, and operational stability of scraped datasets.
    · Provide operational support, troubleshoot issues, respond to inquiries about scrape behavior or data anomalies, and maintain strong communication with users.
    · Promote data engineering best practices, including automation, documentation, repeatable workflows, and scalable design patterns.
    Required Qualifications:
    · Bachelor's degree in Computer Science, Engineering, Mathematics, or related field.
    · 2-5 years of experience in a similar Data Engineering or Web Scraping role.
    · Capital markets knowledge with familiarity across asset classes and experience supporting trading systems.
    · Strong hands-on experience with AWS services (S3, Lambda, EventBridge, Cron, Glue, Redshift).
    · Proficiency with modern Web Scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright).
    · Strong Python programming skills and experience with SQL and NoSQL databases.
    · Familiarity with market data and time series datasets (Bloomberg, Refinitiv) is a plus.
    · Experience with DevOps/IaC tooling such as Terraform or CloudFormation is desirable.
    $86k-120k yearly est. 1d ago
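Several of the scraping roles above lean on HTML parsing libraries such as BeautifulSoup or Scrapy. As a dependency-free illustration of the core extraction step those frameworks perform, here is a minimal sketch using only the standard library (the page content and link targets are invented):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags as a page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page snippet; in practice this would come from an HTTP fetch.
page = """
<html><body>
  <a href="/filings/2024-q1.csv">Q1 filings</a>
  <a href="/filings/2024-q2.csv">Q2 filings</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['/filings/2024-q1.csv', '/filings/2024-q2.csv']
```

Production frameworks add retries, throttling, and site-access controls on top of this parsing step, which is where the compliance review these postings describe comes in.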
  • Cloud Data Engineer

    Gotham Technology Group 4.5company rating

    New York, NY jobs

    Title: Enterprise Data Management - Data Cloud, Senior Developer I
    Duration: FTE/Permanent
    Salary: 130-165k
    The Data Engineering team oversees the organization's central data infrastructure, which powers enterprise-wide data products and advanced analytics capabilities in the investment management sector. We are seeking a senior cloud data engineer to spearhead the architecture, development, and rollout of scalable, reusable data pipelines and products, emphasizing the creation of semantic data layers to support business users and AI-enhanced analytics. The ideal candidate will work hand-in-hand with business and technical groups to convert intricate data needs into efficient, cloud-native solutions using cutting-edge data engineering techniques and automation tools.
    Responsibilities:
    · Collaborate with business and technical stakeholders to collect requirements, pinpoint data challenges, and develop reliable data pipeline and product architectures.
    · Design, build, and manage scalable data pipelines and semantic layers using platforms like Snowflake, dbt, and similar cloud tools, prioritizing modularity for broad analytics and AI applications.
    · Create semantic layers that facilitate self-service analytics, sophisticated reporting, and integration with AI-based data analysis tools.
    · Build and refine ETL/ELT processes with contemporary data technologies (e.g., dbt, Python, Snowflake) to achieve top-tier reliability, scalability, and efficiency.
    · Incorporate and automate AI analytics features atop semantic layers and data products to enable novel insights and process automation.
    · Refine data models (including relational, dimensional, and semantic types) to bolster complex analytics and AI applications.
    · Advance the data platform's architecture, incorporating data mesh concepts and automated centralized data access.
    · Champion data engineering standards, best practices, and governance across the enterprise.
    · Establish CI/CD workflows and protocols for data assets to enable seamless deployment, monitoring, and versioning.
    · Partner across Data Governance, Platform Engineering, and AI groups to produce transformative data solutions.
    Qualifications:
    · Bachelor's or Master's in Computer Science, Information Systems, Engineering, or equivalent.
    · 10+ years in data engineering, cloud platform development, or analytics engineering.
    · Extensive hands-on work designing and tuning data pipelines, semantic layers, and cloud-native data solutions, ideally with tools like Snowflake, dbt, or comparable technologies.
    · Expert-level SQL and Python skills, plus deep familiarity with data tools such as Spark, Airflow, and cloud services (e.g., Snowflake, major hyperscalers).
    · Preferred: Experience containerizing data workloads with Docker and Kubernetes.
    · Track record architecting semantic layers, ETL/ELT flows, and cloud integrations for AI/analytics scenarios.
    · Knowledge of semantic modeling, data structures (relational/dimensional/semantic), and enabling AI via data products.
    · Bonus: Background in data mesh designs and automated data access systems.
    · Skilled in dev tools like Azure DevOps equivalents, Git-based version control, and orchestration platforms like Airflow.
    · Strong organizational skills, precision, and adaptability in fast-paced settings with tight deadlines.
    · Proven self-starter who thrives independently and collaboratively, with a commitment to ongoing tech upskilling.
    · Bonus: Exposure to BI tools (e.g., Tableau, Power BI), though not central to the role.
    · Familiarity with investment operations systems (e.g., order management or portfolio accounting platforms).
    $86k-120k yearly est. 1d ago
  • Data Engineer (Web Scraping technologies)

    Gotham Technology Group 4.5company rating

    New York, NY jobs

    Title: Data Engineer (Web Scraping technologies)
    Duration: FTE/Perm
    Salary: 125-190k plus bonus
    Responsibilities:
    · Utilize AI models, code, libraries, or applications to enable a scalable Web Scraping capability.
    · Manage Web Scraping requests, including intake, assessment, accessing sites to scrape, utilizing tools to scrape, storage of scrapes, validation, and entitlement to users.
    · Field questions from users about the scrapes and websites.
    · Coordinate with Compliance on approvals and TOU reviews.
    · Build data pipelines on the AWS platform utilizing existing tools like Cron, Glue, EventBridge, Python-based ETL, and AWS Redshift.
    · Normalize/standardize vendor data and firm data for firm consumption.
    · Implement data quality checks to ensure reliability and accuracy of scraped data.
    · Coordinate with internal teams on delivery, access, requests, and support.
    · Promote Data Engineering best practices.
    Required Skills and Qualifications:
    · Bachelor's degree in Computer Science, Engineering, Mathematics, or related field.
    · 2-5 years of experience in a similar role.
    · Prior buy-side experience is strongly preferred (Multi-Strat/Hedge Funds).
    · Capital markets experience is necessary, with good working knowledge of reference data across asset classes and experience with trading systems.
    · AWS cloud experience with common services (S3, Lambda, Cron, EventBridge, etc.).
    · Experience with web-scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright, etc.).
    · Strong hands-on skills with NoSQL and SQL databases, programming in Python, data pipeline orchestration tools, and analytics tools.
    · Familiarity with time series data and common market data sources (Bloomberg, Refinitiv, etc.).
    · Familiarity with modern DevOps practices and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
    · Strong communication skills to work with stakeholders across technology, investment, and operations teams.
    $86k-120k yearly est. 1d ago
  • Senior Data Governance Consultant (Informatica)

    Paradigm Technology 4.2company rating

    Plano, TX jobs

    Senior Data Governance Consultant (Informatica)
    About Paradigm - Intelligence Amplified
    Paradigm is a strategic consulting firm that turns vision into tangible results. For over 30 years, we've helped Fortune 500 and high-growth organizations accelerate business outcomes across data, cloud, and AI. From strategy through execution, we empower clients to make smarter decisions, move faster, and maximize return on their technology investments. What sets us apart isn't just what we do, it's how we do it. Driven by a clear mission and values rooted in integrity, excellence, and collaboration, we deliver work that creates lasting impact. At Paradigm, your ideas are heard, your growth is prioritized, and your contributions make a difference.
    Summary:
    We are seeking a Senior Data Governance Consultant to lead and enhance data governance capabilities across a financial services organization. The Senior Data Governance Consultant will collaborate closely with business, risk, compliance, technology, and data management teams to define data standards, strengthen data controls, and drive a culture of data accountability and stewardship. The ideal candidate will have deep experience in developing and implementing data governance frameworks, data policies, and control mechanisms that ensure compliance, consistency, and trust in enterprise data assets. Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC), is preferred. This position is remote, with occasional travel to Plano, TX.
    Responsibilities:
    · Data Governance Frameworks: Design, implement, and enhance data governance frameworks aligned with regulatory expectations (e.g., BCBS 239, GDPR, CCPA, DORA) and internal control standards.
    · Policy & Standards Development: Develop, maintain, and operationalize data policies, standards, and procedures that govern data quality, metadata management, data lineage, and data ownership.
    · Control Design & Implementation: Define and embed data control frameworks across data lifecycle processes to ensure data integrity, accuracy, completeness, and timeliness.
    · Risk & Compliance Alignment: Work with risk and compliance teams to identify data-related risks and ensure appropriate mitigation and monitoring controls are in place.
    · Stakeholder Engagement: Partner with data owners, stewards, and business leaders to promote governance practices and drive adoption of governance tools and processes.
    · Data Quality Management: Define and monitor data quality metrics and KPIs, establishing escalation and remediation procedures for data quality issues.
    · Metadata & Lineage: Support metadata and data lineage initiatives to increase transparency and enable traceability across systems and processes.
    · Reporting & Governance Committees: Prepare materials and reporting for data governance forums, risk committees, and senior management updates.
    · Change Management & Training: Develop communication and training materials to embed governance culture and ensure consistent understanding across the organization.
    Required Qualifications:
    · 7+ years of experience in data governance, data management, or data risk roles within financial services (banking, insurance, or asset management preferred).
    · Strong knowledge of data policy development, data standards, and control frameworks.
    · Proven experience aligning data governance initiatives with regulatory and compliance requirements.
    · Familiarity with Informatica data governance and metadata tools.
    · Excellent communication skills with the ability to influence senior stakeholders and translate technical concepts into business language.
    · Deep understanding of data management principles (DAMA-DMBOK, DCAM, or equivalent frameworks).
    · Bachelor's or Master's degree in Information Management, Data Science, Computer Science, Business, or related field.
    Preferred Qualifications:
    · Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC).
    · Experience with data risk management or data control testing.
    · Knowledge of financial regulatory frameworks (e.g., Basel, MiFID II, Solvency II, BCBS 239).
    · Certifications such as Informatica, CDMP, or DCAM.
    · Background in consulting or large-scale data transformation programs.
    Key Competencies:
    · Strategic and analytical thinking
    · Strong governance and control mindset
    · Excellent stakeholder and relationship management
    · Ability to drive organizational change and embed governance culture
    · Attention to detail with a pragmatic approach
    Why Join Paradigm:
    At Paradigm, integrity drives innovation. You'll collaborate with curious, dedicated teammates, solving complex problems and unlocking immense data value for leading organizations. If you seek a place where your voice is heard, growth is supported, and your work creates lasting business value, you belong at Paradigm. Learn more at ********************
    Policy Disclosure: Paradigm maintains a strict drug-free workplace policy. All offers of employment are contingent upon successfully passing a standard 5-panel drug screen. Please note that a positive test result for any prohibited substance, including marijuana, will result in disqualification from employment, regardless of state laws permitting its use. This policy applies consistently across all positions and locations.
    $76k-107k yearly est. 2d ago
  • Data Engineer (AWS Redshift, BI, Python, ETL)

    Prosum 4.4company rating

    Manhattan Beach, CA jobs

    We are seeking a skilled Data Engineer with strong experience in business intelligence (BI) and data warehouse development to join our team. In this role, you will design, build, and optimize data pipelines and warehouse architectures that support analytics, reporting, and data-driven decision-making. You will work closely with analysts, data scientists, and business stakeholders to ensure reliable, scalable, and high-quality data solutions.
    Responsibilities:
    · Develop and maintain ETL/ELT pipelines for ingesting, transforming, and delivering data.
    · Design and enhance data warehouse models (star/snowflake schemas) and BI datasets.
    · Optimize data workflows for performance, scalability, and reliability.
    · Collaborate with BI teams to support dashboards, reporting, and analytics needs.
    · Ensure data quality, governance, and documentation across all solutions.
    Qualifications:
    · Proven experience with data engineering tools (SQL, Python, ETL frameworks).
    · Strong understanding of BI concepts, reporting tools, and dimensional modeling.
    · Hands-on experience with cloud data platforms (e.g., AWS, Azure, GCP) is a plus.
    · Excellent problem-solving skills and ability to work in a cross-functional environment.
    $99k-139k yearly est. 5d ago
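The star/snowflake schema work this posting describes can be illustrated in miniature. The following sketch (table and column names are invented) builds a one-dimension star schema in an in-memory SQLite database and runs a typical BI rollup that joins the fact table to its date dimension:

```python
import sqlite3

# In-memory warehouse: one fact table keyed to a date dimension (star schema).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key), amount REAL);
INSERT INTO dim_date VALUES (20240115, 2024, 1), (20240220, 2024, 2);
INSERT INTO fact_sales VALUES (20240115, 100.0), (20240115, 50.0), (20240220, 75.0);
""")

# Typical BI rollup: join fact to dimension and aggregate by month.
rows = con.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year, d.month ORDER BY d.month
""").fetchall()
print(rows)  # → [(2024, 1, 150.0), (2024, 2, 75.0)]
```

A snowflake schema would further normalize the dimension (e.g., splitting year into its own table); the join-then-aggregate pattern stays the same.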
  • Data Engineer

    The Judge Group 4.7company rating

    Jersey City, NJ jobs

    ONLY LOCALS TO NJ/NY - NO RELOCATION CANDIDATES
    Skillset: Data Engineer
    Must Haves: Python, PySpark, AWS (ECS, Glue, Lambda, S3)
    Nice to Haves: Java, Spark, React JS
    Interview Process: 2 rounds; the 2nd will be on site.
    You're ready to gain the skills and experience needed to grow within your role and advance your career - and we have the perfect software engineering opportunity for you. As a Data Engineer III - Python / Spark / Data Lake at JPMorgan Chase within the Consumer and Community Bank, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities will include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.
    Job responsibilities:
    • Supports review of controls to ensure sufficient protection of enterprise data.
    • Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
    • Updates logical or physical data models based on new use cases.
    • Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
    • Adds to team culture of diversity, opportunity, inclusion, and respect.
    • Develops enterprise data models; designs, develops, and maintains large-scale data processing pipelines (and infrastructure); leads code reviews and provides mentoring through the process; drives data quality; ensures data accessibility (to analysts and data scientists); ensures compliance with data governance requirements; and ensures data engineering practices align with business goals.
    Required qualifications, capabilities, and skills:
    • Formal training or certification on data engineering concepts and 2+ years applied experience.
    • Experience across the data lifecycle, advanced experience with SQL (e.g., joins and aggregations), and working understanding of NoSQL databases.
    • Experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis.
    • Extensive experience in AWS, and in the design, implementation, and maintenance of data pipelines using Python and PySpark.
    • Proficient in Python and PySpark; able to write and execute complex queries to perform curation and build views required by end users (single and multi-dimensional).
    • Proven experience in performance tuning to ensure jobs run at optimal levels with no performance bottlenecks.
    • Advanced proficiency in leveraging Gen AI models from Anthropic (or OpenAI, or Google) using APIs/SDKs.
    • Advanced proficiency in a cloud data lakehouse platform such as AWS data lake services, Databricks, or Hadoop; a relational data store such as Postgres, Oracle, or similar; and at least one NoSQL data store such as Cassandra, Dynamo, MongoDB, or similar.
    • Advanced proficiency in a cloud data warehouse such as Snowflake or AWS Redshift.
    • Advanced proficiency in at least one scheduling/orchestration tool such as Airflow, AWS Step Functions, or similar.
    • Proficiency in Unix scripting; data structures; data serialization formats such as JSON, AVRO, Protobuf, or similar; big-data storage formats such as Parquet, Iceberg, or similar; data processing methodologies such as batch, micro-batching, or stream; one or more data modelling techniques such as Dimensional, Data Vault, Kimball, Inmon, etc.; Agile methodology; TDD or BDD; and CI/CD tools.
    Preferred qualifications, capabilities, and skills:
    • Knowledge of data governance and security best practices.
    • Experience in carrying out data analysis to support business insights.
    • Strong Python and Spark skills.
    $79k-111k yearly est. 1d ago
  • Data Engineer (Python, PySpark, Databricks)

    Anblicks 4.5company rating

    Dallas, TX jobs

    Job Title: Data Engineer (Python, PySpark, Databricks)
    Data Engineer with strong proficiency in SQL, Python, and PySpark to support high-performance data pipelines and analytics initiatives. This role will focus on scalable data processing, transformation, and integration efforts that enable business insights, regulatory compliance, and operational efficiency.
    Data Engineer - SQL, Python, and PySpark Expert (Onsite - Dallas, TX)
    Key Responsibilities:
    · Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and PySpark for large-scale data environments.
    · Implement scalable data processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments).
    · Partner with business stakeholders to understand and model mortgage lifecycle data (origination, underwriting, servicing, foreclosure, etc.).
    · Create and maintain data marts, views, and reusable data components to support downstream reporting and analytics.
    · Ensure data quality, consistency, security, and lineage across all stages of data processing.
    · Assist in data migration and modernization efforts to cloud-based data warehouses (e.g., Snowflake, Azure Synapse, GCP BigQuery).
    · Document data flows, logic, and transformation rules.
    · Troubleshoot performance and quality issues in batch and real-time pipelines.
    · Support compliance-related reporting (e.g., HMDA, CFPB).
    Required Qualifications:
    · 6+ years of experience in data engineering or data development.
    · Advanced expertise in SQL (joins, CTEs, optimization, partitioning, etc.).
    · Strong hands-on skills in Python for scripting, data wrangling, and automation.
    · Proficient in PySpark for building distributed data pipelines and processing large volumes of structured/unstructured data.
    · Experience working with mortgage banking data sets and domain knowledge is highly preferred.
    · Strong understanding of data modeling (dimensional, normalized, star schema).
    · Experience with cloud-based platforms (e.g., Azure Databricks, AWS EMR, GCP Dataproc).
    · Familiarity with ETL tools and orchestration frameworks (e.g., Airflow, ADF, dbt).
    $75k-102k yearly est. 2d ago
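Data-quality checks like those required across these pipeline roles often amount to a small rule engine applied to each batch before it is loaded. A minimal, framework-free sketch (column names and thresholds are hypothetical):

```python
def run_quality_checks(rows, required, ranges):
    """Returns a list of (row_index, issue) pairs; an empty list means the batch passes."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness: required columns must be present and non-null.
        for col in required:
            if row.get(col) is None:
                issues.append((i, f"{col} is null"))
        # Validity: numeric columns must fall inside a plausible range.
        for col, (lo, hi) in ranges.items():
            val = row.get(col)
            if val is not None and not (lo <= val <= hi):
                issues.append((i, f"{col} out of range"))
    return issues

batch = [
    {"loan_id": "A1", "rate": 6.5},
    {"loan_id": None, "rate": 6.1},   # missing key
    {"loan_id": "A3", "rate": 95.0},  # implausible rate
]
issues = run_quality_checks(batch, required=["loan_id"], ranges={"rate": (0.0, 25.0)})
print(issues)  # → [(1, 'loan_id is null'), (2, 'rate out of range')]
```

In Spark or Databricks the same checks would typically run as DataFrame filters or via a library such as Great Expectations, with failures routed to an escalation/remediation queue.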
  • Azure Data Engineer Sr

    Resolve Tech Solutions 4.4company rating

    Irving, TX jobs

    Minimum 7 years of relevant work experience in data engineering, with at least 2 years in data modeling. Strong technical foundation in Python and SQL, and experience with cloud platforms (Azure). Deep understanding of data engineering fundamentals, including database architecture and design; extract, transform, and load (ETL) processes; data lakes; data warehousing; and both batch and streaming technologies. Experience with data orchestration tools (e.g., Airflow), data processing frameworks (e.g., Spark, Databricks), and data visualization tools (e.g., Tableau, Power BI). Proven ability to lead a team of engineers, fostering a collaborative and high-performing environment.
    $76k-100k yearly est. 4d ago
  • Sr. Data Engineer (SQL+Python+AWS)

    SGS Technologie 3.5company rating

    Saint Petersburg, FL jobs

    Looking for a Sr. Data Engineer (SQL+Python+AWS) for a 12+ month contract (potential extension or possible conversion to full-time), hybrid at St. Petersburg, FL 33716, with a direct financial client; W2 only, for US Citizens or Green Card holders.
    Notes from the Hiring Manager:
    • Setting up Python environments and data structures to support the Data Science/ML team.
    • No prior Data Science or Machine Learning experience required.
    • Role involves building new data pipelines and managing file-loading connections.
    • Strong SQL skills are essential.
    • Contract-to-hire position.
    • Hybrid role based in St. Pete, FL (33716) only.
    Duties: This role involves building and maintaining data pipelines that connect Oracle-based source systems to AWS cloud environments, providing well-structured data for analysis and machine learning in AWS SageMaker. It includes working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics.
    • Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.).
    • Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads.
    • Implement and manage data ingestion frameworks, including batch and streaming pipelines.
    • Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow.
    • Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency.
    • Optimize data processes for performance and cost efficiency.
    • Implement data quality checks, validation, and governance standards.
    • Work with DevOps and security teams to comply with RJ standards.
    Skills:
    Required:
    • Strong proficiency with SQL and hands-on experience working with Oracle databases.
    • Experience designing and implementing ETL/ELT pipelines and data workflows.
    • Hands-on experience with AWS data services, such as S3, Glue, Redshift, Lambda, and IAM.
    • Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
    • Solid understanding of data modeling, relational databases, and schema design.
    • Familiarity with version control, CI/CD, and automation practices.
    • Ability to collaborate with data scientists to align data structures with model and analytics requirements.
    Preferred:
    • Experience integrating data for use in AWS SageMaker or other ML platforms.
    • Exposure to MLOps or ML pipeline orchestration.
    • Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
    • Knowledge of data warehouse design patterns and best practices.
    • Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
    • Working knowledge of Java is a plus.
    Education: B.S. in Computer Science, MIS, or related degree and a minimum of five (5) years of related experience, or a combination of education, training, and experience.
    $71k-91k yearly est. 2d ago
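Orchestration tools such as Airflow and Step Functions, which the posting above asks for, boil down to running tasks in dependency order. A toy sketch of that idea using the standard library's topological sorter (task names and the dependency graph are illustrative):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps; each "task" just records that it ran.
ran = []
tasks = {
    "extract": lambda: ran.append("extract"),
    "transform": lambda: ran.append("transform"),
    "load": lambda: ran.append("load"),
}
# Each key maps a task to the set of tasks it depends on,
# mirroring how Airflow DAGs wire stages together.
deps = {"transform": {"extract"}, "load": {"transform"}}

# Run every task once, in an order that respects the dependencies.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(ran)  # → ['extract', 'transform', 'load']
```

Real orchestrators add what this sketch omits: scheduling, retries, parallel execution of independent branches, and state tracking across runs.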
  • Python Data Engineer - THADC5693417

    Compunnel Inc. 4.4company rating

    Houston, TX jobs

    Must Haves:
    · Strong proficiency in Python; 5+ years' experience.
    · Expertise in FastAPI and microservices architecture and coding.
    · Linking Python-based apps with SQL and NoSQL databases.
    · Deployments on Docker and Kubernetes, and use of monitoring tools.
    · Experience with automated testing and test-driven development.
    · Git source control, GitHub Actions, CI/CD, VS Code, and Copilot.
    · Expertise in both on-prem SQL databases (Oracle, SQL Server, Postgres, DB2) and NoSQL databases.
    · Working knowledge of data warehousing and ETL.
    · Able to explain the business functionality of the projects/applications they have worked on.
    · Ability to multitask and simultaneously work on multiple projects.
    · NO CLOUD - they are on-prem.
    Day to Day:
    Insight Global is looking for a Python Data Engineer for one of our largest oil and gas clients in Downtown Houston, TX. This person will be responsible for building Python-based relationships between back-end SQL and NoSQL databases, architecting and coding FastAPI services and microservices, and performing testing on back-office applications. The ideal candidate will have experience developing applications utilizing Python and microservices and implementing complex business functionality utilizing Python.
    $78k-101k yearly est. 3d ago
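The FastAPI microservice work described above centers on exposing JSON endpoints over HTTP. As a stdlib-only stand-in (not FastAPI itself; the route and payload are invented), here is a minimal JSON endpoint plus a round-trip request against it:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    """Returns a small JSON body for any GET, echoing the requested path."""
    def do_GET(self):
        body = json.dumps({"status": "ok", "path": self.path}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick any free port
Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload)  # → {'status': 'ok', 'path': '/health'}
```

FastAPI replaces the handler class with decorated route functions, adds request validation via type hints, and generates OpenAPI docs, but the request/response contract is the same idea.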

Learn more about Rackspace jobs