
Hadoop developer jobs near me

- 1,441 jobs
  • Lead Software Engineer - UI/Mobile Development

    Quantum Health 4.7 company rating

    Hadoop developer job in Dublin, OH

Location: This position is located at our Dublin, OH campus with hybrid flexibility or may work remotely anywhere in the United States of America.

Who we are
Founded in 1999 and headquartered in Central Ohio, we're a privately-owned, independent healthcare navigation organization. We believe that no one should have to navigate the cost and complexity of healthcare alone, and we're on a mission to make healthcare simpler and more effective for our millions of members. Our big-hearted, tech-savvy team fights to ensure that our members get the care they need, when they need it, at the most affordable cost - that's why we call ourselves Healthcare Warriors. We're committed to building diverse and inclusive teams - more than 2,000 of us and counting - so if you're excited about this position, we encourage you to apply - even if your experience doesn't match every requirement.

About the role
As a Lead Software Engineer, you'll guide the development of high-quality, user-facing web and mobile experiences while providing technical leadership across the full stack. You'll collaborate with product, design, and backend teams to build scalable solutions and mentor engineers across disciplines. Your primary focus will be leading front-end and mobile architecture using React and React Native, while ensuring seamless integration with backend systems and services.

What you'll do (Essential Responsibilities)
* Lead front-end architecture and development using React, React Native, and TypeScript.
* Collaborate with backend teams (Java, Spring Boot) to design clean APIs and efficient data flows.
* Bridge UI/UX design and backend engineering, ensuring consistency, performance, and maintainability.
* Mentor both front-end and full-stack engineers, promoting modern development practices and code quality.
* Advocate for front-end excellence - performance optimization, accessibility, responsive design, and design system consistency.
* Contribute to full-stack architecture discussions, ensuring end-to-end technical alignment.
* Support DevOps and CI/CD best practices for seamless deployments.
* Stay current with emerging web and mobile technologies and identify opportunities for adoption.
* All other duties as assigned.

What you'll bring (Qualifications)
* Education: Bachelor's or Master's degree in Information Technology, Computer Science, MIS, CIS, or a related field.
* Experience: 8+ years of professional experience in front-end engineering (React, React Native, TypeScript, JavaScript), along with backend development using Java (Spring Boot or similar frameworks).
* Strong understanding of full-stack development and backend systems (Java, REST APIs, databases).
* Proven expertise in building scalable, responsive, and high-performance UI components using ReactJS (React Native experience highly preferred).
* Familiarity with backend technologies (Java, Spring Boot, PostgreSQL, AWS) and architectural best practices.
* Deep expertise in UI architecture, responsive design, accessibility, and performance optimization.
* Experience working with design systems and component libraries such as Material UI or Ant Design.
* Hands-on experience with mobile development using React Native for iOS and Android (PWAs also a plus).
* Deep understanding of cloud architecture best practices, especially within AWS.
* Lead architecture and design for complex systems focused on scalability, performance, and security.
* Proficient in working with relational databases like PostgreSQL and/or SQL Server.
* Experience preparing and evaluating build/buy proposals and RFPs.
* Ability to define and drive technical vision for projects in alignment with strategic business goals.
* Mentor and support engineering teams to foster technical excellence and continuous improvement across front-end and full-stack domains.
* Collaborate with product managers, designers, and cross-functional stakeholders to deliver innovative, high-impact solutions.
* Stay up to date with emerging technologies and assess their value for potential adoption.
* Promote and enforce best practices for software engineering, including code quality, test automation, and CI/CD processes.
* Lead efforts to troubleshoot and resolve critical technical issues, minimizing business disruption.
* Maintain accurate and up-to-date technical documentation for systems and processes.
* Excellent planning, organizational, and problem-solving skills with strong attention to detail.
* Strong verbal and written communication skills; ability to communicate effectively with both technical and non-technical stakeholders.
* Commitment to data security, privacy, and ethical technology practices.
* Must be legally authorized to work in the U.S. on a permanent and ongoing basis without requiring sponsorship.
* Protect and take care of our company's and members' data every day by committing to work within our company ethics and policies.
* Trustworthy and accountable behavior, capable of viewing and maintaining confidential information daily.

#LI-AK1 #LI-Hybrid #LI-Remote

What's in it for you
* Compensation: Competitive base and incentive compensation.
* Coverage: Health, vision and dental featuring our best-in-class healthcare navigation services, along with life insurance, legal and identity protection, adoption assistance, EAP, Teladoc services and more.
* Retirement: 401(k) plan with up to 4% employer match and full vesting on day one.
* Balance: Paid Time Off (PTO), 7 paid holidays, parental leave, volunteer days, paid sabbaticals, and more.
* Development: Tuition reimbursement up to $5,250 annually, certification/continuing education reimbursement, discounted higher education partnerships, paid trainings and leadership development.
* Culture: Recognition as a Best Place to Work for 15+ years, dedication to diversity, philanthropy and sustainability, and people-first values that drive every decision.
* Environment: A modern workplace with a casual dress code, open floor plans, full-service dining, free snacks and drinks, complimentary 24/7 fitness center with group classes, outdoor walking paths, game room, notary and dry-cleaning services and more!

What you should know
* Internal Associates: Already a Healthcare Warrior? Apply internally through Jobvite.
* Process: Application > Phone Screen > Online Assessment(s) > Interview(s) > Offer > Background Check.
* Diversity, Equity and Inclusion: Quantum Health welcomes everyone. We value our diverse team and suppliers, we're committed to empowering our ERGs, and we're proud to be an equal opportunity employer.
* Tobacco-Free Campus: To further enable the health and wellbeing of our associates and community, Quantum Health maintains a tobacco-free environment. The use of all types of tobacco products is prohibited in all company facilities and on all company grounds.
* Compensation Ranges: Compensation details published by job boards are estimates and not verified by Quantum Health. Details surrounding compensation will be disclosed throughout the interview process. Compensation offered is based on the candidate's unique combination of experience and qualifications related to the position.
* Sponsorship: Applicants must be legally authorized to work in the United States on a permanent and ongoing future basis without requiring sponsorship.
* Agencies: Quantum Health does not accept unsolicited resumes or outreach from third parties. Absent a signed MSA and request/approval from Talent Acquisition to submit candidates for a specific requisition, we will not approve payment to any third party.
* Reasonable Accommodation: Should you require reasonable accommodation(s) to participate in the application/interview/selection process, or in order to complete the essential duties of the position upon acceptance of a job offer, click here to submit a recruitment accommodation request.
* Recruiting Scams: Unfortunately, scams targeting job seekers are common. To protect our candidates, we want to remind you that authorized representatives of Quantum Health will only contact you from an email address ending **********************. Quantum Health will never ask for personally identifiable information such as Date of Birth (DOB), Social Security Number (SSN), banking/direct/tax details, etc. via email or any other non-secure system, nor will we instruct you to make any purchases related to your employment. If you believe you've encountered a recruiting scam, report it to the Federal Trade Commission and your state's Attorney General.
    $99k-120k yearly est. 5d ago
  • Junior Data Engineer

    Brooksource 4.1 company rating

    Hadoop developer job in Columbus, OH

Contract-to-Hire | Columbus, OH (Hybrid)

Our healthcare services client is looking for an entry-level Data Engineer to join their team. You will play a pivotal role in maintaining and improving inventory and logistics management programs. Your day-to-day work will include leveraging machine learning and open-source technologies to drive improvements in data processes.

Job Responsibilities
* Automate key processes and enhance data quality
* Improve ingestion processes and enhance machine learning capabilities
* Manage substitutions and allocations to streamline product ordering
* Work on logistics-related data engineering tasks
* Build and maintain ML models for predictive analytics
* Interface with various customer systems
* Collaborate on integrating AI models into customer service

Qualifications
* Bachelor's degree in a related field
* 0-2 years of relevant experience
* Proficiency in SQL and Python (see the sketch after this listing)
* Understanding of GCP/BigQuery (or any cloud experience; basic certifications a plus)
* Knowledge of data science concepts
* Business acumen and understanding (corporate experience or internship preferred)
* Familiarity with Tableau
* Strong analytical skills
* Aptitude for collaboration and knowledge sharing
* Ability to present confidently in front of leaders

Why Should You Apply?
* You will be part of custom technical training and professional development through our Elevate Program!
* Start your career with a Fortune 15 company!
* Access to cutting-edge technologies
* Opportunity for career growth

Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
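For candidates gauging the SQL/Python/BigQuery bar above, here is a minimal, illustrative sketch of running a SQL query against BigQuery from Python. It assumes the google-cloud-bigquery package and default GCP credentials; the dataset, table, and column names are hypothetical, not the client's actual schema.

```python
# Illustrative only: query a (hypothetical) inventory table in BigQuery.
# Assumes `pip install google-cloud-bigquery` and configured credentials.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT product_id, SUM(quantity) AS total_ordered
    FROM `analytics.inventory_orders`   -- hypothetical dataset.table
    GROUP BY product_id
    ORDER BY total_ordered DESC
    LIMIT 10
"""

for row in client.query(sql).result():  # runs the job and waits for rows
    print(row.product_id, row.total_ordered)
```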
    $86k-117k yearly est. 1d ago
  • Data Engineer (Remote USA) G8

    Cisco 4.8 company rating

    Remote Hadoop developer job

The application window is expected to close on 25/12/25. The job posting may be removed earlier if the position is filled or if a sufficient number of applications are received.

Meet the Team
Join the Cisco IT Data team, where innovation, automation, and reliability drive extraordinary business outcomes. Our team delivers scalable, secure, and high-performance platforms supporting Cisco's global data operations. We value a culture of continuous improvement, collaboration, and technical excellence, empowering team members to experiment and drive operational transformation.

Your Impact
As a Data Operations (DevOps) Engineer, you will play a critical role in building, automating, and optimizing the infrastructure and processes that support the Corporate Functions - Enterprise Data Warehouse. Your expertise will ensure the reliability, scalability, and security of data platforms and pipelines across cloud and on-premises environments. You'll collaborate closely with data engineers, software engineers, architects, and business partners to create robust solutions that accelerate data-driven decision-making at Cisco.
* Automate deployment, monitoring, and management of data platforms and pipelines using industry-standard DevOps tools and standard methodologies.
* Build, implement, and maintain CI/CD pipelines for ETL, analytics, and data applications (e.g., Informatica, DBT, Airflow, Python, Java); a minimal orchestration sketch follows this listing.
* Ensure high availability, performance, and security of data systems in cloud (Snowflake, Google BigQuery, AWS/GCP/Azure) and hybrid environments.
* Lead infrastructure as code (Terraform, CloudFormation, or similar) to provision and scale resources efficiently.
* Implement observability and data quality monitoring using modern tools (e.g., Monte Carlo, Prometheus, Grafana, ELK).
* Troubleshoot and resolve issues in production data and workflows, collaborating with engineering and analytics teams for root cause analysis and solution delivery.
* Drive automation and process improvement for data operations, system upgrades, patching, and access management.
* Contribute to security and compliance initiatives related to data governance, access controls, and audit readiness.

Minimum Qualifications
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 3-5 years of experience in DevOps, Data Operations, or related IT engineering roles.
* 3-5 years of proficiency with cloud platforms (Snowflake, AWS) and working knowledge of ETL and workflow orchestration tools (Informatica, DBT, Airflow).
* 3-5 years of hands-on experience with CI/CD tools (Jenkins, GitLab CI, etc.), scripting (Python, Shell), and configuration management.
* 3-5 years of familiarity with infrastructure as code (Terraform, CloudFormation, etc.).
* 3-5 years of experience with monitoring, logging, and alerting solutions (Prometheus, Grafana, ELK, Monte Carlo, etc.).
* 3-5 years of familiarity with containerization and orchestration (Docker, Kubernetes).
* Strong problem-solving and incident management skills.
* Experience working in Agile/Scrum teams and delivering in fast-paced environments.

Preferred Qualifications
* Experience supporting data warehouse or analytics platforms in enterprise settings.
* Knowledge of data quality, security, and governance frameworks.
* Familiarity with automation tools and standard methodologies for operational efficiency.
* Understanding of data pipelines, modeling, and analytics.
* Excellent communication, collaboration, and documentation skills.

**Why Cisco?**
At Cisco, we're revolutionizing how data and infrastructure connect and protect organizations in the AI era - and beyond. We've been innovating fearlessly for 40 years to create solutions that power how humans and technology work together across the physical and digital worlds. These solutions provide customers with unparalleled security, visibility, and insights across the entire digital footprint. Fueled by the depth and breadth of our technology, we experiment and create meaningful solutions. Add to that our worldwide network of doers and experts, and you'll see that the opportunities to grow and build are limitless. We work as a team, collaborating with empathy to make really big things happen on a global scale. Because our solutions are everywhere, our impact is everywhere. We are Cisco, and our power starts with you.

**Message to applicants applying to work in the U.S. and/or Canada:**
The starting salary range posted for this position is $152,500.00 to $219,200.00 and reflects the projected salary range for new hires in this position in U.S. and/or Canada locations, not including incentive compensation*, equity, or benefits. Individual pay is determined by the candidate's hiring location, market conditions, job-related skillset, experience, qualifications, education, certifications, and/or training. The full salary range for certain locations is listed below. For locations not listed below, the recruiter can share more details about compensation for the role in your location during the hiring process.

U.S. employees are offered benefits, subject to Cisco's plan eligibility rules, which include medical, dental and vision insurance, a 401(k) plan with a Cisco matching contribution, paid parental leave, short and long-term disability coverage, and basic life insurance. Please see the Cisco careers site to discover more benefits and perks. Employees may be eligible to receive grants of Cisco restricted stock units, which vest following continued employment with Cisco for defined periods of time.

U.S. employees are eligible for paid time away as described below, subject to Cisco's policies:
+ 10 paid holidays per full calendar year, plus 1 floating holiday for non-exempt employees
+ 1 paid day off for the employee's birthday, a paid year-end holiday shutdown, and 4 paid days off for personal wellness determined by Cisco
+ Non-exempt employees** receive 16 days of paid vacation time per full calendar year, accrued at a rate of 4.92 hours per pay period for full-time employees
+ Exempt employees participate in Cisco's flexible vacation time off program, which has no defined limit on how much vacation time eligible employees may use (subject to availability and some business limitations)
+ 80 hours of sick time off provided on the hire date and each January 1st thereafter, and up to 80 hours of unused sick time carried forward from one calendar year to the next
+ Additional paid time away may be requested to deal with critical or emergency issues for family members
+ An optional 10 paid days per full calendar year to volunteer

For non-sales roles, employees are also eligible to earn annual bonuses subject to Cisco's policies. Employees on sales plans earn performance-based incentive pay on top of their base salary, which is split between quota and non-quota components, subject to the applicable Cisco plan. For quota-based incentive pay, Cisco typically pays as follows:
+ 0.75% of incentive target for each 1% of revenue attainment up to 50% of quota;
+ 1.5% of incentive target for each 1% of attainment between 50% and 75%;
+ 1% of incentive target for each 1% of attainment between 75% and 100%; and
+ Once performance exceeds 100% attainment, incentive rates are at or above 1% for each 1% of attainment, with no cap on incentive compensation.

For non-quota-based sales performance elements such as strategic sales objectives, Cisco may pay 0% up to 125% of target. Cisco sales plans do not have a minimum threshold of performance for sales incentive compensation to be paid.

The applicable full salary ranges for this position, by specific state, are listed below:
New York City Metro Area: $152,500.00 - $252,000.00
Non-Metro New York state & Washington state: $135,800.00 - $224,400.00

* For quota-based sales roles on Cisco's sales plan, the ranges provided in this posting include base pay and sales target incentive compensation combined.
** Employees in Illinois, whether exempt or non-exempt, will participate in a unique time off program to meet local requirements.

Cisco is an Affirmative Action and Equal Opportunity Employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, genetic information, age, disability, veteran status, or any other legally protected basis. Cisco will consider for employment, on a case-by-case basis, qualified applicants with arrest and conviction records.
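As a rough illustration of the CI/CD-managed orchestration work named in this listing, here is a minimal Airflow DAG sketch. It assumes Apache Airflow 2.x; the DAG id and the extract/load callables are hypothetical placeholders, not Cisco's actual pipeline.

```python
# Minimal, illustrative Airflow 2.x DAG: one extract task feeding one load task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull new rows from the source system")    # placeholder logic

def load():
    print("write transformed rows to the warehouse")  # placeholder logic

with DAG(
    dag_id="edw_daily_refresh",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task          # run load only after extract succeeds
```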
    $152.5k-252k yearly 30d ago
  • Principal Data Platform Engineer

    Motive 4.3 company rating

    Remote Hadoop developer job

Who we are:
Motive empowers the people who run physical operations with tools to make their work safer, more productive, and more profitable. For the first time ever, safety, operations and finance teams can manage their drivers, vehicles, equipment, and fleet-related spend in a single system. Combined with industry-leading AI, the Motive platform gives you complete visibility and control, and significantly reduces manual workloads by automating and simplifying tasks. Motive serves nearly 100,000 customers - from Fortune 500 enterprises to small businesses - across a wide range of industries, including transportation and logistics, construction, energy, field service, manufacturing, agriculture, food and beverage, retail, and the public sector. Visit gomotive.com to learn more.

About the Job:
As a Principal Data Platform Engineer, you will take full ownership of key data platform initiatives and the life cycle of data management - including data ingestion, data processing, data storage, querying systems, and cost reduction - toward delivering product features to internal and external Motive customers. We are looking for a technical leader in the Data Platform area who has built full data ingestion, transformation, and analytics systems on AWS and Kubernetes at multiple companies, has faced and solved challenges affecting many feature areas using best practices, and will contribute significantly to driving Motive's Data Platform vision. The Data Platform team works in the following areas:
1. Build scalable systems and services for data ingestion, access, processing and query to enable data-driven product features.
2. Collaborate and work closely with the various stakeholders and our backend product teams to improve and add features to the platform.

*This role is open to candidates in Central & East Coast time zones.

What You'll Do:
* Work with other leaders in the Platform area to define and plan out the long-term strategy for the Data Platform.
* Design and develop scalable distributed systems and frameworks for data management.
* Focus on addressing fault-tolerance and high-availability issues, and work on scaling ingestion pipelines, improving and adding features to the ETL framework while maintaining SLAs on performance, reliability, and system availability.
* Collaborate with engineers across teams to identify and deliver cross-functional features.
* Participate in all aspects of the software development life cycle, from design to implementation and delivery.

What We're Looking For:
* 8+ years of hands-on software engineering experience.
* Backend programming skills, including multi-threading and concurrency, with proficiency in one or more languages such as Python.
* Strong CS fundamentals, including data structures, algorithms, and distributed systems.
* Experience in designing, implementing, and operating highly scalable software systems and services.
* Experience building systems using technologies like Apache Kafka, Apache Spark, Airflow, and Kubernetes (a minimal ingestion sketch follows below).
* Excellent troubleshooting skills and a track record of implementing creative solutions.
* Hands-on experience with containerized platforms like Docker and Kubernetes.
* BS in Computer Science or a related field; Masters preferred.
* Excellent verbal and written skills. You collaborate effectively with other teams and communicate clearly about your work.
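To make the Kafka-based ingestion experience above concrete, here is a minimal consumer-loop sketch. It assumes the kafka-python package; the topic name, broker address, and message fields are hypothetical, not Motive's actual system.

```python
# Illustrative only: consume JSON telemetry events from a (hypothetical) Kafka topic.
# Assumes `pip install kafka-python` and a reachable broker.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "vehicle-telemetry",                   # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # hand each event off to downstream processing / the ETL framework
    print(event.get("vehicle_id"), event.get("recorded_at"))
```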
Pay Transparency
Your compensation may be based on several factors, including education, work experience, and certifications. For certain roles, total compensation may include restricted stock units. Motive offers benefits including health, pharmacy, optical and dental care benefits, paid time off, sick time off, short term and long term disability coverage, life insurance as well as 401k contribution (all benefits are subject to eligibility requirements). Learn more about our benefits by visiting Motive Perks & Benefits.

The compensation range for this position will depend on where you reside. Motive uses three geographic zones to determine pay range. For this role, the compensation ranges are:
* San Francisco, California: $189,000-$236,000 USD
* U.S. metropolitan areas (Los Angeles, San Diego, New York City Area, Seattle, Washington D.C.): $181,000-$226,000 USD
* Other locations in the United States: $164,000-$205,000 USD

Creating a diverse and inclusive workplace is one of Motive's core values. We are an equal opportunity employer and welcome people of different backgrounds, experiences, abilities and perspectives. Please review our Candidate Privacy Notice here. UK Candidate Privacy Notice here.

The applicant must be authorized to receive and access those commodities and technologies controlled under U.S. Export Administration Regulations. It is Motive's policy to require that employees be authorized to receive access to Motive products and technology.

#LI-Remote
    $189k-236k yearly 3d ago
  • Database Developer (Remote)

    Lockheed Martin 4.8 company rating

    Remote Hadoop developer job

Database developer to support front-end systems (as needed by developers across the organization, in support of web services, third-party, or internal development needs), to the exclusion of reporting needs by other departments. Developed code includes, but is not limited to, PL/SQL in the form of Triggers, Procedures, Functions, and Materialized Views (a short sketch of invoking PL/SQL from Python follows this listing). Generates custom-driven applications for intra-department use by business users in a rapid application development platform (primarily APEX). Responsible for functional testing and deployment of code through the development life cycle. Works with end-users to obtain business requirements. Responsible for developing, testing, improving, and maintaining new and existing processes to help users retrieve data effectively. Collaborates with administrators and business users to provide technical support and identify new requirements.

Responsibilities:
* Design stable, reliable and effective database processes.
* Solve database usage issues and malfunctions.
* Gather user requirements and identify new features.
* Provide data management support to users.
* Ensure all database programs meet company and performance requirements.
* Research and suggest new database products, services, and protocols.

Requirements and skills:
* In-depth understanding of data management (e.g. permissions, security, and monitoring).
* Excellent analytical and organization skills.
* An ability to understand front-end user requirements and a problem-solving attitude.
* Excellent verbal and written communication skills.
* Assumes responsibility for related duties as required or assigned.
* Stays informed regarding current computer technologies and relational database management systems, with related business trends and developments.
* Consults with respective IT management in analyzing business functions and management needs, and seeks new and more effective solutions.
* Seeks out new systems and software that reduce processing time and/or provide better information availability and decision-making capability.

Job Type: Full-time
Pay: From $115,000-$128,000 yearly
Expected hours: 40 per week

Benefits:
* Dental insurance
* Vision insurance
* Various health insurance options & wellness plans
* Paid time off (PTO)

Required Knowledge: Considerable knowledge of online systems and design of computer applications.
Required Experience: One to three years of database development/administration experience.
Skills/Abilities: Strong creative and analytical thinking skills. Well organized with strong project management skills. Good interpersonal and supervisory abilities. Ability to train and provide aid to others.
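For context on the PL/SQL work described above, here is a minimal sketch of invoking an Oracle stored procedure from Python. It assumes the python-oracledb package; the DSN, credentials, and procedure name are hypothetical placeholders, not this employer's actual environment.

```python
# Illustrative only: call a (hypothetical) PL/SQL procedure that refreshes
# a materialized view used by an APEX app. Assumes `pip install oracledb`.
import oracledb

conn = oracledb.connect(
    user="app_user",          # hypothetical credentials
    password="change-me",     # use a secrets manager in practice
    dsn="localhost/XEPDB1",   # hypothetical service name
)

with conn.cursor() as cur:
    cur.callproc("refresh_sales_mv")  # hypothetical procedure name

conn.commit()
conn.close()
```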
    $115k-128k yearly 60d+ ago
  • Data Engineer

    Total Quality Logistics, Inc. 4.0 company rating

    Remote Hadoop developer job

Country: USA | State: Ohio | City: Cincinnati

Descriptions & requirements
About the role: As a Data Engineer with TQL, you will support the FP&A department by developing scalable reporting solutions in Microsoft Fabric. This role will focus on migrating data from on-premises systems to the cloud, building and optimizing SQL views and pipelines, and creating governed Power BI datasets and semantic models.

What's in it for you:
* $85,000-$125,000 base salary + performance bonuses
* Advancement opportunities with aggressive and structured career paths
* A culture of continuous education and technical training with reimbursements available
* Hybrid work environment with the ability to work remotely 40 hours per month
* Comprehensive benefits package
* Health, dental and vision coverage
* 401(k) with company match
* Perks including employee discounts, financial wellness planning, tuition reimbursement and more
* Certified Great Place to Work and voted a 2019-2026 Computerworld Best Places to Work in IT

What you'll be doing:
* Migrate FP&A datasets from on-premises to Microsoft Fabric/Lakehouse
* Build and maintain SQL pipelines, transformations, and views that support reporting needs
* Ensure performance, scalability, and reliability through automation, monitoring, and CI/CD best practices
* Design, publish, and manage Power BI certified datasets, semantic models, and reports/dashboards
* Apply best practices in DAX, modeling, and governance to enable accurate, self-service reporting
* Partner with Finance stakeholders to translate reporting requirements into technical deliverables
* Implement processes to ensure accuracy, consistency, and reconciliation across financial and operational systems
* Maintain documentation of data models, business logic, and reporting standards
* Troubleshoot and resolve issues impacting reporting accuracy or performance
* Collaborate with Data Governance and Quality teams to align with enterprise standards and metadata frameworks

What you need:
* Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field
* 3+ years of experience in BI/data engineering or analytics engineering
* Advanced SQL skills with proven experience in building and optimizing large-scale datasets
* Strong Power BI expertise (datasets, DAX, performance tuning, semantic models)
* Hands-on experience with Microsoft Fabric and Lakehouse/cloud data platforms preferred
* Knowledge of financial reporting concepts and ability to work with FP&A stakeholders
* Strong problem-solving skills and ability to bridge Finance and IT needs

Where you'll be: 4289 Ivy Pointe Boulevard, Cincinnati, Ohio 45245

Employment visa sponsorship is unavailable for this position. Applicants requiring employment visa sponsorship now or in the future (e.g., F-1 STEM OPT, H-1B, TN, J1 etc.) will not be considered.

About Us
Total Quality Logistics (TQL) is one of the largest freight brokerage firms in the nation. TQL connects customers with truckload freight that needs to be moved with quality carriers who have the capacity to move it. As a company that operates 24/7/365, TQL manages work-life balance with sales support teams that assist with accounting, and after-hours calls and specific needs. At TQL, the opportunities are endless, which means that there is room for career advancement and the ability to write your own paycheck. What's your worth? Our open and transparent communication from management creates a successful work environment and custom career path for our employees.

TQL is an industry leader in logistics with unlimited potential. Be a part of something big. Total Quality Logistics is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, genetic information, disability or protected veteran status. If you are unable to apply online due to a disability, contact recruiting at ******************
    $85k-125k yearly 5d ago
  • Data Engineer, Senior - Vital CDM (Vitalware)

    Health Catalyst 4.7 company rating

    Remote Hadoop developer job

Join one of the nation's leading and most impactful health care performance improvement companies. Over the years, Health Catalyst has achieved and documented clinical, operational, and financial improvements for many of the nation's leading healthcare organizations. We are also increasingly serving international markets. Our mission is to be the catalyst for massive, measurable, data-informed healthcare improvement through:
* Data: integrate data in a flexible, open & scalable platform to power healthcare's digital transformation
* Analytics: deliver analytic applications & services that generate insight on how to measurably improve
* Expertise: provide clinical, financial & operational experts who enable & accelerate improvement
* Engagement: attract, develop and retain world-class team members by being a best place to work

Data Engineer, Senior - Vital CDM (Vitalware)
Department: Product Development
Reports To: Manager, Data Engineering
Employment Type: Full-time
Location: Remote, US

Position Overview
The Senior Data Engineer supports the Product Development department and is responsible for working with a team of web application and data engineers to implement database solutions for long-term scalability, reliability and performance in a multi-platform, SaaS environment, leveraging both RDBMS and NoSQL solutions. The position requires cross-team communication, attention to detail and the ability to develop innovative technologies and approaches for building high-availability, persistent data systems. The Senior Data Engineer takes direction from the Manager, Data Engineering. This position includes helping scale and refactor an existing public-facing website database and related services, moving resources to Azure, creating and releasing new features for the product, and managing related backend data services. This is a small, dynamic team and a great opportunity to shape the future of a meaningful product. You will be expected to help support and expand the current product while thinking about scale and the future with Azure PaaS.

What you'll own in this role:
* Implement features in collaboration with Product Managers and developers within Agile/Scrum methodology.
* Build solutions that are automated, scalable, and sustainable while minimizing defects and technical debt.
* Evaluate and analyze the current system architecture. Create scalable solutions to improve uptime and responsiveness while providing direction for future state.
* Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements and comply with all applicable standards.
* Research, identify, analyze and correct any technical issues receiving claim transactions and/or provider data.
* Resolve complex data issues and perform quality data checks.
* Receive and understand business requirements and create data mapping specifications.
* Integrate clients' data into our product suite.
* Maintain and optimize several complex databases.
* Investigate and troubleshoot complicated database applications and stability issues.
* Ensure MSSQL databases are operational and provide valid and relevant data.
* Guide our efforts in all areas of database design, performance, and reliability.
* Participate in code reviews that include database changes and effectively communicate issues and risks.
* Integrate new products and software packages and ensure data produced is accurate.
* Optimize code for maximum scalability and maintainability.
* Incorporate unit testing and regression testing to ensure defect-free builds and releases.

What you'll bring to this role:
* BS or MS in Computer Science or equivalent professional experience.
* 6+ years MSSQL Server and/or RDBMS experience with current technology required.
* 6+ years SQL optimization experience required (index optimization strategies, data normalization/de-normalization strategies, plan analysis, recompilation, caching and buffering, optimization tools including SQL Server Extended Events or similar, statistics and their role).
* 3+ years of experience with high-transaction OLTP environments 4+ TB in size.
* A solid understanding of data structures (e.g., XML/SGML/DTD/JSON).
* A solid understanding of parsing and transforming JSON data in SQL Server (a short sketch follows this listing).
* Experience writing complex and efficient SQL stored procedures.
* Deep SQL Server working knowledge including order of operations, transactions and concurrency, file tables and security, brokering technologies, transactional replication, indexing strategies and maintenance, backup and recovery models, multi-node clustering and high availability.
* Familiar with Git and branching strategies.
* Familiar with creating and/or consuming REST APIs using C#, NodeJS, Python, etc.
* Familiar with NoSQL (MongoDB and/or Elasticsearch).
* Azure knowledge highly desired.
* Demonstrable experience implementing enterprise-scale, high-volume, high-availability systems.
* Demonstrated ability to deliver major critical projects.
* Experience with Agile and Scrum team development environments.

Skill and Ability Requirements:
* Must be well organized, accurate and detail oriented.
* Excellent written and verbal communication with technical and non-technical staff.
* Ability to work in complex code bases written by others.
* Strong organizational, presentation, interpersonal and consultative skills a must.
* Ability to manage multiple projects/tasks simultaneously.
* Good judgment and decision-making skills.
* Enthusiastic about sharing knowledge and experience.
* Maintains a positive and results-oriented attitude.

NOTE: This job description is not intended to be all-inclusive. Applicants may perform other related duties as negotiated to meet the ongoing needs of the organization. The above statements describe the general nature and level of work being performed in this job function. They are not intended to be an exhaustive list of all duties, and indeed additional responsibilities may be assigned by Health Catalyst.

Studies show that candidates from underrepresented groups are less likely to apply for roles if they don't have 100% of the qualifications shown in the job posting. While each of our roles have core requirements, please thoughtfully consider your skills and experience and decide if you are interested in the position. If you feel you may be a good fit for the role, even if you don't meet all of the qualifications, we hope you will apply. If you feel you are lacking the core requirements for this position, we encourage you to continue exploring our careers page for other roles for which you may be a better fit.

At Health Catalyst, we appreciate the opportunity to benefit from the diverse backgrounds and experiences of others. Because of our deep commitment to respect every individual, Health Catalyst is an equal opportunity employer.
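As a small illustration of the "parsing and transforming JSON data in SQL Server" requirement above, here is a sketch that runs an OPENJSON query from Python. It assumes the pyodbc package and ODBC Driver 18; the connection string, table, and JSON shape are hypothetical placeholders, not Health Catalyst's actual schema.

```python
# Illustrative only: shred a JSON payload column with OPENJSON.
# Assumes `pip install pyodbc` and a reachable SQL Server instance.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=demo;Trusted_Connection=yes;"  # hypothetical
)

sql = """
SELECT c.claim_id, j.charge_code, j.amount
FROM dbo.claims AS c                      -- hypothetical table with a JSON column
CROSS APPLY OPENJSON(c.payload)
    WITH (charge_code VARCHAR(20)   '$.code',
          amount      DECIMAL(10,2) '$.amount') AS j;
"""

for row in conn.cursor().execute(sql):
    print(row.claim_id, row.charge_code, row.amount)
```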
    $92k-134k yearly est. 19d ago
  • Data Engineer (Mid level)

    Thesis 4.0 company rating

    Remote Hadoop developer job

We're looking for a mid-level Data Engineer in the Eastern timezone (United States) with experience in financial services, crypto, or blockchain data to join our engineering team. You'll help expand our in-house data capabilities and design pipelines that can handle the unique challenges of high-volume, high-integrity financial data.

About Thesis
Thesis* is a pioneering venture studio dedicated to building on Bitcoin since 2014. We seek, fund, and build products and protocols in cryptocurrency and decentralized businesses that enable personal empowerment. Our projects include Mezo, a Bitcoin finance app; Keep Network (now Threshold Network), a privacy protocol for public blockchains; Fold (NASDAQ:FLD), for earning Bitcoin on your purchases; Taho, a community-owned and operated cryptocurrency wallet; Lolli, an app providing Bitcoin rewards for purchases, gaming, and other online commerce; and Embody, a fully encrypted period tracking app. Thesis* continues to challenge traditional systems, driven by innovation and a belief in a sovereign digital future shaping the decentralized landscape one project at a time. To learn more, please visit: ******************

Investors in the company and our projects include Andreessen Horowitz, Pantera, Multicoin, Polychain Capital, and Draper Associates, among others. We are a remote-first company, led by founders who have been operating in the cryptocurrency and web3 space since 2014.

About Mezo
Mezo is Bitcoin's Economic Layer; a new home for Bitcoin holders to cultivate Bitcoin and grow wealth together. It is a Bitcoin-first chain designed for user ownership of assets, reliable bridging with tBTC, a dual staking model for rewards and validation, and much more. Mezo is proudly brought to you by Thesis, the same team behind tBTC, Fold, Acre, Etcher, Taho, Embody, and Defense. We're a fun, down-to-earth, fast-paced, highly collaborative, and fully remote team!

About the Data Engineer
At Mezo, we're building the Bitcoin bank - a financial system where people can bank on themselves. To get there, we need world-class data infrastructure powering everything from on-chain analytics and user insights to credit risk modeling and stablecoin liquidity. You'll be based in the United States (NYC), helping expand our in-house data capabilities and designing pipelines that can handle the unique challenges of high-volume, high-integrity financial data.

What You'll Do
* Architect complex, real-time data pipelines: Design, develop, and optimize ETL pipelines that integrate large data sets from both off- and on-chain sources. Ensure low-latency ingestion and processing of time-sensitive data.
* Proactively optimize and constantly maintain processes. Act as a key contributor to developing and supporting complex data architectures.
* Continually troubleshoot and optimize data systems, identifying issues and resolving them. Proactively improve processes and technologies for more efficient data processing and delivery.
* Ensure data availability, reliability, and performance, as well as data integrity, consistency, and security across systems.
* Collaborate with Data Science: Work with the Data Scientist to write and code-review Python scripts for data ingestion, transformation, and automation. Implement and manage data workflows using Cloud Composer and GitHub Actions for scheduling and orchestration based on Data Science specifications. Build and maintain a high-performance data warehouse schema with Google BigQuery and DBT for data transformation, mapped to the needs of the Data Scientist.
* Work closely with on-chain data: Build data validation, reconciliation, and monitoring systems that meet the standards of both financial services and crypto-native ecosystems (a toy reconciliation sketch follows this listing). Explore new approaches to indexing and querying Bitcoin and Ethereum data, as well as emerging L2s and DeFi protocols.
* Collaborate with cross-functional teams: Partner with product, engineering, and data science to deliver the datasets that drive lending models, stablecoin flows, and new product launches.

Requirements
* 3-6 years in a data engineering role, with at least some experience in DeFi, fintech, or a related field
* Extensive experience with Python and SQL
* Experience with data warehousing solutions (Snowflake, BigQuery, Redshift)
* Strong understanding of Google Tag Manager (familiarity with Data Layer a plus)
* Expertise with orchestration tooling like Fivetran or Airflow, data transformation tools like dbt, and git/GitHub
* Comprehensive understanding of the Google Cloud Platform, including Cloud SQL, Cloud Functions, and BigQuery
* Familiarity with data governance and compliance standards
* Hands-on experience with blockchain or crypto data, including core tools like Dune or Goldsky
* Knowledge of standard ETL patterns, modern data warehousing ideas such as data mesh or data vaulting, and data quality practices

Preferred Qualifications
* Knowledge of real-time data processing and event-driven tracking with analytics.js and/or Segment
* Familiarity with data observability tools and anomaly detection for production systems
* Understanding of financial data governance, reconciliation, and compliance needs
* Experience with on-chain indexing, blockchain ETL, or real-time risk/credit models
* Exposure to data visualization platforms such as Looker Studio, Hex, or Mixpanel
* Prior experience as a data analyst or scientist

Location
Remote in the U.S. - Eastern timezone

Salary
We offer competitive salaries, variable with experience and a number of other factors.

Benefits
At Thesis, we work in a fun, fast-paced environment that operates by collaborating both remotely and in person when we can. We offer a competitive salary, full health benefits, opportunity for equity and a number of other perks.

Our Cultural Tenets
* We Believe in Freedom and Autonomy
* We Have Inquisitive Minds
* We Are Obsessed with Communication
* We Are Proudly Offbeat
* We Care About Each Other
* We Are Driven

Equal Opportunity Statement
Thesis is committed to building a diverse and inclusive team.
We welcome applications from candidates of all backgrounds and do not discriminate based on race, religion, national origin, gender, sexual orientation, age, veteran status, or disability status.
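As a toy illustration of the validation-and-reconciliation systems this role builds, here is a dependency-free Python sketch comparing on-chain totals against warehouse totals. All data here is hypothetical; a real pipeline would pull both sides from sources like an indexer and BigQuery.

```python
# Illustrative only: reconcile per-day transfer totals from two sources.
# The dictionaries stand in for query results; all values are hypothetical.
onchain_totals = {"2025-01-01": 1520, "2025-01-02": 1733}    # e.g. from an indexer
warehouse_totals = {"2025-01-01": 1520, "2025-01-02": 1729}  # e.g. from the warehouse

def reconcile(a: dict, b: dict, tolerance: int = 0) -> list:
    """Return (day, left, right) tuples whose totals disagree beyond the tolerance."""
    mismatches = []
    for day in sorted(set(a) | set(b)):
        left, right = a.get(day, 0), b.get(day, 0)
        if abs(left - right) > tolerance:
            mismatches.append((day, left, right))
    return mismatches

for day, left, right in reconcile(onchain_totals, warehouse_totals):
    print(f"{day}: on-chain={left} warehouse={right} (drift={left - right})")
```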
    $95k-138k yearly est. 60d+ ago
  • Data Engineer II

    Capital Rx 4.1 company rating

    Remote Hadoop developer job

About Judi Health
Judi Health is an enterprise health technology company providing a comprehensive suite of solutions for employers and health plans, including: Capital Rx, a public benefit corporation delivering full-service pharmacy benefit management (PBM) solutions to self-insured employers; Judi Health™, which offers full-service health benefit management solutions to employers, TPAs, and health plans; and Judi, the industry's leading proprietary Enterprise Health Platform (EHP), which consolidates all claim administration-related workflows in one scalable, secure platform. Together with our clients, we're rebuilding trust in healthcare in the U.S. and deploying the infrastructure we need for the care we deserve. To learn more, visit ****************

Location: Remote (for non-local) or hybrid (local to the NYC area or Denver, CO)

Position Summary:
We are seeking a highly motivated and talented Data Engineer to join our team and play a critical role in shaping the future of healthcare data management. This individual will be a key contributor in building robust, scalable, and accurate data systems that empower operational and analytics teams to make informed decisions and drive positive outcomes.

Position Responsibilities:
* Lead the relationship with operational and analytics teams to translate business needs into effective data solutions
* Architect and implement ETL workflows leveraging Capital Rx platforms and technologies such as Python, dbt, SQLAlchemy, Terraform, Airflow, Snowflake, and Redshift (a minimal Snowflake sketch follows this listing)
* Conduct rigorous testing to ensure the flawless execution of data pipelines before production deployment
* Identify, recommend, and implement process improvement initiatives
* Proactively identify and resolve data-related issues, ensuring system reliability and data integrity
* Lead moderately complex projects
* Provide ongoing maintenance and support for critical data infrastructure, including 24x7 on-call availability
* Responsible for adherence to the Capital Rx Code of Conduct, including reporting of noncompliance

Required Qualifications:
* Bachelor's degree in Computer Science, Information Technology, or a related field
* 2+ years of experience working with Airflow, dbt, and Snowflake
* Expertise in data warehousing architecture techniques and familiarity with Kimball methodology
* Minimum 3+ years of experience with a proven track record as a Data Engineer, displaying the ability to design, implement, and maintain complex data pipelines
* 1+ year of experience in Python and SQL
* Capacity to analyze the company's broader data landscape and architect scalable data solutions that support growth
* Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders
* A self-motivated and detail-oriented individual with the ability to tackle and solve intricate technical challenges

Preferred Qualifications:
* 1-3 years of experience as a Data Engineer, ideally in the healthcare or PBM sector
* Advanced proficiency with Airflow, dbt, and Snowflake, coupled with 3+ years of SQL development and Python experience

This range represents the low and high end of the anticipated base salary range for the NY-based position. The actual base salary will depend on several factors such as experience, knowledge, and skills, and if the location of the job changes.
Salary Range: $120,000-$140,000 USD

All employees are responsible for adherence to the Capital Rx Code of Conduct including the reporting of non-compliance. This position description is designed to be flexible, allowing management the opportunity to assign or reassign duties and responsibilities as needed to best meet organizational goals.

Judi Health values a diverse workplace and celebrates the diversity that each employee brings to the table. We are proud to provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, medical condition, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. By submitting an application, you agree to the retention of your personal data for consideration for a future position at Judi Health. More details about Judi Health's privacy practices can be found at *********************************************
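To ground the Airflow/dbt/Snowflake stack this listing keeps returning to, here is a minimal sketch of querying Snowflake from Python. It assumes the snowflake-connector-python package; the account, credentials, and table are hypothetical placeholders, not Capital Rx infrastructure.

```python
# Illustrative only: run one query against a (hypothetical) Snowflake warehouse.
# Assumes `pip install snowflake-connector-python`; use a secrets manager
# rather than hard-coded credentials in real pipelines.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical
    user="etl_user",             # hypothetical
    password="change-me",        # hypothetical
    warehouse="ANALYTICS_WH",
    database="CLAIMS",
    schema="RAW",
)

cur = conn.cursor()
try:
    cur.execute("SELECT claim_id, status FROM claims LIMIT 5")  # hypothetical table
    for claim_id, status in cur.fetchall():
        print(claim_id, status)
finally:
    cur.close()
    conn.close()
```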
    $120k-140k yearly 5d ago
  • Data Engineer II, Business Intelligence

    Resurgent Capital Services 4.4 company rating

    Remote Hadoop developer job

About Us
Fueled by a fundamental belief in innovation, Resurgent Capital Services is an industry-leading financial services company in our sector. It all began 25 years ago when a small group of successful entrepreneurs had a vision for a new type of asset receivables company. One with a commitment to superior service and a personal touch with every interaction. We believe that demonstrating integrity in everything we do, maintaining a strong commitment to compliance, and doing things the right way is a sustainable business model. We want you to feel like your work has an impact and makes a difference every day. Join us as we develop strategies for change and transform the trajectory of your career!

Notice for California Residents - California Privacy Policy

Position: Data Engineer II, Business Intelligence

Roles & Responsibilities:
* Responsible for the design and development of data structures and processes to support the operational and analytical functions of our Performing Acquisition group.
* Acquire, cleanse, load and analyze business intelligence data used to evaluate Consumer Loans.
* Generate periodic technical reports based on analysis of business needs and company financial data.
* Create and maintain custom business intelligence tools, databases, dashboards and systems used to support the analysis of Consumer Loans.
* Follow best practices and coding standards.
* Research emerging development technologies, products and business intelligence processes, applying advanced knowledge of IT concepts and practices.
* Work at the direction of the Spot 18 Executive and Acquisition team.
* Responsible for all aspects of the software development lifecycle, including design, coding, integration testing, deployment and documentation.
* Work closely with the Business Intelligence team to tune data sets and queries involving large data sets.

This position is 100% remote.
Schedule: 40 hours per week, Monday through Friday
Location: Resurgent Capital Services, L.P., 55 Beattie Place, Suite 110, Greenville, SC 29601

Skills & Qualifications:
Bachelor's degree in Information Systems, Computer Science, Econometrics or related field, and 4 years of experience as a Systems Engineer, Data Analyst (all levels), or related role where experience was gained. Also requires experience in the following:
* Four (4) years of direct experience with Transact-SQL using Microsoft SQL Server Database.
* Four (4) years of direct experience with Python, PowerShell or C#.
* Implementing data structures, analytics, and operational reporting.
* Creating data models and data pipelines for acquisition, cleansing, and integration.
* Demonstrated experience in IT concepts and practices.
* Understands and has used software development lifecycles.
* Experience with organizational and project management.

Contact: To apply, email resume to Crystal Drummond at ***********************. Please reference job title and location.

Resurgent is an Equal Opportunity employer that is fueled by our diverse and inclusive work environment. Are you excited about this opportunity, but your skills and experience aren't an exact match? We encourage you to apply anyway! You may be just the person we are searching for to fill this or another position. We would love to consider you for the Resurgent team! All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity or any other factor protected by applicable federal, state, or local laws.
    $88k-123k yearly est. 9d ago
  • Database Developer 1 (Remote)

    Apidel Technologies 4.1 company rating

    Remote Hadoop developer job

Prepares, defines, structures, develops, implements, and maintains database objects. Analyzes query performance, identifies bottlenecks, and implements optimization techniques. Defines and implements interfaces to ensure that various applications and user-installed or vendor-developed systems interact with the required database systems. Creates database structures, writes and tests SQL queries, and optimizes database performance. Plans and develops test data to validate new or modified database applications. Works with business analysts and other stakeholders to understand requirements and integrate database solutions. Builds and implements database systems that meet specific business requirements, ensuring data integrity and security, as well as troubleshooting and resolving database issues. Designs and implements ETL pipelines to integrate data from various sources using SSIS. Responsible for various SQL jobs.

Skills Required
* Strong understanding of SQL and DBMSs like MySQL, PostgreSQL, or Oracle.
* Ability to design and model relational databases effectively.
* Skills in writing and optimizing SQL queries for performance.
* Ability to troubleshoot and resolve database-related issues.
* Ability to communicate technical information clearly and concisely to both technical and non-technical audiences.
* Ability to collaborate effectively with other developers and stakeholders.
* Strong ETL experience, specifically with SSIS.

Skills Preferred
* Azure experience is a plus
* .NET experience is a plus
* GitHub experience is a plus

Experience Required
2 years of progressively responsible programming experience or an equivalent combination of training and experience.

Education Required
Bachelor's degree in Information Technology or Computer Science, or equivalent experience
    $94k-121k yearly est. 4d ago
  • Azure Cloud Data Engineer

    Arc Group 4.3company rating

    Remote hadoop developer job

Job Description: AZURE DATA ENGINEER (REMOTE)

ARC Group has an immediate opportunity for an aspiring Azure / Cloud Data Engineer to join our global client's growing team of data experts. This is a 100% remote, full-time direct hire and requires you to currently hold permanent work authorization (no C2C). The Azure Data Engineer works 100% remotely in the Eastern time zone during core business hours. You will be working with Data Engineering and Data Analytics technologies on the Azure cloud platform. This is a fantastic opportunity to join a well-respected global leader in the supply chain / logistics space and allows for tremendous career growth potential. The company provides competitive salary, bonus, benefits, and room to grow.

AZURE DATA ENGINEER SUMMARY: You will be responsible for expanding and optimizing our Azure Data Lake, as well as optimizing data flow and collection for cross-functional teams. This includes building data pipelines, processing (transforming, aggregating, wrangling) data, and optimizing data systems as well as building them from the ground up. In your role, you will work with our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that optimal data delivery is consistent throughout ongoing projects. In addition to the development of data solutions, you are accountable for the customer's Azure support experience, driving resolution of complex critical problems from identification to full resolution; you will own and manage the customer/user experience and support key customer projects on Azure. Your day-to-day job will be about providing technical expertise as well as being an excellent communicator and a service-oriented professional. You will debug, troubleshoot, correct code, and help resolve business users' issues.

REQUIREMENTS/SKILLS: Bachelor's degree in Computer Science, Information Systems, Engineering, or Mathematics. 6+ years of experience in IT platform implementation in a highly technical and analytical role. 2 to 4 years of recent professional experience with data management technologies. 2 years with Azure Data Factory plus one or more of the following: Azure Synapse (DWH), Azure SQL Database, Azure Databricks, Spark, and Data Explorer. 2 to 4 years of designing big data pipelines supporting data science use cases. Practical experience in Python and at least one of the following programming languages: Scala, Java, Kotlin, Go, Rust, C#, F#, C, C++. Experience building cloud data ingestion pipelines, with data consumable by business intelligence solutions (e.g., Power BI) and/or data scientists. Familiarity with the Azure cloud ecosystem, specifically with data-intensive components like Azure Synapse, Cosmos DB, Data Factory, and Databricks. Robust knowledge of cloud storage (ADLS Gen2). Expertise in cloud security. Routine command of observability approaches for the cloud: logging, monitoring, and tracing.

Would you like to know more about our new opportunity? For immediate consideration, please apply online while viewing all open jobs at ******************* ARC Group is a Forbes-ranked top-20 recruiting and executive search firm working with clients nationwide to recruit the highest quality technical resources. We have achieved this by understanding both our candidates' and clients' needs and goals and serving both with integrity and a shared desire to succeed. ARC Group is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.
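For a rough sense of the pipeline work this posting describes, here is a minimal PySpark sketch - an editor's illustration under stated assumptions: the storage paths, container names, and columns are invented, and a Spark environment such as Azure Databricks is assumed:

    # Hypothetical ADLS Gen2 paths and columns; assumes an existing Spark session host.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("shipment-metrics").getOrCreate()

    # Ingest raw shipment events from the data lake.
    events = spark.read.parquet("abfss://raw@mylake.dfs.core.windows.net/shipments/")

    # Transform: aggregate daily volumes per carrier for downstream BI models.
    daily = (
        events.groupBy("carrier", F.to_date("event_ts").alias("event_date"))
              .agg(F.count("*").alias("shipments"),
                   F.sum("weight_kg").alias("total_kg"))
    )

    # Load the curated table back to the lake for the Power BI layer.
    daily.write.mode("overwrite").parquet(
        "abfss://curated@mylake.dfs.core.windows.net/daily_shipments/"
    )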
    $80k-115k yearly est. 2d ago
  • Backend Developer - Database - USA(Remote)

    Photon Group 4.3company rating

    Remote hadoop developer job

Greetings Everyone! Who are we? For the past 20 years, we have powered many Digital Experiences for the Fortune 500. Since 1999, we have grown from a few people to more than 4,000 team members across the globe engaged in various Digital Modernization initiatives. For a brief 1-minute video about us, you can check *****************************

What will you do? What are we looking for? The requirement is for a DB/BE candidate with strong SQL and PL/SQL skills.

Position Summary: We are seeking a highly skilled backend-focused Staff Software Engineer to join our team. The ideal candidate will have extensive experience in backend development, system design, and a strong understanding of cloud-native software engineering principles.

Responsibilities: Develop backend services using Java and Spring Boot. Design and implement solutions deployed on Google Cloud Platform (GKE). Work with distributed systems, including Google Cloud Spanner (Postgres dialect) and Confluent Kafka (or similar pub/sub tools). Design, optimize, and troubleshoot complex SQL queries and stored procedures (e.g., PL/SQL) to support high-performance data operations and ensure data integrity across applications. Collaborate with teams to implement CI/CD pipelines using GitHub Actions and Argo CD. Ensure high performance and reliability through sound software engineering practices. Mentor and provide technical leadership to the frontend engineering team.

Required Qualifications: 7+ years' experience in software engineering, from ideation to production deployment of IT solutions. 5+ years' experience in the full software development life cycle, including ideation, coding, coding standards, testing, code reviews, and production deployments. 5+ years of experience with backend Java, Spring Boot, and microservices. 3+ years of hands-on experience with a public cloud provider. 3+ years working with pub/sub tools like Kafka or similar. 3+ years of experience with database design/development (Postgres or similar). 2+ years of experience with CI/CD tools (GitHub Actions, Jenkins, Argo CD, or similar).

Preferred Qualifications: Demonstrated experience with development and deployment of Minimum Viable Products (MVPs). Must demonstrate an innovative mindset, divergent thinking, and convergent actions. Familiarity with Kubernetes concepts; experience deploying services on GKE is a plus.

Compensation, Benefits and Duration: Minimum Compensation: USD 44,000. Maximum Compensation: USD 154,000. Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable and good-faith estimate for the role. Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees. This position is available for independent contractors. No applications will be considered if received more than 120 days after the date of this post.
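The pub/sub consumption pattern the posting references can be sketched compactly. This is an editor's illustration in Python using confluent-kafka for brevity - the team itself works in Java/Spring Boot, and the broker address, topic, and group id here are invented:

    # Sketch of a pub/sub consumer; all connection details are hypothetical.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",
        "group.id": "orders-backend",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["order-events"])

    try:
        while True:
            msg = consumer.poll(1.0)  # wait up to 1s for a message
            if msg is None:
                continue
            if msg.error():
                print(f"consumer error: {msg.error()}")
                continue
            # Hand the payload to the persistence layer
            # (e.g., a stored-procedure call against Spanner/Postgres).
            print(f"received: {msg.value().decode('utf-8')}")
    finally:
        consumer.close()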
    $80k-109k yearly est. Auto-Apply 13d ago
  • Senior Data Engineer

    Corel Corporation Usa 4.4company rating

    Remote hadoop developer job

Push the boundaries of tech. In your sweatpants. We're looking for an experienced Senior Data Engineer to help us change how the world works. Here, you'll be part of our Data Engineering & Analytics team, supporting cross-functional groups around the world. The right candidate will develop, review, and maintain data infrastructure and various data flows. You will also develop means to ensure continuous data validation and telemetry for all data processes.

The top creative and technical minds could work anywhere. So why are so many of them choosing Corel? Here are three reasons: This is the moment. It's an exciting time at Corel, with new leadership, a refreshed brand, and a whole new approach to changing the way the world works. We're at the forefront of a movement, and we want you to ride this wave with us. We want you to be you. Too often, companies tell you about their culture and then expect you to fit it. Our culture is built from the people who work here. We want you to feel safe to be who you are, take risks, and show us what you've got. It's your world. We know you have a life. We want to be part of it, but not all of it. At Corel, we're serious about empowering people to work when, how, and where they want. Couch? Sweatpants? Cool with us. We believe that happy employees mean happy customers. That's why we hire amazing people and get out of their way. Sound good so far? Awesome. Let's talk more about the Senior Data Engineer role and see if we're destined to be together.

As a Senior Data Engineer, you will: Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines for the Data Lake and Data Warehouse. Build and implement ETL frameworks to improve code quality and reliability. Build and enforce common design patterns to increase code maintainability. Ensure accuracy and consistency of data processing, results, and reporting. Design cloud-native data pipelines, automation routines, and database schemas that can be leveraged for predictive and prescriptive machine learning. Communicate ideas clearly, both verbally and through concise documentation, to various business sponsors, business analysts, and technical resources. Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

What YOU bring to the team: Expert knowledge of Python. Expert knowledge of SQL. Experience with a DevOps mode of work. 7+ years of professional experience. 5+ years of experience working in data engineering, business intelligence, or a similar role. 5+ years of experience with ETL orchestration and workflow management tools like Airflow, Flink, etc., using AWS/GCP. 3+ years of experience with distributed data processing tools like Spark, Presto, etc., and streaming technologies such as Kafka/Flink. 3+ years of experience with Snowflake (preferred) or another big data database platform. 3+ years of experience with cloud service providers: Amazon AWS (preferred) or one of the other major clouds. Expertise with container orchestration engines (Kubernetes). MS in Computer Science, Software Engineering, or a relevant field preferred; a BS in one of the same fields is acceptable.

Our Team: Corel features award-winning solutions that have helped millions of users for 40 years (can you believe it?!). We offer a fully remote workspace, and we mean it. There is no pressure to work in an office whatsoever. Hours are flexible! You've worked hard to build your life, and we don't want you to give it up for work.
Our team is growing fast, and there's a ton of energy and a lot of really smart, motivated, fun people ready to welcome you in. What are you waiting for? Apply now! We can't wait to meet you. (FYI, we're lucky to have a lot of interest and we so appreciate your application, though please note that we'll only contact you if you've been selected for an interview.)

About Corel: Corel is a beloved and trusted industry titan fueled by make-everything-easier flexibility. With a 40-year legacy of innovation, we understand where you've been, and we're uniquely equipped to get you where you want to be. Our comprehensive collection of creative, collaborative, and productivity solutions propels your teams on their journey. From meeting your deadlines to realizing your dreams, Corel empowers all you do. Our products enable millions of connected knowledge workers around the world to do great work faster. Our success is driven by an unwavering commitment to deliver a broad portfolio of innovative applications - including CorelDRAW, MindManager, Parallels, and WinZip - to inspire users and help them achieve their goals.

It is our policy and practice to offer equal employment opportunities to all qualified applicants and employees without regard to race, color, age, religion, national origin, sex, political affiliation, sexual orientation, marital status, disability, veteran status, genetics, or any other protected characteristic. Corel is committed to an inclusive, barrier-free recruitment and selection process and work environment. If you are contacted for a job opportunity, please advise us of any accommodations that are required. Appropriate accommodations will be provided upon request as required by federal and provincial regulations and company policy. Any information received relating to accommodation will be treated as confidential. #LI-Remote
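As a hedged illustration of the ETL-orchestration experience this posting asks for, here is a minimal Airflow DAG sketch in Python (the DAG id, schedule, and task bodies are placeholders; Airflow 2.x is assumed):

    # Minimal two-step pipeline; task logic is deliberately stubbed out.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw telemetry from source systems")

    def load():
        print("write validated records to the warehouse")

    with DAG(
        dag_id="telemetry_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run extract before load

The same two-task shape maps onto other orchestrators (Flink jobs, managed AWS/GCP schedulers); only the operator layer changes.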
    $84k-121k yearly est. Auto-Apply 60d ago
  • PostgreSQL Database Developer

    Contact Government Services, LLC

    Remote hadoop developer job

PostgreSQL Database Developer
Employment Type: Full Time, Experienced level
Department: Information Technology

CGS is seeking a PostgreSQL Database Developer to join our team supporting a rapidly growing Data Analytics and Business Intelligence platform focused on providing data solutions that empower our federal customers. You will support a migration from the current Oracle database to a Postgres database and manage the database environments proactively. As we continue our growth, you will play a key role in ensuring the scalability of our data systems. CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities.

Skills and attributes for success:
- Drive efforts to migrate from the current Oracle database to the new Microsoft Azure Postgres database
- Create and maintain technical documentation, using defined technical documentation templates, and gain an in-depth knowledge of the business data to propose and implement effective solutions
- Collaborate with internal and external parties to transform high-level technical objectives into comprehensive technical requirements
- Ensure the availability and performance of the databases that support our systems, ensuring that they have sufficient resources allocated to support high resilience and speed
- Perform, and assist developers in, performance tuning
- Proactively monitor the database systems to ensure secure services with minimum downtime, and improve maintenance of the databases, including rollouts, patching, and upgrades
- Work within a structured and Agile development approach

Qualifications:
- Bachelor's degree
- Must be a US citizen
- 7 years of experience administering PostgreSQL databases in Linux environments
- Experience with setting up, monitoring, and maintaining PostgreSQL instances
- Experience with implementing and maintaining PostgreSQL backup and disaster recovery processes
- Experience migrating Oracle schemas, packages, views, and triggers to Postgres using the Ora2Pg tool

Ideally, you will also have:
- Experience implementing and maintaining data warehouses
- Experience with AWS RDS for PostgreSQL
- Experience with Oracle databases
- Experience leveraging the Ora2Pg tool
- Experience working in cloud environments such as Azure and/or AWS
- Prior federal consulting experience

Our Commitment: Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our clients' specific needs. We are committed to solving the most challenging and dynamic problems. For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work.
Here at CGS, we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our clients, maintaining those relationships for years to come. We care about our employees, so we offer a comprehensive benefits package:
- Health, Dental, and Vision
- Life Insurance
- 401k
- Flexible Spending Account (Health, Dependent Care, and Commuter)
- Paid Time Off and Observance of State/Federal Holidays

Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Join our team and become part of government innovation! Explore additional job opportunities with CGS on our Job Board: ************************************* For more information about CGS please visit: ************************** or contact: Email: ******************* #CJ
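One small but real part of an Oracle-to-Postgres migration like the one described is validating that row counts match after cutover. Here is a minimal Python sketch - an editor's illustration; the connection details and table list are hypothetical - using the python-oracledb and psycopg2 drivers:

    # Post-migration row-count check; all names and credentials are invented.
    import oracledb
    import psycopg2

    TABLES = ["cases", "documents", "parties"]

    ora = oracledb.connect(user="app", password="***", dsn="orahost/ORCL")
    pg = psycopg2.connect("host=pghost dbname=app user=app password=***")

    for table in TABLES:
        with ora.cursor() as oc, pg.cursor() as pc:
            oc.execute(f"SELECT COUNT(*) FROM {table}")
            pc.execute(f"SELECT COUNT(*) FROM {table}")
            ora_n, pg_n = oc.fetchone()[0], pc.fetchone()[0]
            status = "OK" if ora_n == pg_n else "MISMATCH"
            print(f"{table}: oracle={ora_n} postgres={pg_n} [{status}]")

Ora2Pg handles the schema and object conversion itself; a check like this is one complementary sanity test among several (checksums and spot queries being others).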
    $78k-104k yearly est. Auto-Apply 60d+ ago
  • Principal Data Engineer - ML Platforms

    Altarum 4.5company rating

    Remote hadoop developer job

Altarum | Data & AI Center of Excellence (CoE)

Altarum is building the future of data and AI infrastructure for public health - and we're looking for a Principal Data Engineer - ML Platforms to help lead the way. In this cornerstone role, you will design, build, and operationalize the modern data and ML platform capabilities that power analytics, evaluation, AI modeling, and interoperability across all Altarum divisions. If you want to architect impactful systems, enable data science at scale, and help ensure public health and Medicaid programs operate with secure, explainable, and trustworthy AI - this role is for you.

What You'll Work On: This role blends deep engineering with applied ML enablement: ML platform engineering (modern lakehouse architecture, pipelines, MLOps lifecycle); applied ML enablement (risk scoring, forecasting, Medicaid analytics); NLP/generative AI support (RAG, vectorization, health communications); causal ML operationalization (evaluation modeling workflows); and responsible/trusted AI engineering (model cards, fairness, compliance). Your work ensures that Altarum's public health and Medicaid programs run on secure, scalable, reusable, and explainable data and AI infrastructure.

What You'll Do

Platform Architecture & Delivery: Design and operate a modern, cloud-agnostic lakehouse architecture using object storage, SQL/ELT engines, and dbt. Build CI/CD pipelines for data, dbt, and model delivery (GitHub Actions, GitLab, Azure DevOps). Implement MLOps systems: MLflow (or equivalent), feature stores, a model registry, drift detection, and automated testing. Engineer solutions in AWS and AWS GovCloud today, with portability to Azure Gov or GCP. Use Infrastructure-as-Code (Terraform, CloudFormation, Bicep) to automate secure deployments.

Pipelines & Interoperability: Build scalable ingestion and normalization pipelines for healthcare and public health datasets, including FHIR R4 / US Core (strongly preferred), HL7 v2 (strongly preferred), Medicaid/Medicare claims & encounters (strongly preferred), SDOH & geospatial data (preferred), and survey, mixed-methods, and qualitative data. Create reusable connectors, dbt packages, and data contracts for cross-division use. Publish clean, conformed, metrics-ready tables for Analytics Engineering and BI teams. Support Population Health in turning evaluation and statistical models into pipelines.

Data Quality, Reliability & Cost Management: Define SLOs and alerting; instrument lineage & metadata; ensure ≥95% of data tests pass. Perform performance and cost tuning (partitioning, storage tiers, autoscaling) with guardrails and dashboards.

Applied ML Enablement: Build production-grade pipelines for risk prediction, forecasting, cost/utilization models, and burden estimation. Develop ML-ready feature engineering workflows and support time-series/outbreak detection models. Integrate ML assets into standardized deployment workflows.

Generative AI Enablement: Build ingestion and vectorization pipelines for surveys, interviews, and unstructured text. Support RAG systems for synthesis, evaluation, and public health guidance. Enable Palladian Partners with secure, controlled-generation environments.

Causal ML & Evaluation Engineering: Translate R/Stata/SAS evaluation code into reusable pipelines. Build templates for causal inference workflows (DID, AIPW, CEM, synthetic controls). Support operationalization of ARA's applied research methods at scale.

Responsible AI, Security & Compliance: Implement Model Card Protocol (MCP) and fairness/explainability tooling (SHAP, LIME). Ensure compliance with HIPAA, 42 CFR Part 2, IRB/DUA constraints, and NIST AI RMF standards. Enforce privacy-by-design: tokenization, encryption, least-privilege IAM, and VPC isolation.

Reuse, Shared Services, and Enablement: Develop runbooks, architecture diagrams, repo templates, and accelerator code. Pair with data scientists, analysts, and SMEs to build organizational capability. Provide technical guidance for proposals and client engagements.

Your First 90 Days: You will make a meaningful impact fast. Expected outcomes include a platform skeleton operational (repo templates, CI/CD, dbt project, MLflow registry, tests); two pipelines in production (e.g., FHIR → analytics and claims normalization); one end-to-end CoE lighthouse MVP delivered (ingestion → model → metrics → BI); and completed playbooks for GovCloud deployment, identity/secrets, rollback, and cost control.

Success Metrics (KPIs): Pipeline reliability meeting SLA/SLO targets. ≥95% of data tests passing across pipelines. MVP dataset onboarding ≤ 4 weeks. Reuse of platform assets across ≥3 divisions. Cost optimization and budget adherence.

What You'll Bring: 7-10+ years in data engineering, ML platform engineering, or cloud data architecture. Expertise in Python, SQL, dbt, and orchestration tools (Airflow, Glue, Step Functions). Deep experience with AWS and AWS GovCloud. CI/CD and IaC experience (Terraform, CloudFormation). Familiarity with MLOps tools (MLflow, SageMaker, Azure ML, Vertex AI). Ability to operate in regulated environments (HIPAA, 42 CFR Part 2, IRB).

Preferred: Experience with FHIR, HL7, Medicaid/Medicare claims, and/or SDOH datasets. Databricks, Snowflake, Redshift, Synapse. Event streaming (Kafka, Kinesis, Event Hubs). Feature store experience. Observability tooling (Grafana, Prometheus, OpenTelemetry). Experience optimizing BI datasets for Power BI.

Logistical Requirements: At this time, we will only accept candidates who are presently eligible to work in the United States and will not require sponsorship. Our organization requires that all work, for the duration of your employment, be completed in the continental U.S. unless required by contract. If you're near one of our offices (Arlington, VA; Silver Spring, MD; or Novi, MI), you'll join us in person one day every other month (6 times per year) for a fun, purpose-driven Collaboration Day. These days are filled with creative energy, meaningful connection, and team brainstorming! Must be able to work Eastern Time hours unless otherwise approved by your manager. Employees working remotely must have a dedicated, ergonomically appropriate workspace free from distractions, with a mobile device that allows for productive and efficient conduct of business.

Altarum is a nonprofit organization focused on improving the health of individuals with fewer financial resources and populations disenfranchised by the health care system. We work primarily on behalf of federal and state governments to design and implement solutions that achieve measurable results. We combine our expertise in public health and health care delivery with technology development and implementation, practice transformation, training and technical assistance, quality improvement, data analytics, and applied research and evaluation. Our innovative solutions and proven processes lead to better value and health for all.
Altarum is an equal opportunity employer that provides employment and opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, or any other characteristic protected by applicable law.
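To illustrate the MLflow-style registry workflow this posting references, here is a minimal, hedged Python sketch (the experiment name, model, metric, and registered-model name are invented, and registering a model assumes a registry-enabled MLflow tracking backend):

    # Train a toy classifier and record it in MLflow; everything here is illustrative.
    import mlflow
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=500, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X, y)

    mlflow.set_experiment("risk-scoring")
    with mlflow.start_run():
        mlflow.log_param("max_iter", 1000)
        mlflow.log_metric("train_accuracy", model.score(X, y))
        # Registering under a name makes the model visible to deployment pipelines;
        # requires a database-backed tracking store, not the plain file store.
        mlflow.sklearn.log_model(model, "model", registered_model_name="risk_scorer")

From the registry, downstream steps (drift checks, staged promotion, serving) can reference the model by name and version rather than by file path.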
    $72k-98k yearly est. Auto-Apply 20d ago
  • Senior Data Engineer

    Lower LLC 4.1company rating

    Hadoop developer job in Columbus, OH

Here at Lower, we believe homeownership is the key to building wealth, and we're making it easier and more accessible than ever. As a mission-driven fintech, we simplify the home-buying process through cutting-edge technology and a seamless customer experience. With tens of billions in funded home loans and top ratings on Trustpilot (4.8), Google (4.9), and Zillow (4.9), we're a leader in the industry. But what truly sets us apart? Our people. Join us and be part of something bigger.

Job Description: We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights and decision-making. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.

What you'll do: Data Pipeline Engineering: Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake. Build and manage real-time ingestion pipelines leveraging AWS Lambda and CDC systems. Cloud & Infrastructure: Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns. Manage containerized applications using Docker and infrastructure as code via GitHub Actions. Advanced Data Management: Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance. Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability. Quality Assurance & Operations: Implement robust testing frameworks, data lineage tracking, monitoring, and alerting. Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.

Who you are: 5+ years of data engineering experience, ideally with cloud-native architectures. Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing. Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization. Strong proficiency with dbt, including macro development, testing, and automated deployments. Production-grade pipeline experience, specifically with Lambda, S3, API Gateway, and IAM. Proven experience with REST APIs, authentication patterns, and handling complex data integrations.

Preferred Experience: Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance. Experience with real-time streaming platforms like Kafka or Kinesis. Familiarity with Infrastructure as Code tools (Terraform, CloudFormation). Knowledge of BI and data visualization tools (Tableau, Looker, Domo). Container orchestration experience (ECS, Kubernetes). Understanding of data lake architectures and Delta Lake.

Technical Skills: Programming: Python (expert), SQL (expert), Bash scripting. Cloud: AWS (Lambda, S3, API Gateway, CloudWatch, IAM). Data Warehouse: Snowflake, dimensional modeling, query optimization. ETL/ELT: dbt, pandas, custom Python workflows.
DevOps: GitHub Actions, Docker, automated testing. APIs: REST integration, authentication, error handling. Data Formats: JSON, CSV, Parquet, Avro. Version Control: Git, GitHub workflows.

What Sets You Apart: Systems Thinking: You see the big picture, designing data flows that scale and adapt with the business. Problem Solver: You quickly diagnose and resolve complex data issues across diverse systems and APIs. Quality Advocate: You write comprehensive tests, enforce data quality standards, and proactively prevent data issues. Collaborative: You thrive working alongside analysts, developers, and product teams, ensuring seamless integration and teamwork. Continuous Learner: You actively seek out emerging data technologies and best practices to drive innovation. Business Impact: You understand how your data engineering decisions directly influence and drive business outcomes.

Benefits & Perks: Competitive salary and comprehensive benefits (healthcare, dental, vision, 401k match). Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio). Professional growth opportunities and internal promotion pathways. Collaborative, mission-driven culture recognized as a local and national "best place to work." If you don't think you meet all of the criteria above but are still interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.

Lower provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Privacy Policy
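As a sketch of the event-driven, Lambda-based ingestion the posting describes - an editor's illustration only; the bucket name, key layout, and SQS-style event shape are assumptions - a minimal handler might look like:

    # AWS Lambda handler writing CDC-style change events to an S3 landing zone.
    # Bucket, key prefix, and payload fields are hypothetical.
    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Each record is assumed to be a change event from the loan system,
        # delivered via an SQS trigger.
        for record in event.get("Records", []):
            payload = json.loads(record["body"])
            key = f"raw/loans/{payload['loan_id']}.json"
            s3.put_object(
                Bucket="analytics-landing-zone",
                Key=key,
                Body=json.dumps(payload).encode("utf-8"),
            )
        return {"statusCode": 200}

Downstream, a Snowflake external stage over the same bucket would let dbt models pick up the landed files.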
    $72k-92k yearly est. Auto-Apply 60d+ ago
  • Senior Data Engineer

    City Year 4.2company rating

    Remote hadoop developer job

Application Instructions: Click Apply to submit your online application. Please attach a resume and a thoughtful cover letter on the "My Experience" page in the "Resume/CV" field. Active City Year staff members must log in to Workday to apply internally.

Number of Positions: 1
Work Location: 100% Remote

Position Overview: The Senior Data Engineer works closely with local City Year data experts, school district IT professionals, and a multitude of partners to manage the ingestion of data from many sources. Our ideal candidate is professionally experienced in all things Azure and familiar with DevOps. They will use their Azure experience, especially with Azure Data Factory and Databricks, to lead the end-to-end development and implementation of modern ETL/ELT data pipelines for our team. This candidate will be excited to promote the effective use of timely and accurate K-12 education data and empower front-line practitioners with the information needed to have a greater impact on the students we serve. The Senior Data Engineer reports to the Director of Data Management and Reporting.

Job Description: As a Senior Data Engineer at City Year, you will: Be our resident Azure expert and trusted advisor for Azure. Own the design, development, and implementation of modern data pipelines, data factories, and data streams - this is a hands-on role. Lead the planning and then implement the data platform services, including sizing, configuration, and needs assessment. Own the management and development of various third-party data integrations. Lead development of frameworks and data architectures for large-scale data processing that ensure timely and accurate processing of data into and among City Year's systems, and help implement them. Influence and make recommendations on the technical direction for the team by leveraging your prior experiences and your knowledge of emerging technologies and approaches. Lead our team in identifying and promoting data management best practices. Conduct full technical discovery, identify pain points, gather business and technical requirements, and explain the "as is" and "to be" scenarios to fully understand user stories from stakeholders and users. Lead and participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications on the Azure platform. Serve as a leader in bridging the gap between technical and non-technical staff in understanding our data and its processing at City Year. Own the implementation and use of ETL and data management best practices, DevOps, CI/CD, and deployment strategies to ensure product quality, agility, robustness, and recoverability. Partner with districts and internal customers to understand their requirements and implement data solutions. Create integrations between internal systems and support them operationally. Teach others in the DMAR team about Azure, Databricks, and best practices to ensure all services and technology implemented can be supported by others on the team.
You must have at least 3+ years of professional experience (not simply coursework or capstone projects) in the following:
- Working in an Azure environment on ETL/ELT projects
- Azure DevOps
- Azure Databricks
- Azure Data Factory
- SQL
- Python
- Relational databases
- Working with heterogeneous datasets/formats from many vendors, providers, and platforms
- Data architecture

You must also have: experience and demonstrated ability to successfully integrate and manage data across disparate information systems; excellent written and verbal communication skills, especially the ability to express complex technical information in understandable and rigorously accurate terms, and in a manner that non-technical users can understand; attention to detail while working on multiple projects at once; and demonstrated success and effectiveness working in, and promoting, a rapidly changing, collaborative, and time-critical work environment.

Nice to have - some experience with:
- Databricks Lakehouse
- Microsoft Fabric
- Archiving and backup solutions in Microsoft Azure
- A role at a school district, Charter Management Organization (CMO), or Ed-tech company with an emphasis on supporting end-user data needs
- An Agile environment
We also value a commitment to continuous improvement and to City Year's mission: our desire to focus talents on helping improve outcomes for kids in school and our AmeriCorps members who support them. You love learning new things. You're curious and ask good questions. You solicit feedback from others, accept it with grace, and act on it.

What we offer: Your technical skills will be used to have a significant, positive impact on children's education outcomes and future opportunities. A role on a small, high-powered team that is integral to delivering on the mission of City Year. Opportunity for control over the design and implementation of solutions. Space and opportunity to develop new technical skills. A focus on creating a work environment that is diverse, inclusive, equitable, and encourages belonging. The opportunity to work with some amazing people!

Benefits: Full-time employees will be eligible for all benefits including vacation, sick days, and organization holidays. You may participate in all benefit programs that City Year establishes and makes available to eligible employees, under (and subject to all provisions of) the plan documents that govern those programs. Currently, City Year offers medical, dental, vision, life, accidental death and dismemberment, and disability coverage, Flexible Spending Accounts (FSA), and other benefits including 401(k) plan(s) pursuant to the terms and conditions of company policy and the 401(k) plan document. For more information, click here. Employment at City Year is at-will. City Year does not sponsor work authorization visas.
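To make the ETL/ELT pipeline work concrete, here is a hedged PySpark sketch of one Databricks ingestion step - the storage paths, district layout, and columns are invented, and a Delta-enabled Spark environment such as Azure Databricks is assumed:

    # Normalize one district's CSV extract and append it to a shared Delta table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("district-ingest").getOrCreate()

    # Districts deliver attendance extracts in varying CSV layouts.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("abfss://landing@cystorage.dfs.core.windows.net/district_42/attendance/"))

    # Standardize column names and stamp the load date before merging districts.
    clean = (raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
                .withColumn("load_date", F.current_date()))

    clean.write.format("delta").mode("append").save(
        "abfss://curated@cystorage.dfs.core.windows.net/attendance/")

In practice, Azure Data Factory would orchestrate a step like this per district, with the heterogeneous layouts reconciled in the rename/normalize stage.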
    $74k-95k yearly est. Auto-Apply 60d+ ago
  • Big Data Engineer

    Hexaware Technologies 4.2company rating

    Remote hadoop developer job

Job Description:
1. 3-5 years in data platform engineering
2. Experience with CI/CD, IaC (Terraform), and containerization with Docker/Kubernetes
3. Hands-on experience building backend applications such as APIs and services (see the sketch below)
4. Proven track record of building scalable data engineering pipelines using Python, SQL, and dbt Core/Cloud
5. Experience working with MWAA (Airflow) or a similar cloud-based data engineering orchestration tool
6. Experience working with cloud ecosystems like AWS, Azure, or GCP and modern data tools like Snowflake, Databricks, etc.
7. Strong problem-solving skills; the ability to move in a fast-paced environment is a plus
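The "backend applications such as APIs" in item 3 could look like the following minimal FastAPI sketch - an editor's illustration; the endpoint, pipeline names, and status payload are invented:

    # Tiny service exposing pipeline status to other teams; data is hard-coded here,
    # whereas a real service would query the orchestrator (e.g., Airflow/MWAA).
    from fastapi import FastAPI

    app = FastAPI()

    PIPELINES = {"daily_load": "success", "hourly_sync": "running"}

    @app.get("/pipelines/{name}")
    def pipeline_status(name: str):
        # Return the last known state for the requested pipeline.
        return {"pipeline": name, "status": PIPELINES.get(name, "unknown")}

Run with `uvicorn app:app` and query `/pipelines/daily_load` to see the JSON response.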
    $78k-103k yearly est. Auto-Apply 26d ago
  • Senior Auto Technician - ASE CERTIFIED - Westerville - Schrock Rd

    Boyd's Tire & Service 4.4company rating

    Hadoop developer job in Westerville, OH

Boyd's Tire & Service has been providing drivers in Central Ohio with the best automotive products and services since 1996. We strive to provide you with high-quality tires and reliable car repairs at our several locations in Columbus, Blacklick, Hilliard, Lewis Center, Marysville, and Central Ohio. Our staff is ready to go above and beyond to help you meet your needs, to get you back on the road, satisfied. The Automotive Technician is responsible for effectively and efficiently diagnosing and repairing customer vehicles while adhering to the MAP guidelines and in accordance with dealership, manufacturer, and Boyd's Tire standards.

COMPENSATION: Pay ranges from $25-$40 per hour depending on experience (hourly plus flag rate).

Principal Duties and Responsibilities: Diagnoses vehicles according to the appropriate level of certifications/experience. Performs work as outlined on the Multi-Point Inspection and/or Repair Order with efficiency and accuracy. Explains technical diagnoses and needed repairs to non-mechanical individuals, who may include the Store Manager, Service Consultants, and/or customers. Recommends services that are necessary to keep the customer's vehicle in running condition; properly documents all recommendations in the customer file. Follows all safety procedures and reports any concerns to the Shop Foreman or Store Manager. Maintains appropriate ASE certifications and renewals of expiring certifications.

Automotive Technician Benefits: Competitive bi-weekly pay. Tuition reimbursement. Paid vacation and sick time. 6 paid holidays. Medical, dental, and vision insurance. Life insurance (company paid). 401(k) retirement savings plan with company match. Discounted services on personal and immediate-family vehicles. Opportunity for advancement!

Qualifications: Prefer a minimum of one unexpired ASE certification, or equivalent experience or training (3+ years of senior-level experience). Possess a valid driver's license. Must be at least 18 years of age. Ability to work a minimum of five days, including Saturdays.

Sun Auto Tire & Service provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

Job Industries: Automotive
    $25-40 hourly 16d ago

Learn more about hadoop developer jobs

Browse computer and mathematical jobs