ETL/ELT Data Engineer (Secret Clearance) - Hybrid
Austin, TX jobs
LaunchCode is recruiting for a Software Data Engineer to work at one of our partner companies!
Details:
Full-Time W2, Salary
Immediate opening
Hybrid - Austin, TX (onsite 1-2 times a week)
Pay: $85K-$120K
Minimum Experience: 4 years
Security Clearance: Active DoD Secret Clearance
Disclaimer: Please note that we are unable to provide work authorization or sponsorship for this role, now or in the future. Candidates requiring current or future sponsorship will not be considered.
Job description
Job Summary
The company is a Washington, DC-based software solutions provider, founded in 2017, that specializes in delivering mission-critical and enterprise solutions to the federal government. Originating from the Department of Defense's software factory ecosystem, the company focuses on Command and Control, Cybersecurity, Space, Geospatial, and Modeling & Simulation. The company leverages commercial technology to enhance the capabilities of the DoD, IC, and their end-users, with innovation driven by its Innovation centers. The company has a presence in Boston, MA; Colorado Springs, CO; San Antonio, TX; and St. Louis, MO.
Why the company?
Environment of Autonomy
Innovative Commercial Approach
People over process
We are seeking a passionate Software Data Engineer to support the Army Software Factory (ASWF) in aligning with DoDM 8140.03 Cyber Workforce requirements and broader compliance mandates. The Army Software Factory (ASWF), a first-of-its-kind initiative under Army Futures Command, is revolutionizing the Army's approach to software development by training and employing self-sustaining technical talent from across the military and civilian workforce. Guided by the motto “By Soldiers, For Soldiers,” ASWF equips service members to develop mission-critical software solutions independently, a capability especially vital for future contested environments where traditional technical support may be unavailable. This initiative also serves as a strategic prototype to modernize legacy IT processes and build technical readiness across the force to ensure battlefield dominance in the digital age.
Required Skills:
Active DoD Secret Clearance (Required)
4+ years of experience in data science, data engineering, or similar roles.
Expertise in designing, building, and maintaining scalable ETL/ELT pipelines using tools and languages such as Python, SQL, Apache Spark, or Airflow.
Strong proficiency in working with relational and NoSQL databases, including experience with database design, optimization, and query performance tuning (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
Demonstrable experience with cloud data platforms and services (e.g., AWS Redshift, S3, Glue, Athena; Azure Data Lake, Data Factory, Synapse; Google BigQuery, Cloud Storage, Dataflow).
Solid understanding of data warehousing concepts (e.g., Kimball, Inmon methodologies) and experience with data modeling for analytical purposes.
Proficiency in at least one programming language commonly used in data engineering (e.g., Python, Java, Scala) for data manipulation, scripting, and automation.
CompTIA Security+ Certified or otherwise DoDM 8140.03 (formerly DoD 8570.01-M) compliant.
Nice to Have:
Familiarity with SBIR technologies and transformative platform shifts
Experience working in Agile or DevSecOps environments
2+ years of experience interfacing with Platform Engineers and data visibility teams, managing AWS resources, and performing GitLab administration
#LI-hybrid #austintx #ETLengineer #dataengineer #army #aswf #clearancejobs #clearedjobs #secretclearance #ETL
Senior Data Engineer
Nashville, TN jobs
Concert is a software and managed services company that promotes health by providing the digital infrastructure for reliable and efficient management of laboratory testing and precision medicine. We are wholeheartedly dedicated to enhancing the transparency and efficiency of health care. Our customers include health plans, provider systems, laboratories, and other important stakeholders. We are a growing organization driven by smart, creative people to help advance precision medicine and health care. Learn more about us at ***************
YOUR ROLE
Concert is seeking a skilled Senior Data Engineer to join our team. Your role will be pivotal in designing, developing, and maintaining our data infrastructure and pipelines, ensuring robust, scalable, and efficient data solutions. You will work closely with data scientists, analysts, and other engineers to support our mission of automating the application of clinical policy and payment through data-driven insights.
You will be joining an innovative, energetic, passionate team who will help you grow and build skills at the intersection of diagnostics, information technology and evidence-based clinical care.
As a Senior Data Engineer you will:
Design, develop, and maintain scalable and efficient data pipelines using AWS services such as Redshift, S3, Lambda, ECS, Step Functions, and Kinesis Data Streams.
Implement and manage data warehousing solutions, primarily with Redshift, and optimize existing data models for performance and scalability.
Utilize DBT (data build tool) for data transformation and modeling, ensuring data quality and consistency.
Develop and maintain ETL/ELT processes to ingest, process, and store large datasets from various sources.
Work with SageMaker for machine learning data preparation and integration.
Ensure data security, privacy, and compliance with industry regulations.
Collaborate with data scientists and analysts to understand data requirements and deliver solutions that meet their needs.
Monitor and troubleshoot data pipelines, identifying and resolving issues promptly.
Implement best practices for data engineering, including code reviews, testing, and automation.
Mentor junior data engineers and share knowledge on data engineering best practices.
Stay up-to-date with the latest advancements in data engineering, AWS services, and related technologies.
After 3 months on the job you will have:
Developed a strong understanding of Concert's data engineering infrastructure
Learned the business domain and how it maps to the information architecture
Made material contributions towards existing key results
After 6 months you will have:
Led a major initiative
Become the first point of contact when issues related to the data warehouse are identified
After 12 months you will have:
Taken responsibility for the long term direction of the data engineering infrastructure
Proposed and executed key results with an understanding of the business strategy
Communicated the business value of major technical initiatives to key non-technical business stakeholders
WHAT LEADS TO SUCCESS
Self-Motivated: A team player with a positive attitude and a proactive approach to problem-solving.
Executes Well: You are biased to action and get things done. You acknowledge unknowns and recover from setbacks well.
Comfort with Ambiguity: You aren't afraid of uncertainty or blazing new trails; you care about building toward a future that is different from today.
Technical Bravery: You are comfortable with new technologies and eager to dive in to understand data in its raw and processed states.
Mission-Focused: You are personally motivated to drive more affordable, equitable, and effective integration of genomic technologies into clinical care.
Effective Communication: You build rapport and strong working relationships with senior leaders and peers, and use those relationships to drive the company forward.
RELEVANT SKILLS & EXPERIENCE
Minimum of 4 years' experience working as a data engineer
Bachelor's degree in software or data engineering or comparable technical certification / experience
Ability to effectively communicate complex technical concepts to both technical and non-technical audiences.
Proven experience in designing and implementing data solutions on AWS, including Redshift, S3, Lambda, ECS, and Step Functions
Strong understanding of data warehousing principles and best practices
Experience with DBT for data transformation and modeling.
Proficiency in SQL and at least one programming language (e.g., Python, Scala)
Familiarity or experience with the following tools/concepts is a plus: BI tools such as Metabase; healthcare claims data, security requirements, and HIPAA compliance; Kimball's dimensional modeling techniques; zero-ETL and Kinesis data streams
COMPENSATION
Concert is seeking top talent and offers competitive compensation based on skills and experience. Compensation will be commensurate with experience. This position will report to the VP of Engineering.
LOCATION
Concert is based in Nashville, Tennessee and supports a remote work environment.
For further questions, please contact: ******************.
Senior Data Engineer
Charlotte, NC jobs
**NO 3rd Party vendor candidates or sponsorship**
Role Title: Senior Data Engineer
Client: Global construction and development company
Employment Type: Contract
Duration: 1 year
Preferred Location: Remote based in ET or CT time zones
Role Description:
The Senior Data Engineer will play a pivotal role in designing, architecting, and optimizing cloud-native data integration and Lakehouse solutions on Azure, with a strong emphasis on Microsoft Fabric adoption, PySpark/Spark-based transformations, and orchestrated pipelines. This role will lead end-to-end data engineering, from ingestion through APIs and Azure services to curated Lakehouse/warehouse layers, while ensuring scalable, secure, well-governed, and well-documented data products. The ideal candidate is hands-on in delivery and also brings data architecture knowledge to help shape patterns, standards, and solution designs.
Key Responsibilities
Design and implement end-to-end data pipelines and ELT/ETL workflows using Azure Data Factory (ADF), Synapse, and Microsoft Fabric.
Build and optimize PySpark/Spark transformations for large-scale processing, applying best practices for performance tuning (partitioning, joins, file sizing, incremental loads).
Develop and maintain API-heavy ingestion patterns, including REST/SOAP integrations, authentication/authorization handling, throttling, retries, and robust error handling.
Architect scalable ingestion, transformation, and serving solutions using Azure Data Lake / OneLake, Lakehouse patterns (Bronze/Silver/Gold), and data warehouse modeling practices.
Implement monitoring, logging, alerting, and operational runbooks for production pipelines; support incident triage and root-cause analysis.
Apply governance and security practices across the lifecycle, including access controls, data quality checks, lineage, and compliance requirements.
Write complex SQL, develop data models, and enable downstream consumption through analytics tools and curated datasets.
Drive engineering standards: reusable patterns, code reviews, documentation, source control, and CI/CD practices.
Requirements:
Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering with strong focus on Azure Cloud.
Strong experience with Azure Data Factory pipelines, orchestration patterns, parameterization, and production support.
Strong hands-on experience with Synapse (pipelines, SQL pools and/or Spark), and modern cloud data platform patterns.
Advanced PySpark/Spark experience for complex transformations and performance optimization.
Heavy experience with API-based integrations (building ingestion frameworks, handling auth, pagination, retries, rate limits, and resiliency).
Strong knowledge of SQL and data warehousing concepts (dimensional modeling, incremental processing, data quality validation).
Strong understanding of cloud data architectures including Data Lake, Lakehouse, and Data Warehouse patterns.
Preferred Skills
Experience with Microsoft Fabric (Lakehouse/Warehouse/OneLake, Pipelines, Dataflows Gen2, notebooks).
Architecture experience (formal or informal), such as contributing to solution designs, reference architectures, integration standards, and platform governance.
Experience with DevOps/CI-CD for data engineering using Azure DevOps or GitHub (deployment patterns, code promotion, testing).
Experience with Power BI and semantic model considerations for Lakehouse/warehouse-backed reporting.
Familiarity with data catalog/governance tooling (e.g., Microsoft Purview).
Data Engineer (Databricks)
Columbus, OH jobs
ComResource is searching for a highly skilled Data Engineer with a background in SQL and Databricks who can handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and also research new uses for data acquisition.
Responsibilities:
Design, construct, install, test and maintain data management systems.
Build high-performance algorithms, predictive models, and prototypes.
Ensure that all systems meet the business/company requirements as well as industry practices.
Integrate up-and-coming data management and software engineering technologies into existing data structures.
Develop set processes for data mining, data modeling, and data production.
Create custom software components and analytics applications.
Research new uses for existing data.
Employ an array of technological languages and tools to connect systems together.
Recommend different ways to constantly improve data reliability and quality.
Qualifications:
5+ years of data quality engineering experience
Experience with Cloud-based systems, preferably Azure
Databricks and SQL Server testing
Experience with ML tools and LLMs
Test automation frameworks
Python and SQL for data quality checks
Data profiling and anomaly detection
Documentation and quality metrics
Healthcare data validation experience preferred
Test automation and quality process development
Plus:
Azure Databricks
Azure Cognitive Services integration
Databricks Foundation Model integration
Claude API implementation a plus
Python and NLP frameworks (spaCy, Hugging Face, NLTK)
Senior Data Analytics Engineer
Columbus, OH jobs
We are seeking a highly skilled Analytics Data Engineer with deep expertise in building scalable data solutions on the AWS platform. The ideal candidate is a 10/10 expert in Python and PySpark, with strong working knowledge of SQL. This engineer will play a critical role in translating business and end-user needs into robust analytics products spanning ingestion, transformation, curation, and enablement for downstream reporting and visualization.
You will work closely with both business stakeholders and IT teams to design, develop, and deploy advanced data pipelines and analytical capabilities that power enterprise decision-making.
Key Responsibilities
Data Engineering & Pipeline Development
Design, develop, and optimize scalable data ingestion pipelines using Python, PySpark, and AWS native services.
Build end-to-end solutions to move large-scale data from source systems into AWS environments (e.g., S3, Redshift, DynamoDB, RDS).
Develop and maintain robust data transformation and curation processes to support analytics, dashboards, and business intelligence tools.
Implement best practices for data quality, validation, auditing, and error-handling within pipelines.
Analytics Solution Design
Collaborate with business users to understand analytical needs and translate them into technical specifications, data models, and solution architectures.
Build curated datasets optimized for reporting, visualization, machine learning, and self-service analytics.
Contribute to solution design for analytics products leveraging AWS services such as AWS Glue, Lambda, EMR, Athena, Step Functions, Redshift, Kinesis, Lake Formation, etc.
Cross-Functional Collaboration
Work with IT and business partners to define requirements, architecture, and KPIs for analytical solutions.
Participate in Daily Scrum meetings, code reviews, and architecture discussions to ensure alignment with enterprise data strategy and coding standards.
Provide mentorship and guidance to junior engineers and analysts as needed.
Engineering (Supporting Skills)
Employ strong skills in Python, PySpark, and SQL to support data engineering tasks, broader system integration requirements, and application layer needs.
Implement scripts, utilities, and micro-services as needed to support analytics workloads.
Required Qualifications
5+ years of professional experience in data engineering, analytics engineering, or full-stack data development roles.
Expert-level proficiency (10/10) in:
Python
PySpark
Strong working knowledge of:
SQL and other programming languages
Demonstrated experience designing and delivering big-data ingestion and transformation solutions through AWS.
Hands-on experience with AWS services such as Glue, EMR, Lambda, Redshift, S3, Kinesis, CloudFormation, IAM, etc.
Strong understanding of data warehousing, ETL/ELT, distributed computing, and data modeling.
Ability to partner effectively with business stakeholders and translate requirements into technical solutions.
Strong problem-solving skills and the ability to work independently in a fast-paced environment.
Preferred Qualifications
Experience with BI/Visualization tools such as Tableau
Experience building CI/CD pipelines for data products (e.g., Jenkins, GitHub Actions).
Familiarity with machine learning workflows or MLOps frameworks.
Knowledge of metadata management, data governance, and data lineage tools.
Salesforce Developer- Hybrid
Audubon, PA jobs
Salesforce Developer is needed for a contract opportunity with our client located in Pennsylvania.
Job Details
Duration: 24-month contract
Hourly pay rate: $52.82-$59.86
**Client is not able to do sponsorship of visa or C2C**
Key Responsibilities
• Design and implement scalable, high-performance Salesforce solutions aligned with business goals and industry standards.
• Collaborate across teams to troubleshoot and resolve integration and development issues within the Salesforce environment.
• Identify and implement platform improvements, new features and bug fixes to continuously improve system performance.
• Communicate regularly with technical stakeholders to provide updates, resolve challenges, and contribute innovative ideas.
Required skills and Experience
• Bachelor's degree in Computer Science, Information Technology, or a related field or equivalent hands-on experience
• 5+ years of proven experience in Salesforce development, including work with data visualizations, integrations, and complex technical requirements.
• Hands-on experience with Salesforce Service Cloud and Salesforce Experience sites
• Strong command of development in Aura, LWC, Flows, JSON, XML, JavaScript, and SOQL.
• Familiar with Agile methodologies and tools like Jira or equivalent platforms.
• Knowledge of Salesforce security standards and data compliance processes
• Proactive problem solver with strong analytical abilities
• Clear and concise communicator, comfortable translating complex ideas to diverse audiences.
• Preference given to candidates with experience in Account Engagement (Pardot) functionality.
Estimated Min Rate: $52.82
Estimated Max Rate: $59.86
What's In It for You?
We welcome you to be a part of one of the largest and most legendary global staffing companies and to meet your career aspirations. Yoh's network of client companies has been employing professionals like you for over 65 years in the U.S., UK, and Canada. Join Yoh's extensive talent community to gain access to Yoh's vast network of opportunities, including this exclusive one. Benefit eligibility is in accordance with applicable laws and client requirements. Benefits include:
Medical, Prescription, Dental & Vision Benefits (for employees working 20+ hours per week)
Health Savings Account (HSA) (for employees working 20+ hours per week)
Life & Disability Insurance (for employees working 20+ hours per week)
MetLife Voluntary Benefits
Employee Assistance Program (EAP)
401K Retirement Savings Plan
Direct Deposit & weekly epayroll
Referral Bonus Programs
Certification and training opportunities
Note: Any pay ranges displayed are estimations. Actual pay is determined by an applicant's experience, technical expertise, and other qualifications as listed in the job description. All qualified applicants are welcome to apply.
Yoh, a Day & Zimmermann company, is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Visit ************************************************ to contact us if you are an individual with a disability and require accommodation in the application process.
For California applicants, qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. All of the material job duties described in this posting are job duties for which a criminal history may have a direct, adverse, and negative relationship potentially resulting in the withdrawal of a conditional offer of employment.
It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
By applying and submitting your resume, you authorize Yoh to review and reformat your resume to meet Yoh's hiring clients' preferences. To learn more about Yoh's privacy practices, please see our Candidate Privacy Notice: **********************************
Android Developer
Columbus, OH jobs
Hello,
My name is Pradeep Bhondve, and I work as a Technical Recruiter for K-Tek Resourcing.
We are searching for Professionals below business requirements for one of our clients. Please read through the requirements and connect with us in case it suits your profile.
Please see the job description below, and if you feel interested, then send me your updated resume at ********************************** or give me a call at *************.
LinkedIn Profile: ******************************************************
Job Title: Android Developer
Location: Columbus, OH (relocation acceptable); will work from EST and CST time zones
Duration: Long Term
Job Description -
Skill Set: Kotlin, Android SDK, Jetpack Compose, Dagger, Coroutines, Junit, MVVM
Android Developer
Columbus, OH jobs
Role: Android Developer
Type: Long term contract
Exp: 8+ Years
Rate: Up to $70/hr
Note: PP number is mandatory
Job Description:
Mandatory Skills: Android, Java, Kotlin, UI Development, MVVM, Expert in Coding.
8+ years of combined software/application development experience in Java, Android SDK, Kotlin, UI Development
Experience in developing, deploying, and/or supporting an enterprise size solution
Experience with all phases of the development life cycle
Experience with the following is desired:
Familiarity with Agile development including daily scrum and weekly iteration reviews and planning
Enthusiasm for automated testing
Experience with unit testing frameworks
Experience with source control management.
Workday Application Developer (Expert level)
Towson, MD jobs
***Must be on VLS W2***
Hybrid: A work-from-home schedule will be established with the hiring manager once onboarding and equipment issuance are completed. This individual may work remotely up to two (2) days per week, equivalent to approximately 40% remote and 60% onsite. Fully remote or out-of-state employment is not authorized at this time. Candidates must be in MD or willing to relocate, at their own expense, to the Baltimore/DC metro area after being selected.
Background:
Client is seeking a Workday Application Developer to assist with the implementation and ongoing support of Workday HCM and Financial systems.
Job Description:
Development and Configuration:
• Design, build, test, and deploy Workday integrations using tools such as EIB, Studio, Core Connectors, and Web Services (SOAP/REST APIs)
• Configure and maintain Workday business processes, calculated fields, custom validations, and condition rules
• Develop and maintain Workday custom reports
• Troubleshoot and resolve system or integration issues
Integration Management:
• Create and maintain integrations between Workday and third-party systems (HR, Payroll, Benefits, Finance, etc.)
• Monitor scheduled integrations and address failures or data inconsistencies
• Document integration design, configuration, and data flow processes
• Optimize existing integrations to improve performance and reliability
Reporting & Analytics:
• Develop and maintain Workday reports and dashboards to support business and compliance needs
• Work with users to understand reporting requirements and translate them into Workday solutions
• Automate report scheduling and distribution for Users
Support and Collaboration:
• Provide technical support for agency/end users
• Maintain system documentation, including design specs, workflows, and configuration guides
• Collaborate with HR, Payroll, Finance, and IT teams to align Workday functionality with business needs
• Participate in change control and configuration management processes
• Gather/document business requirements for change requests (break/fix)
Minimum Qualifications/Skill Sets
• Graduation from an accredited college or university with a bachelor's degree in business or computer science, plus at least two years' experience in the administration or support of enterprise systems such as web services, Workday HCM, Payroll, Financials, and Reporting. Additional experience may be substituted on a year-for-year basis, up to a maximum of four years, for the required education.
Preferred Qualifications/Skill Sets
• Strong analytical and problem-solving skills
• Outstanding business process fluency, with a strong ability to discuss processes at a level of detail sufficient to gain insight into the underlying business problem or opportunity
• Ability to work in a team environment; establishing and maintaining strong professional relationships
• Proven ability and success with HCM, Payroll and Financial systems
• Maintain the security and confidentiality of any proprietary or sensitive data in any medium
• Demonstrated Workday experience
• Demonstrated experience in Workday implementation or post-production environment as a primary technical resource
• Workday certified training or commensurate experience
Security Requirements (if applicable)
• Security badges will be issued by the County and must be worn at all times when the candidate is in County facilities.
• The selected candidate must pass a comprehensive security background check by the Baltimore County Police Department in order to be hired into employment
Application Developer
Newark, OH jobs
Manifest Solutions is currently seeking an Application Developer for a hybrid position in Newark, OH.
- MUST be within commuting distance of Newark, OH.
NO C2C; NO 3rd Parties - Applicants ONLY
Decommission and move apps to ServiceNow
Re-writing apps in .Net
Designs, codes, tests, documents, releases, and supports custom software applications to meet specific technology needs.
Gather and analyze functional business/user requirements.
Define the technical requirements for the custom software by analyzing systems and processes-including their dependencies and interactions-in the context of the business' technology need(s).
Prepare Scope of Work documents that describe in detail the components that will be developed and the methods that will be used, including written requirements and time estimates; entity relationship, user flow, and data flow diagrams; permissions and roles matrices; and other applicable design artifacts.
Create prototypes that enable business users to verify that functionality will meet the specific business technology need(s).
Develop software solution(s) in accordance with the business and technical requirements by writing and implementing source code.
Test and debug source code.
Develop and maintain technical documentation that represents the current state design and code of custom software applications.
Develop and maintain user guides that describe the features and functionality of custom software applications.
Periodically evaluate existing custom software applications to assess code quality and functional integrity over time.
Update existing custom software applications as necessary to fix errors, adapt to new hardware, improve performance, refactor code, enhance interfaces and/or implement new features/functionality.
Qualifications
2-4 years of applied specialized experience with the following: Angular 4+, .NET, C# 7+, SQL, VS Code/Visual Studio, GitLab, Postman, API Design and Development
Formal education in software development, web development, or similar focus or equivalent professional experience
Familiar with and able to adhere to a formal SDLC program
Understands and can implement Secure Coding Practices (e.g. OWASP Top 10)
Experience doing containerized development
Cloud Development experience (Azure, AWS)
Familiarity with the following types of testing: User Testing, Integration Testing, Isolation Testing, Load Testing, End to End Testing, Vulnerability Testing
SAP Development Lead
Cincinnati, OH jobs
Job Title: SAP Development Lead - Commerce Manage & Pay
Compensation: $125,000 - $145,000
Who We Are:
Vernovis is a Total Talent Solutions company that specializes in Technology, Cybersecurity, Finance & Accounting functions. At Vernovis, we help professionals achieve their career goals by matching them with innovative projects and dynamic direct hire opportunities in Ohio and across the Midwest.
What You'll Do:
Oversee and work on major projects spanning a broad range of systems.
Provide subject matter expertise and technical direction.
Collaborate with customers and team members to understand business requirements that drive analysis and design of technical solutions.
Ensure solutions align with business and IT strategies and comply with organizational architectural standards.
Perform application analysis, design, development, integration, and enhancement work.
Address and resolve complex support issues.
Establish Hybris application development standards and solutions.
Assist in developing processes and procedures for software development and delivery.
Execute development changes according to defined methodologies and document efforts per business and change management requirements.
Provide technical recommendations for design and architecture improvements.
Build and maintain strong working relationships with IT and business teams.
What Experience You'll Have:
Required:
Bachelor's degree (or foreign equivalent) in a Business or Technical-related field.
Minimum 8 years of experience in each of the following:
Hybris/eCommerce, including configuring and customizing the Hybris application.
Java Enterprise frameworks or similar, with proven ability to understand class structures and coding frameworks.
Minimum 5 years of experience in each of the following:
SAP/Hybris techno-functional experience implementing eCommerce and integrating with fulfillment and customer management systems.
Establishing and maintaining SAP/Hybris development practices and procedures.
Solution design and architecture of large, complex websites or web applications (web content management or eCommerce).
Creating transports and tasks for change management.
SAP experience in a Fortune 1000 or similar enterprise, including requirements gathering and customization.
Using an Integrated Development Environment (IDE); ability to set up a local development environment, utilize GIT tools, install and configure the JDK, and register it for use with Eclipse or similar IDEs.
Minimum 3 years of experience providing technical and functional leadership, coaching, and mentoring team members, with success leading enterprise-wide software development projects.
Proven ability to collaborate with project teams and translate requirements into design and solutions.
Minimum 1 year of experience communicating technical and business issues/solutions to all levels of management.
Experience with at least one full life cycle implementation in a functional/technical area.
Must Have Experience In:
Implementing enterprise B2B and B2C eCommerce solutions.
SAML 2.0 integrations.
Credit card processing solution implementations.
JDBC, EJB3, and Hibernate.
Oracle SQL, MS SQL, or similar SQL databases.
SOAP Web Services.
Data Engineer, Enterprise Data, Analytics and Innovation
Remote
at Vaniam Group
Data Engineer, Enterprise Data, Analytics and Innovation, Digital Innovation
What You'll Do
Are you passionate about building robust data infrastructure and enabling innovation through engineering excellence? As our Data Engineer, your goal is to own and evolve the foundation of our data infrastructure. You will be central in ensuring data reliability, scalability, and accessibility across our lakehouse and transactional systems. This role is ideal for someone who thrives at the intersection of engineering and innovation, ensuring our data platforms are robust today while enabling the products of tomorrow.
A Day in the Life
Lakehouse and Pipelines
Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
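The Bronze/Silver/Gold flow described above can be sketched in miniature. This is an illustrative stand-in, not Vaniam's actual pipeline: the field names (`id`, `amount`) and quality rules are hypothetical, and a real lakehouse would implement each layer as managed tables rather than in-memory lists.

```python
from datetime import datetime, timezone

def to_bronze(raw_rows):
    """Bronze: land raw records as-is, stamped with ingestion metadata."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{**row, "_ingested_at": ts} for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: enforce schema and quality rules -- reject keyless rows,
    normalize types, and deduplicate on the business key."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("id")
        if key is None or key in seen:  # quality check + dedupe
            continue
        seen.add(key)
        silver.append({"id": str(key), "amount": float(row.get("amount", 0))})
    return silver

def to_gold(silver_rows):
    """Gold: a curated, analytics-ready aggregate for serving."""
    return {"row_count": len(silver_rows),
            "total_amount": sum(r["amount"] for r in silver_rows)}

raw = [{"id": 1, "amount": "10.5"}, {"id": 1, "amount": "10.5"}, {"amount": "3"}]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'row_count': 1, 'total_amount': 10.5}
```

The key property this illustrates: raw data is never mutated in Bronze, so Silver and Gold can always be rebuilt from it when quality rules change.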
Data Modeling and Governance
Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
Integration of New Data Sources
Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
Partner with product and innovation teams to build repeatable processes for onboarding new data streams
Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
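Harmonizing varied sources onto a shared schema, as described above, often reduces to per-source field mappings plus normalization rules. The sketch below is hypothetical: the source names (`crm`, `webinar`) and field maps are invented for illustration, and a production onboarding process would drive mappings from metadata rather than hard-coding them.

```python
import json

# Hypothetical field maps for two source systems.
FIELD_MAPS = {
    "crm":     {"contact_email": "email", "full_name": "name"},
    "webinar": {"attendee_email": "email", "display_name": "name"},
}

def harmonize(source, payload_json):
    """Map a source-specific JSON payload onto the shared schema."""
    record = json.loads(payload_json)
    mapping = FIELD_MAPS[source]
    out = {canonical: record.get(src) for src, canonical in mapping.items()}
    out["email"] = (out.get("email") or "").strip().lower()  # normalization rule
    out["source"] = source                                   # lineage tag
    return out

a = harmonize("crm", '{"contact_email": "A@X.COM", "full_name": "Ada"}')
b = harmonize("webinar", '{"attendee_email": "a@x.com", "display_name": "Ada"}')
assert a["email"] == b["email"] == "a@x.com"
```

Tagging each harmonized record with its source keeps lineage intact, so governance questions ("where did this email come from?") stay answerable downstream.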
Analytics and Predictive Tools
Collaborate with the innovation team to prototype and productionize analytics, predictive features, and decision-support tools
Support dashboards, APIs, and services that activate insights for internal stakeholders and clients
Work closely with Data Science and AI colleagues to ensure engineered pipelines meet modeling and deployment requirements
Reliability and Optimization
Monitor job execution, storage, and cluster performance, ensuring cost efficiency and uptime
Troubleshoot and resolve data issues, proactively addressing bottlenecks
Conduct code reviews, enforce standards, and contribute to CI/CD practices for data pipelines
What You Must Have
Education and Experience
5+ years of professional experience in data engineering, ETL, or related roles
Strong proficiency in Python and SQL for data engineering
Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
Practical understanding of Medallion architectures and layered data design
Skills and Competencies
Familiarity with modern data stack tools, including:
Spark or PySpark
Workflow orchestration (Airflow, dbt, or similar)
Testing and observability frameworks
Containers (Docker) and Git-based version control
Excellent communication skills, problem-solving mindset, and a collaborative approach
What You Might Have, but Isn't Required
Experience with Databricks and the Microsoft Azure ecosystem
Expertise with Delta Lake formats, metadata management, and data catalogs
Familiarity with healthcare, scientific, or engagement data domains
Experience exposing analytics through APIs or lightweight microservices
The Team You'll Work Closest With
You will collaborate closely with the innovation team to prototype and productionize analytics solutions. Your main contacts will be Data Science and AI colleagues, product and innovation leaders, and internal stakeholders who rely on data-driven insights. You will work remotely with flexibility, growth opportunities, and the ability to influence how data shapes the future of medical communications, helping to turn raw data into client-ready insights that enable measurable healthcare impact.
Why You'll Love Us:
100% remote environment with opportunities for local meet-ups
Positive, diverse, and supportive culture
Passionate about serving clients focused on Cancer and Blood diseases
Investment in you with opportunities for professional growth and personal development through Vaniam Group University
Health benefits - medical, dental, vision
Generous parental leave benefit
Focused on your financial future with a 401(k) Plan and company match
Work-Life Balance and Flexibility
Flexible Time Off policy for rest and relaxation
Volunteer Time Off for community involvement
Emphasis on Personal Wellness
Virtual workout classes
Discounts on tickets, events, hotels, child care, groceries, etc.
Employee Assistance Programs
Salary offers are based upon several factors including experience, education, skills, training, demonstrated qualifications, location, and organizational need. The range for this role is $110,000 - $125,000. Salary is one component of the total earnings and rewards package offered.
About Us: Vaniam Group is a people-first, purpose-driven, independent network of healthcare and scientific communications agencies committed to helping biopharmaceutical companies realize the full potential of their compounds in the oncology and hematology marketplace. Founded in 2007 as a virtual-by-design organization, Vaniam Group harnesses the talents and expertise of team members around the world. For more information, visit ********************
Applicants have rights under federal employment laws to the following resources:
Family & Medical Leave Act (FMLA) poster - *********************************************
EEOC Know Your Rights poster - ***************************
Employee Polygraph Protection Act (EPPA) poster - *************************************************************************
Staff Data Engineer
Remote
At NerdWallet, we're on a mission to bring clarity to all of life's financial decisions and every great mission needs a team of exceptional Nerds. We've built an inclusive, flexible, and candid culture where you're empowered to grow, take smart risks, and be unapologetically yourself (cape optional). Whether remote or in-office, we support how you thrive best. We invest in your well-being, development, and ability to make an impact because when one Nerd levels up, we all do.
Data engineers are the builders behind the insights that drive smarter decisions. They design and scale reliable data pipelines and models that power analytics, experimentation, and strategic decision-making across the company. As a Staff Data Engineer, you'll tackle complex, cross-functional data challenges-partnering closely with stakeholders across product, engineering, and business teams. You'll combine strong technical expertise with clear communication and thoughtful collaboration to ensure our data systems are not only technically sound but also deeply aligned with NerdWallet's strategic goals.
As part of our embedded data model, you'll work directly within a product vertical-shaping the data that drives business decisions, product innovation, and user experiences. This is a unique opportunity to see your work translate into real-world outcomes, accelerating NerdWallet's mission through data that's closer than ever to the business.
You'll design, develop, and maintain data systems and pipelines that serve as the foundation for analytics and product innovation in a fast-paced, ever-evolving environment. The right candidate thrives in ambiguity-comfortable toggling between projects, adapting to shifting priorities, and leading through change. You'll elevate the team's impact by leveraging both your technical depth and your ability to influence, mentor, and foster a culture of innovation, reliability, and continuous improvement.
This role sits within Core Engineering and reports to a Senior Manager of Data Engineering. You'll join a passionate team of Nerds who believe clean, scalable data is at the heart of helping consumers make smarter financial decisions.
Where you can make an impact:
Lead the design, development, and maintenance of business-critical data assets, ensuring they are accurate, reliable, and aligned with evolving business priorities
Drive technical innovation and process excellence, evaluating emerging technologies and implementing scalable, efficient solutions that improve data pipeline performance and reliability
Tackle complex technical challenges - balancing scalability, security, and performance - while providing clear rationale for architectural decisions and aligning outcomes across teams
Ensure data pipeline reliability and observability, proactively identifying and resolving issues, investigating anomalies, and improving monitoring to safeguard data integrity
Build trust and alignment across cross-functional teams through transparent communication, collaborative problem-solving, and a deep understanding of partner needs
Bring clarity and direction to ambiguity, taking ownership of initiatives that span multiple domains or teams, and providing technical leadership to ensure successful delivery
Prioritize work strategically, balancing business impact, risk, and execution to drive measurable outcomes that support organizational goals
Act as a trusted technical advisor and thought leader, shaping the team's long-term architecture and influencing best practices
Foster a culture of technical excellence and continuous learning, mentoring engineers and championing modern data engineering practices, including AI and automation-enabled solutions
Your experience:
7+ years of relevant professional experience in data engineering
5+ years of experience with AWS, Snowflake, dbt, and Airflow
Advanced level of proficiency in Python and SQL
Working knowledge of relational databases and query performance tuning (SQL)
Working knowledge of streaming technologies such as Storm, Kafka, Kinesis, and Flume
Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent professional experience)
Advanced level of proficiency applying principles of logical thinking to define problems, collect data, establish facts, and draw valid conclusions
Experience designing, building, and operating robust data systems with reliable monitoring and logging practices
Strong communication skills, both written and verbal, with the ability to articulate information to team members of all levels and various amounts of applicable knowledge throughout the organization
Where:
This role will be remote (based in the U.S.).
We believe great work can be done anywhere. No matter where you are based, NerdWallet offers benefits and perks to support the physical, financial, and emotional well-being of you and your family.
What we offer:
Work Hard, Stay Balanced (Life's a series of balancing acts, eh?)
Industry-leading medical, dental, and vision health care plans for employees and their dependents
Rejuvenation Policy - Flexible Vacation Time Off + 11 holidays + holiday company shutdown
New Parent Leave for employees with a newborn child or a child placed with them for adoption or foster care
Mental health support
Paid sabbatical after 5 years for Nerds to recharge, gain knowledge, and pursue their interests
Health and Dependent Care FSA and HSA Plan with monthly NerdWallet contribution
Monthly Wellness Stipend, Cell Phone Stipend, and Wifi Stipend (Only remote Nerds are eligible for the Wifi Stipend)
Work from home equipment stipend and co-working space subsidy (Only remote Nerds are eligible for these stipends)
Have Some Fun! (Nerds are fun, too)
Nerd-led group initiatives - Employee Resource Groups for Parents, Diversity, and Inclusion, Women, LGBTQIA, and other communities
Hackathons and team events across all teams and departments
Company-wide events like NerdLove (employee appreciation) and our annual Charity Auction
Our Nerds love to make an impact by paying it forward - Take 8 hours of volunteer time off per quarter and donate to your favorite causes with a company match
Plan for your future (And when you retire on your island, remember the little people)
401K with 4% company match
Be the first to test and benefit from our new financial products and tools
Financial wellness, guidance, and unlimited access to a Certified Financial Planner (CFP) through Northstar
Disability and Life Insurance with employer-paid premiums
If you are based in California, we encourage you to read this important information for California residents linked here.
NerdWallet is committed to pursuing and hiring a diverse workforce and is proud to be an equal opportunity employer. We prohibit discrimination and harassment on the basis of any characteristic protected by applicable federal, state, or local law, so all qualified applicants will receive consideration for employment.
NerdWallet will consider qualified applicants with a criminal history pursuant to the California Fair Chance Act and the San Francisco Fair Chance Act, which requires this notice, as well as the Los Angeles Fair Chance Act, which requires this notice.
NerdWallet participates in the Department of Homeland Security U.S. Citizenship and Immigration Services E-Verify program for all US locations. For more information, please see:
E-Verify Participation Poster (English+Spanish/Español)
Right to Work Poster (English) / (Spanish/Español)
Data Engineer (AI Enablement)
Remote
THE JOB / Data Engineer (AI Enablement)
STRATEGY / Responsible for building and operating the data foundations that power Octagon's AI solutions and enterprise search.
Our headquarters are in Stamford, CT, but the location of this position can be 100% remote for qualified candidates.
You're a systems-minded builder who turns messy, multi-source data into reliable, searchable, and governed knowledge. Your mission is to stand up the pipelines, vector search, and metadata standards that make AI tools accurate, fast, and safe. You'll partner closely with the Solutions Engineer (peer role) to take prototypes and ship durable infrastructure-ingestion, embeddings, indexing, and APIs-so teams can find and use what they need. You'll report to the Director, Data Strategy and work across departments to reduce manual effort, improve data quality, and enable AI-powered workflows at scale.
THE WORK YOU'LL DO
* Data foundations: Design and operate the vector database/search layer (e.g., FAISS/pgvector/Milvus) and document-chunking/embedding pipelines that make Octagon's content discoverable and auditable.
* Scalable pipelines for AI/ML/LLM: Implement and maintain ELT/ETL to support downstream workflows such as data labeling, classification, and document parsing; build robust validations, lineage, and observability.
* Retrieval APIs: Expose governed retrieval endpoints that respect permissions (ACLs), support metadata filters, and return source snippets/IDs for grounding and citations.
* Data structuring & manipulation: Normalize, transform, and move JSON and other structured payloads cleanly through workflows to ensure reliable handoffs and automation outputs.
* Align & collaborate: Align product peers, design, data science, engineering, and commercial teams around a unified roadmap and shared data contracts.
* Operationalize prototypes: Take MVPs from the Solutions Engineer and productionize with CI/CD, telemetry, cost/usage guardrails, and pilot → rollout gating.
* Reliability & security: Build monitoring (freshness, re-index SLAs, retrieval quality), secrets management, access controls, and audit logging aligned with enterprise governance.
* Flexibility and willingness to travel and work weekends or holidays as needed. Anticipated travel level: Low (0-15%).
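The governed retrieval pattern in the bullets above can be sketched as follows. Everything here is an illustrative stand-in: the bag-of-characters embedding substitutes for a real embedding model, the in-memory list substitutes for FAISS/pgvector/Milvus, and the chunk IDs and group names are invented.

```python
import math

def embed(text):
    """Toy bag-of-characters embedding standing in for a real model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Indexed chunks carry a source ID (for grounding/citations) and an ACL.
INDEX = [
    {"id": "doc1#c0", "text": "quarterly sponsorship recap", "acl": {"sales"}},
    {"id": "doc2#c0", "text": "sponsorship contract terms",   "acl": {"legal"}},
]

def retrieve(query, user_groups, k=1):
    """Permission-filtered similarity search returning source snippets/IDs."""
    q = embed(query)
    allowed = [c for c in INDEX if c["acl"] & user_groups]  # enforce ACLs first
    ranked = sorted(allowed, key=lambda c: cosine(q, embed(c["text"])),
                    reverse=True)
    return [(c["id"], c["text"]) for c in ranked[:k]]

print(retrieve("sponsorship recap", {"sales"}))
# [('doc1#c0', 'quarterly sponsorship recap')]
```

The design point worth noting: permissions are applied before ranking, so a caller can never infer the existence of content their groups cannot see.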
THE BIGGER TEAM YOU'LL JOIN
Recognized as one of the "Best Places to Work in Sports", Octagon is the global sports, entertainment, and experiential marketing arm of the Interpublic Group.
We take pride in being Playmakers - finding insightful, bold ways to create play in our work, our lives, and in the world. We believe in the power of play to create big ideas and unlock potential for our clients and talent.
We can put ourselves in the shoes of fans because we ARE fans - of sports, entertainment, and culture at large. This expertise allows us to continually evolve the fan experience across sports and entertainment alongside some of the biggest brands and talent in the world.
The world needs play more than ever. Are you a Playmaker?
WHO WE'RE LOOKING FOR
* 3+ years (or equivalent portfolio) building data systems: data modeling, ELT/ETL, Python + SQL; experience with cloud object storage and relational databases.
* Hands-on with embeddings and vector databases (e.g., FAISS/pgvector/Milvus) and document processing pipelines for RAG-style retrieval.
* Scalable pipeline experience supporting AI/ML/LLM use cases (labeling, classification, doc parsing) and partnering closely with Data Science and Data Labeling teams.
* Data structuring & manipulation expertise: cleanly normalizing and transforming JSON/Parquet/CSV payloads; designing resilient data contracts and schemas.
* Orchestration/ops: Airflow/Prefect (or similar), CI/CD, structured logging/monitoring, cost/usage guardrails; secure secrets management.
* Strong collaboration and communication skills; proven ability to align product/design/engineering/commercial stakeholders around a unified roadmap.
Nice-To-Haves
* Enterprise connectors and productivity stacks (e.g., Microsoft 365/SharePoint/Teams/Graph, Copilot or Copilot Studio/Power Automate; Google Workspace; Salesforce; DAMs).
* Experience implementing LLM inference patterns, similarity search, guardrails, and memory; familiarity with agent frameworks or custom orchestration.
* Additional languages for systems work (e.g., C++, C#, Java, or Go).
* Containers (Docker), GitHub Actions, IaC; lightweight internal UIs (Streamlit or R Shiny) to expose services.
* Familiarity with marketing/media-measurement datasets and associated normalization/quality checks.
The base range for this position is $90,000 - $100,000. Where an employee or prospective employee is paid within this range will depend on, among other factors, actual ranges for current/former employees in the subject position; market considerations; budgetary considerations; tenure and standing with the company (applicable to current employees); as well as the employee's/applicant's background, pertinent experience, and qualifications.
Octagon's comprehensive benefits package includes:
* Unlimited PTO policy - we understand you need time for play!
* Competitive medical/dental/vision insurance plans with FSA/HSA and Dependent Care FSA options. Pet Insurance for those who need it too!
* Generous Family and Parental Leave Policy (12 weeks) with eligibility extended to all parents regardless of gender or primary/secondary caregiver status
* Access to our parent company (IPG) Savings plan (401K program) with company match as well as an Employee Stock Purchase Plan (ESPP)
* Pretax Transportation/Commuter Benefits and Parent Travel Program
* Dedicated Mental Health resources including Headspace membership, Employee Assistance Program (CCA) and more
* Discount portal for everyday goods and services
* Employee Resource Groups and inclusive diversity programming and initiatives
* Personal Development programs
We make our careers website accessible to any and all users. If you need an accommodation to participate in the application process, please contact us at [email protected]. This email address is not for general employment inquiries or vendors; rather it is strictly for applicants who require special assistance accessing our employment website. Due to volume, messages sent to this email address that are not related to an accommodation cannot be answered.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, age, disability, gender identity, marital or veteran status, or any other protected class.
Senior Data Engineer
New York, NY jobs
Job Description
Senior Data Engineer $150k - $170k
We are a leading cloud-based mobile patient intake and registration system. Our platform allows patients to complete their paperwork from the comfort of their own homes using their smartphones or computers, minimizing in-person contact and streamlining the check-in process. With our fully customizable patient scheduling, intake, and payment platform, you can maximize efficiency and minimize waiting room activity. Our patient engagement solution also syncs seamlessly with your EMR system to keep records updated in real-time.
Role Description
This is a full-time position for a Senior Data Engineer. As a Senior Data Engineer, you will be responsible for the day-to-day tasks associated with data engineering, including data modeling, ETL (Extract Transform Load), data warehousing, and data analytics. This is a hybrid role, with the majority of work located in the New York City office but with flexibility for some remote work.
Qualifications
Data Engineering, Data Modeling, and ETL (Extract Transform Load) skills
Data Warehousing and Data Analytics skills
Experience with cloud-based data solutions
Strong problem-solving and analytical skills
Proficiency in programming languages such as Python, PySpark or Java
Experience with SQL and database management systems
Knowledge of healthcare data requirements and regulations is a plus
Bachelor's degree in Computer Science, Engineering, or related field
If interested, please send your resume to: rick@ingenium.agency
Sr Engineer Data Engineering - US Based Remote
Remote
Senior Data Engineer
At Anywhere, we work to build and improve a platform that helps real estate professionals work effectively and helps delight home buyers and sellers with an excellent experience. We do that by combining great technology with great people - and we're looking for a Senior Data Engineer to join our team.
What we're looking for:
You're a talented, creative, and motivated engineer who loves developing powerful, stable, and intuitive apps - and you're excited to work with a team of individuals with that same passion. You've accumulated years of experience, and you're excited about taking your mastery of Big Data and Java to a new level. You enjoy challenging projects involving big data sets and are cool under pressure. You're no stranger to fast-paced environments and agile development methodologies - in fact, you embrace them. With your strong analytical skills, your unwavering commitment to quality, your excellent technical skills, and your collaborative work ethic, you'll do great things here at Anywhere.
What you'll do:
As a Senior Data Engineer, you'll be responsible for building high performance, scalable data solutions that meet the needs of millions of agents, brokers, home buyers, and sellers. You'll design, develop, and test robust, scalable data platform components. You'll work with a variety of teams and individuals, including product engineers to understand their data pipeline needs and come up with innovative solutions. You'll work with a team of talented engineers and collaborate with product managers and designers to help define new data products and features.
Skills, accomplishments, interests you should have:
BS/MS in Computer Science, Engineering, or related technical discipline or equivalent combination of training and experience.
5+ years core Scala/Java experience: building business logic layers and high-volume/low latency/big data pipelines.
3+ years of experience in large scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar.
5+ years of experience on Data Pipeline development, ETL and processing of structured and unstructured data.
3+ years of experience with NoSQL systems such as MongoDB and DynamoDB, relational SQL databases (PostgreSQL), and Athena.
Experience with technologies like Lambda, API Gateway, AWS Fargate, ECS, CloudWatch, S3, DataDog.
Experience owning and implementing technical/data solutions or pipelines.
Excellent written and verbal communication skills in English.
Strong work ethic and entrepreneurial spirit.
Database Developer 1 (Remote)
Phoenix, AZ jobs
Prepares, defines, structures, develops, implements, and maintains database objects. Analyzes query performance, identifies bottlenecks, and implements optimization techniques. Defines and implements interfaces to ensure that various applications and user-installed or vendor-developed systems interact with the required database systems.
Creates database structures, writes and tests SQL queries, and optimizes database performance.
Plans and develops test data to validate new or modified database applications.
Works with business analysts and other stakeholders to understand requirements and integrate database solutions.
Builds and implements database systems that meet specific business requirements, ensuring data integrity and security, and troubleshoots and resolves database issues.
Designs and implements ETL pipelines to integrate data from various sources using SSIS.
Responsible for various SQL jobs.
Skills Required
Strong understanding of SQL and DBMS like MySQL, PostgreSQL, or Oracle.
Ability to design and model relational databases effectively.
Skills in writing and optimizing SQL queries for performance.
Ability to troubleshoot and resolve database-related issues.
Ability to communicate technical information clearly and concisely to both technical and non-technical audiences.
Ability to collaborate effectively with other developers and stakeholders.
Strong ETL experience specifically with SSIS.
Skills Preferred
Azure experience is a plus
.Net experience is a plus
GitHub experience is a plus
Experience Required
2 years of progressively responsible programming experience or an equivalent combination of training and experience.
Education Required
Bachelor's degree in Information Technology or Computer Science, or equivalent experience
Data Engineer (Hybrid)
Tampa, FL jobs
Job Title: Data Engineer
Workplace: Hybrid, 2-3 days per week onsite at MacDill AFB, FL
Clearance: Top Secret (TS)
Elder Research is seeking Data Engineers to support a U.S. national security client at MacDill AFB in Tampa, FL. In this mission-focused role, you will apply advanced data engineering techniques to enable intelligence analysts to uncover hidden patterns, enhance decision-making, and drive intelligence innovation in support of national security.
This hybrid role offers the opportunity to work at the cutting edge of analytics and defense, directly impacting military operations across the U.S. Intelligence and Defense community. Our team integrates expertise in data science, AI/ML, and intelligence operations to deliver data-driven solutions for the U.S. National Security enterprise. The work directly contributes to decision-making, mission readiness, and the ability of operators to succeed in a complex global battlespace.
Position Requirements:
Education: Bachelor's degree in a technical field (e.g., Engineering, Mathematics, Statistics, Physics, Computer Science, IT, or related discipline).
Clearance: active Top Secret (TS)
Years of Experience: 3+ years
Experience with:
Python, SQL, NoSQL, Cypher, PostgreSQL
SQLAlchemy, Swagger, Spark, Hadoop, Kafka, Hive, R
Apache Storm, Neo4j, MongoDB
Cloud platforms (AWS, Azure, GCP, or similar)
Ability to work independently as well as within cross-functional teams.
Strong communication, problem-solving, and critical-thinking skills.
Preferred Skills and Qualifications:
Active TS/SCI clearance.
Experience supporting the intelligence domain, particularly Intelligence, Surveillance, and Reconnaissance (ISR).
Previous work supporting Special Operations Forces (SOF) missions or U.S. national security customers.
Why apply to this position at Elder Research?
Competitive Salary and Benefits
Important Work - Make a Difference supporting U.S. national security.
Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
People-Focused Culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement.
Company Stock Ownership: all employees are provided with shares of the company each year based on company value and profits.
About Elder Research, Inc
People Centered. Data Driven.
Elder Research is a fast-growing consulting firm specializing in predictive analytics. Having been in the data mining business for almost 30 years, we pride ourselves on our ability to find creative, cutting-edge solutions to real-world problems. We work hard to provide the best value to our clients and allow each person to contribute their ideas and put their skills to use immediately.
Our team members are passionate, curious, lifelong learners. We value humility, servant leadership, teamwork, and integrity. We seek to serve our clients and our teammates to the best of our abilities. In keeping with our entrepreneurial spirit, we want candidates who are self-motivated, with an innate curiosity and strong teamwork skills.
Elder Research believes in continuous learning and community - each week the entire company attends a Tech Talk and each office location provides lunch. Elder Research provides a supportive work environment with established parental, bereavement, and PTO policies. By prioritizing a healthy work-life balance - with reasonable hours, solid pay, low travel, and extremely flexible time off - Elder Research enables and encourages its employees to serve others and enjoy their lives.
Elder Research, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Elder Research is a Government contractor and many of our positions require US Citizenship.
Senior Data Engineer
Blue Ash, OH jobs
Job Description
The Engineer is responsible for staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks. The Engineer has overall responsibility for the technical design process: leading and participating in the application technical design process and completing estimates and work plans for design, development, implementation, and rollout tasks. The Engineer also communicates with the appropriate teams to ensure that assignments are delivered with the highest quality and in accordance with standards. The Engineer strives to continuously improve the software delivery processes and practices, and serves as a role model for the company's core values of respect, honesty, integrity, diversity, inclusion, and safety of others.
Current tools and technologies include:
Databricks and Netezza
Key Responsibilities
Lead and participate in the design and implementation of large and/or architecturally significant applications.
Champion company standards and best practices. Work to continuously improve software delivery processes and practices.
Build partnerships across the application, business and infrastructure teams.
Set up the new customer data platform, migrating from Netezza to Databricks.
Complete estimates and work plans independently as appropriate for design, development, implementation and rollout tasks.
Communicate with the appropriate teams to ensure that assignments are managed appropriately and that completed assignments are of the highest quality.
Support and maintain applications utilizing required tools and technologies.
May direct the day-to-day work activities of other team members.
Must be able to perform the essential functions of this position with or without reasonable accommodation.
Work quickly with the team to implement the new platform.
Be onsite with development team when necessary.
Behaviors/Skills:
Puts the Customer First - Anticipates customer needs, champions for the customer, acts with customers in mind, exceeds customers' expectations, gains customers' trust and respect.
Communicates effectively and candidly - Communicates clearly and directly, is approachable, relates well to others, engages people and helps them understand change, provides and seeks feedback, articulates clearly, actively listens.
Achieves results through teamwork - Is open to diverse ideas, works inclusively and collaboratively, holds self and others accountable, involves others to accomplish individual and team goals.
Note to Vendors
Length of Contract: 9 months
Top skills: Databricks, Netezza
Soft Skills Needed: collaborating well with others, working in a team dynamic
Project person will be supporting: staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks that will replace Netezza
Team details (size, dynamics, locations): most of the team is located in Cincinnati, working onsite at the BTD
Work Location (in office, hybrid, remote): onsite at the BTD when necessary, approximately 2-3 days a week
Is travel required: No
Max Rate (if applicable): best market rate
Required Working Hours: 8-5 EST
Interview process and when will it start: starting with one interview; process may change
Prescreening Details: standard questions; scores will carry over
When do you want this person to start: looking to hire quickly; the team is looking to move fast