ETL/ELT Data Engineer (Secret Clearance) - Hybrid
Austin, TX jobs
LaunchCode is recruiting for a Software Data Engineer to work at one of our partner companies!
Details:
Full-Time W2, Salary
Immediate opening
Hybrid - Austin, TX (onsite 1-2 times a week)
Pay $85K-$120K
Minimum Experience: 4 years
Security Clearance: Active DoD Secret Clearance
Disclaimer: Please note that we are unable to provide work authorization or sponsorship for this role, now or in the future. Candidates requiring current or future sponsorship will not be considered.
Job description
Job Summary
Our client is a Washington, DC-based software solutions provider, founded in 2017, that specializes in delivering mission-critical and enterprise solutions to the federal government. Originating from the Department of Defense's software factory ecosystem, the company focuses on Command and Control, Cybersecurity, Space, Geospatial, and Modeling & Simulation. It leverages commercial technology to enhance the capabilities of the DoD, the IC, and their end users, with innovation driven by its innovation centers. The company has a presence in Boston, MA; Colorado Springs, CO; San Antonio, TX; and St. Louis, MO.
Why the company?
Environment of Autonomy
Innovative Commercial Approach
People over process
We are seeking a passionate Software Data Engineer to support the Army Software Factory (ASWF) in aligning with DoDM 8140.03 Cyber Workforce requirements and broader compliance mandates. ASWF, a first-of-its-kind initiative under Army Futures Command, is revolutionizing the Army's approach to software development by training and employing self-sustaining technical talent from across the military and civilian workforce. Guided by the motto “By Soldiers, For Soldiers,” ASWF equips service members to develop mission-critical software solutions independently, a capability that is especially vital for future contested environments where traditional technical support may be unavailable. The initiative also serves as a strategic prototype for modernizing legacy IT processes and building technical readiness across the force to ensure battlefield dominance in the digital age.
Required Skills:
Active DoD Secret Clearance (Required)
4+ years of experience in data science, data engineering, or similar roles.
Expertise in designing, building, and maintaining scalable ETL/ELT pipelines using tools and languages such as Python, SQL, Apache Spark, or Airflow.
Strong proficiency in working with relational and NoSQL databases, including experience with database design, optimization, and query performance tuning (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
Demonstrable experience with cloud data platforms and services (e.g., AWS Redshift, S3, Glue, Athena; Azure Data Lake, Data Factory, Synapse; Google BigQuery, Cloud Storage, Dataflow).
Solid understanding of data warehousing concepts (e.g., Kimball, Inmon methodologies) and experience with data modeling for analytical purposes.
Proficiency in at least one programming language commonly used in data engineering (e.g., Python, Java, Scala) for data manipulation, scripting, and automation.
CompTIA Security+ Certified or otherwise DoDM 8140.03 (formerly DoD 8570.01-M) compliant.
Nice to Have:
Familiarity with SBIR technologies and transformative platform shifts
Experience working in Agile or DevSecOps environments
2+ years of experience interfacing with Platform Engineers and data visibility teams, managing AWS resources, and performing GitLab administration
#LI-hybrid #austintx #ETLengineer #dataengineer #army #aswf #clearancejobs #clearedjobs #secretclearance #ETL
Data Modeler II
Houston, TX jobs
Job Title: Data Modeler II
Type: W2 Contract (USA)/INC or T4 (Canada)
Work Setup: Hybrid (On-site with flexibility to work from home two days per week)
Industry: Oil & Gas
Benefits: Health, Dental, Vision
Job Summary
We are seeking a Data Modeler II with a product-driven, innovative mindset to design and implement data solutions that deliver measurable business value for Supply Chain operations. This role combines technical expertise with project management responsibilities, requiring collaboration with IT teams to develop solutions for small and medium-sized business challenges. The ideal candidate will have hands-on experience with data transformation, AI integration, and ERP systems, while also being able to communicate technical concepts in clear, business-friendly language.
Key Responsibilities
Develop innovative data solutions leveraging knowledge of Supply Chain processes and oil & gas industry value drivers.
Design and optimize ETL pipelines for scalable, high-performance data processing.
Integrate solutions with enterprise data platforms and visualization tools.
Gather and clean data from ERP systems for analytics and reporting.
Utilize AI tools and prompt engineering to enhance data-driven solutions.
Collaborate with IT and business stakeholders to deliver solutions for medium- and low-complexity local issues.
Oversee project timelines, resources, and stakeholder engagement.
Document project objectives, requirements, and progress updates.
Translate technical language into clear, non-technical terms for business users.
Support continuous improvement and innovation in data engineering and analytics.
Basic / Required Qualifications
Bachelor's degree in Commerce (SCM), Data Science, Engineering, or related field.
Hands-on experience with:
Python for data transformation.
ETL tools (Power Automate, Power Apps; Databricks is a plus).
Oracle Cloud (Supply Chain and Financial modules).
Knowledge of ERP systems (Oracle Cloud required; SAP preferred).
Familiarity with AI integration and low-code development platforms.
Strong understanding of Supply Chain processes; oil & gas experience preferred.
Ability to manage projects and engage stakeholders effectively.
Excellent communication skills for translating technical concepts into business language.
Required Knowledge / Skills / Abilities
Advanced proficiency in data science concepts, including statistical analysis and machine learning.
Experience with prompt engineering and AI-driven solutions.
Ability to clean and transform data for analytics and reporting.
Strong documentation, troubleshooting, and analytical skills.
Business-focused mindset with technical expertise.
Ability to think outside the box and propose innovative solutions.
Special Job Characteristics
Hybrid work schedule (Wednesdays and Fridays remote).
Ability to work independently and oversee own projects.
Senior Data Engineer
Charlotte, NC jobs
**NO 3rd Party vendor candidates or sponsorship**
Role Title: Senior Data Engineer
Client: Global construction and development company
Employment Type: Contract
Duration: 1 year
Preferred Location: Remote based in ET or CT time zones
Role Description:
The Senior Data Engineer will play a pivotal role in designing, architecting, and optimizing cloud-native data integration and Lakehouse solutions on Azure, with a strong emphasis on Microsoft Fabric adoption, PySpark/Spark-based transformations, and orchestrated pipelines. This role will lead end-to-end data engineering, from ingestion through APIs and Azure services to curated Lakehouse/warehouse layers, while ensuring scalable, secure, well-governed, and well-documented data products. The ideal candidate is hands-on in delivery and brings data architecture knowledge to help shape patterns, standards, and solution designs.
Key Responsibilities
Design and implement end-to-end data pipelines and ELT/ETL workflows using Azure Data Factory (ADF), Synapse, and Microsoft Fabric.
Build and optimize PySpark/Spark transformations for large-scale processing, applying best practices for performance tuning (partitioning, joins, file sizing, incremental loads).
Develop and maintain API-heavy ingestion patterns, including REST/SOAP integrations, authentication/authorization handling, throttling, retries, and robust error handling.
Architect scalable ingestion, transformation, and serving solutions using Azure Data Lake / OneLake, Lakehouse patterns (Bronze/Silver/Gold), and data warehouse modeling practices.
Implement monitoring, logging, alerting, and operational runbooks for production pipelines; support incident triage and root-cause analysis.
Apply governance and security practices across the lifecycle, including access controls, data quality checks, lineage, and compliance requirements.
Write complex SQL, develop data models, and enable downstream consumption through analytics tools and curated datasets.
Drive engineering standards: reusable patterns, code reviews, documentation, source control, and CI/CD practices.
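The API-heavy ingestion responsibility above (pagination, retries, throttling) can be sketched in plain Python. Here `fetch_page` is a hypothetical stand-in for a real REST call, and the retry and backoff parameters are illustrative assumptions, not the client's actual integration framework:

```python
# Illustrative cursor-pagination + retry loop for REST ingestion.
# fetch_page(cursor) is a hypothetical stand-in for an HTTP call.
import time

class TransientError(Exception):
    """Stands in for HTTP 429/5xx responses."""

def ingest_all(fetch_page, max_retries=3, backoff_s=0.01):
    """Walk a cursor-paginated API, retrying each page on transient failures."""
    records, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break
            except TransientError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface the failure
                time.sleep(backoff_s * 2 ** attempt)  # exponential backoff
        records.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:  # last page reached
            return records
```

A production framework would add auth handling, rate-limit awareness from response headers, and checkpointing, but the page/retry loop is the core shape.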
Requirements:
Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering with strong focus on Azure Cloud.
Strong experience with Azure Data Factory pipelines, orchestration patterns, parameterization, and production support.
Strong hands-on experience with Synapse (pipelines, SQL pools and/or Spark), and modern cloud data platform patterns.
Advanced PySpark/Spark experience for complex transformations and performance optimization.
Heavy experience with API-based integrations (building ingestion frameworks, handling auth, pagination, retries, rate limits, and resiliency).
Strong knowledge of SQL and data warehousing concepts (dimensional modeling, incremental processing, data quality validation).
Strong understanding of cloud data architectures including Data Lake, Lakehouse, and Data Warehouse patterns.
Preferred Skills
Experience with Microsoft Fabric (Lakehouse/Warehouse/OneLake, Pipelines, Dataflows Gen2, notebooks).
Architecture experience (formal or informal), such as contributing to solution designs, reference architectures, integration standards, and platform governance.
Experience with DevOps/CI-CD for data engineering using Azure DevOps or GitHub (deployment patterns, code promotion, testing).
Experience with Power BI and semantic model considerations for Lakehouse/warehouse-backed reporting.
Familiarity with data catalog/governance tooling (e.g., Microsoft Purview).
Senior Data Engineer
Nashville, TN jobs
Concert is a software and managed services company that promotes health by providing the digital infrastructure for reliable and efficient management of laboratory testing and precision medicine. We are wholeheartedly dedicated to enhancing the transparency and efficiency of health care. Our customers include health plans, provider systems, laboratories, and other important stakeholders. We are a growing organization driven by smart, creative people to help advance precision medicine and health care. Learn more about us at ***************
YOUR ROLE
Concert is seeking a skilled Senior Data Engineer to join our team. Your role will be pivotal in designing, developing, and maintaining our data infrastructure and pipelines, ensuring robust, scalable, and efficient data solutions. You will work closely with data scientists, analysts, and other engineers to support our mission of automating the application of clinical policy and payment through data-driven insights.
You will be joining an innovative, energetic, passionate team who will help you grow and build skills at the intersection of diagnostics, information technology and evidence-based clinical care.
As a Senior Data Engineer you will:
Design, develop, and maintain scalable and efficient data pipelines using AWS services such as Redshift, S3, Lambda, ECS, Step Functions, and Kinesis Data Streams.
Implement and manage data warehousing solutions, primarily with Redshift, and optimize existing data models for performance and scalability.
Utilize DBT (data build tool) for data transformation and modeling, ensuring data quality and consistency.
Develop and maintain ETL/ELT processes to ingest, process, and store large datasets from various sources.
Work with SageMaker for machine learning data preparation and integration.
Ensure data security, privacy, and compliance with industry regulations.
Collaborate with data scientists and analysts to understand data requirements and deliver solutions that meet their needs.
Monitor and troubleshoot data pipelines, identifying and resolving issues promptly.
Implement best practices for data engineering, including code reviews, testing, and automation.
Mentor junior data engineers and share knowledge on data engineering best practices.
Stay up-to-date with the latest advancements in data engineering, AWS services, and related technologies.
After 3 months on the job you will have:
Developed a strong understanding of Concert's data engineering infrastructure
Learned the business domain and how it maps to the information architecture
Made material contributions towards existing key results
After 6 months you will have:
Led a major initiative
Become the first point of contact when issues related to the data warehouse are identified
After 12 months you will have:
Taken responsibility for the long term direction of the data engineering infrastructure
Proposed and executed key results with an understanding of the business strategy
Communicated the business value of major technical initiatives to key non-technical business stakeholders
WHAT LEADS TO SUCCESS
Self-Motivated: A team player with a positive attitude and a proactive approach to problem-solving.
Executes Well: You are biased to action and get things done. You acknowledge unknowns and recover well from setbacks.
Comfort with Ambiguity: You aren't afraid of uncertainty or blazing new trails; you care about building toward a future that is different from today.
Technical Bravery: You are comfortable with new technologies and eager to dive in to understand data in its raw and processed states.
Mission-Focused: You are personally motivated to drive more affordable, equitable, and effective integration of genomic technologies into clinical care.
Effective Communication: You build rapport and strong working relationships with senior leaders and peers, and you use those relationships to drive the company forward.
RELEVANT SKILLS & EXPERIENCE
Minimum of 4 years' experience working as a data engineer
Bachelor's degree in software or data engineering or comparable technical certification / experience
Ability to effectively communicate complex technical concepts to both technical and non-technical audiences.
Proven experience in designing and implementing data solutions on AWS, including Redshift, S3, Lambda, ECS, and Step Functions
Strong understanding of data warehousing principles and best practices
Experience with DBT for data transformation and modeling.
Proficiency in SQL and at least one programming language (e.g., Python, Scala)
Familiarity or experience with the following tools / concepts are a plus: BI tools such as Metabase; Healthcare claims data, security requirements, and HIPAA compliance; Kimball's dimensional modeling techniques; ZeroETL and Kinesis data streams
COMPENSATION
Concert is seeking top talent and offers competitive compensation based on skills and experience. Compensation will be commensurate with experience. This position reports to the VP of Engineering.
LOCATION
Concert is based in Nashville, Tennessee and supports a remote work environment.
For further questions, please contact: ******************.
Junior Data Engineer
Columbus, OH jobs
Contract-to-Hire
Columbus, OH (Hybrid)
Our healthcare services client is looking for an entry-level Data Engineer to join their team. You will play a pivotal role in maintaining and improving inventory and logistics management programs. Your day-to-day work will include leveraging machine learning and open-source technologies to drive improvements in data processes.
Job Responsibilities
Automate key processes and enhance data quality
Improve ingestion processes and enhance machine learning capabilities
Manage substitutions and allocations to streamline product ordering
Work on logistics-related data engineering tasks
Build and maintain ML models for predictive analytics
Interface with various customer systems
Collaborate on integrating AI models into customer service
Qualifications
Bachelor's degree in related field
0-2 years of relevant experience
Proficiency in SQL and Python
Understanding of GCP/BigQuery (or any cloud experience, basic certifications a plus).
Knowledge of data science concepts.
Business acumen and understanding (corporate experience or internship preferred).
Familiarity with Tableau
Strong analytical skills
Aptitude for collaboration and knowledge sharing
Ability to present confidently in front of leaders
Why Should You Apply?
You will be part of custom technical training and professional development through our Elevate Program!
Start your career with a Fortune 15 company!
Access to cutting-edge technologies
Opportunity for career growth
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Data Architect
Cincinnati, OH jobs
THIS IS A W2 (NOT C2C OR REFERRAL BASED) CONTRACT OPPORTUNITY
REMOTE MOSTLY WITH 1 DAY/MO ONSITE IN CINCINNATI; LOCAL CANDIDATES TAKE PREFERENCE
RATE: $75-85/HR WITH BENEFITS
We are seeking a highly skilled Data Architect to function in a consulting capacity to analyze, redesign, and optimize a Medical Payments client's environment. The ideal candidate will have deep expertise in SQL, Azure cloud services, and modern data architecture principles.
Responsibilities
Design and maintain scalable, secure, and high-performing data architectures.
Lead migration and modernization projects in heavily used production systems.
Develop and optimize data models, schemas, and integration strategies.
Implement data governance, security, and compliance standards.
Collaborate with business stakeholders to translate requirements into technical solutions.
Ensure data quality, consistency, and accessibility across systems.
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, or related field.
Proven experience as a Data Architect or similar role.
Strong proficiency in SQL (query optimization, stored procedures, indexing).
Hands-on experience with Azure cloud services for data management and analytics.
Knowledge of data modeling, ETL processes, and data warehousing concepts.
Familiarity with security best practices and compliance frameworks.
Preferred Skills
Understanding of Electronic Health Records systems.
Understanding of Big Data technologies and modern data platforms outside the scope of this project.
Senior Business Data Architect
Cincinnati, OH jobs
Job Title: Senior Business Data Architect
Who we are:
Vernovis is a Total Talent Solutions company that specializes in Technology, Cybersecurity, Finance & Accounting functions. At Vernovis, we help these professionals achieve their career goals, matching them with innovative projects and dynamic direct hire opportunities in Ohio and across the Midwest.
Client Overview:
Vernovis is partnering with a fast-paced manufacturing company that is looking to hire a Sr. Data Manager. This is a great opportunity for an experienced Snowflake professional to elevate their career leading a team and designing the architecture of the data warehouse.
If interested, please email Wendy Kolkmeyer at ***********************
What You'll Do:
Architect and optimize the enterprise data warehouse using Snowflake.
Develop and maintain automated data pipelines with Fivetran or similar ETL tools.
Design and enhance DBT data models to support analytics, reporting, and operational decision-making.
Oversee and improve Power BI reporting, ensuring data is accurate, accessible, and actionable for business users.
Establish and enforce enterprise data governance, standards, policies, and best practices.
Collaborate with business leaders to translate requirements into scalable, high-quality data solutions.
Enable advanced analytics and AI/ML initiatives through proper data structuring and readiness.
Drive cross-functional alignment, communication, and stakeholder engagement.
Lead, mentor, and develop members of the data team.
Ensure compliance, conduct system audits, and maintain business continuity plans.
What Experience You'll Have:
7+ years of experience in data architecture, data engineering, or enterprise data management.
Expertise in Snowflake, Fivetran (or similar ETL tools), DBT, and Power BI.
Strong proficiency in SQL and modern data architecture principles.
Proven track record in data governance, modeling, and data quality frameworks.
Demonstrated experience leading teams and managing complex data initiatives.
Ability to communicate technical concepts clearly and collaborate effectively with business stakeholders.
What is Nice to Have:
Manufacturing experience
Vernovis does not accept inquiries from Corp to Corp recruiting companies. Applicants must be currently authorized to work in the United States on a full-time basis and not violate any immigration or discrimination laws.
Vernovis provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
DevOps Engineer | Machine Learning Platforms
Pittsburgh, PA jobs
MLOps Engineer (Remote | Pittsburgh, PA area)
On-site: 1 day/month
We are seeking a highly skilled MLOps Engineer to support the end-to-end deployment, monitoring, and optimization of our machine learning models. In this role, you will serve as the critical link between Data Science and Operations, ensuring that models are scalable, reliable, and production-ready.
This position is fully remote, but candidates must reside in the Pittsburgh area and be available for monthly on-site meetings.
About eNGINE
eNGINE builds Technical Teams. We are a Solutions and Placement firm shaped by decades of interaction with Technical professionals. Our inspiration is continuous learning and engagement with the markets we serve, the talent we represent, and the teams we build. Our Consulting Workforce is encouraged to enjoy career fulfillment in the form of challenging projects, schedule flexibility, and paid training/certifications. Successful outcomes start and finish with eNGINE.
Key Responsibilities
Pipeline Development: Design, build, and maintain CI/CD pipelines supporting the full machine learning lifecycle, from training to deployment.
Infrastructure Management: Orchestrate and maintain containerized environments using Docker and Kubernetes; manage cloud resources for scalable and efficient inference.
Model Monitoring: Build systems to monitor model performance, detect data drift, ensure uptime, and maintain compliance with reliability standards.
Automation: Automate training, testing, deployment, and retraining processes to reduce manual steps and increase operational efficiency.
Collaboration: Partner with Data Scientists, Software Engineers, and Product teams to integrate ML into production systems and support ongoing enhancements.
Optimization: Continuously evaluate model pipelines and infrastructure for improvements in cost, performance, and scalability.
Technical Requirements
Programming: Expert-level Python, including NumPy, Pandas, scikit-learn, and at least one major deep learning framework (PyTorch or TensorFlow).
Infrastructure: Strong hands-on experience with Docker, Kubernetes, and IaC tools such as Terraform or CloudFormation.
MLOps Tooling: Familiarity with MLflow, Kubeflow, or similar model management platforms.
Cloud Platforms: Practical experience working with AWS, GCP, or Azure ML services.
Best Practices: Solid understanding of version control, automated testing, documentation, and reproducible ML workflows.
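As a sketch of the model-monitoring responsibility described above, here is a toy data-drift check in plain Python. The three-standard-error threshold and the sample data are illustrative assumptions, not a production monitoring policy:

```python
# Toy data-drift check: flag a feature whose live mean shifts by more than
# k standard errors from the training baseline. Threshold is illustrative.
import math
import statistics

def drifted(baseline, live, k=3.0):
    """Return True if the live mean is > k standard errors from the baseline mean."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    se = sd / math.sqrt(len(live))  # standard error of the live-sample mean
    return abs(statistics.mean(live) - mu) > k * se

baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
print(drifted(baseline, [11.0, 11.2, 10.9, 11.1]))   # True: clear upward shift
print(drifted(baseline, [10.05, 9.95, 10.0, 10.1]))  # False: within noise
```

Real monitoring stacks (e.g., with MLflow or custom dashboards) typically track distribution-level metrics per feature over time; this shows only the core comparison.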
Qualifications
Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related technical field.
Proven experience deploying machine learning models to production environments, not just experimentation.
Prior experience supporting or building ML-driven digital products strongly preferred.
Digital product / platform experience
Demonstrated ability to work effectively across cross-functional engineering and data teams.
Strong problem-solving abilities, attention to detail, and a passion for building stable, scalable ML systems.
Next Steps
No C2C, relocation, or sponsorship for this role
For finer details on how eNGINE can impact your career, apply today!
Senior Software Engineer
Charlotte, NC jobs
Senior Software Engineer (Full Stack)
Jacksonville, FL
We are seeking a highly skilled and motivated Senior Software Engineer to join a fast-paced, agile development team. In this fully remote role, you will leverage your full-stack expertise to design, develop, and deliver cutting-edge software solutions using C#, Angular, SQL, and Azure. You will also play a key role in mentoring team members, contributing to the technical growth of the team.
Responsibilities
Design, develop, and maintain robust, scalable, and secure full-stack applications.
Collaborate closely with cross-functional teams to define, plan, and deliver high-quality features.
Write clean, efficient, and maintainable code that adheres to industry best practices.
Optimize and troubleshoot applications to ensure peak performance and reliability.
Utilize Azure services to build and deploy cloud-native solutions.
Design and maintain databases using SQL, ensuring data integrity and optimal performance.
Lead code reviews and provide mentorship to junior developers, fostering a culture of continuous improvement.
Actively participate in sprint planning, retrospectives, and other Agile ceremonies.
Stay current with emerging technologies and contribute to technical decision-making.
Qualifications
5+ years of professional experience in full-stack development.
Proficiency in C#, Angular, SQL, and Azure.
Strong understanding of object-oriented programming and modern design patterns.
Experience building RESTful APIs and integrating third-party services.
Familiarity with Agile development methodologies.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills, with the ability to mentor and guide others.
Preferred Skills
Experience with DevOps practices, CI/CD pipelines, and infrastructure-as-code.
Knowledge of microservices architecture and containerization (e.g., Docker, Kubernetes).
Understanding of security best practices in web and cloud development.
Software Engineer III [80606]
New York, NY jobs
Onward Search is partnering with a leading tech client to hire a Software Engineer III to help build the next generation of developer infrastructure and tooling. If you're passionate about making developer workflows faster, smarter, and more scalable, this is the role for you!
Location: 100% Remote (EST & CST Preferred)
Contract Duration: 6 months
What You'll Do:
Own and maintain Bazel build systems and related tooling
Scale monorepos to millions of lines of code
Collaborate with infrastructure teams to define best-in-class developer workflows
Develop and maintain tools for large-scale codebases
Solve complex problems and improve developer productivity
What You'll Need:
Experience with Bazel build system and ecosystem (e.g., rules_jvm_external, IntelliJ Bazel plugin)
Fluency in Java, Python, Starlark, and TypeScript
Strong problem-solving and collaboration skills
Passion for building highly productive developer environments
Perks & Benefits:
Medical, Dental, and Vision Insurance
Life Insurance
401k Program
Commuter Benefits
eLearning & Education Reimbursement
Ongoing Training & Development
This is a fully remote, contract opportunity for a motivated engineer who loves working in a flow-focused environment and improving developer experiences at scale.
Software Engineer
Columbus, OH jobs
Software Engineer - Internal Product Team
Division: Impower Solutions (Agility Partners)
About Impower
Impower is the technology consulting division of Agility Partners, specializing in automation & AI, data engineering & analytics, software engineering, and digital transformation. We deliver high-impact solutions with a focus on innovation, efficiency, and client satisfaction.
Role Overview
We're building a high-performing internal product team to scale our proprietary tech stack. As a Software Engineer, you'll contribute to the development of internal platforms using modern technologies. You'll collaborate with product and engineering peers to deliver scalable, maintainable solutions that drive Impower's consulting capabilities.
Key Responsibilities
Development & Implementation
Build scalable APIs using TypeScript and Bun for high-performance backend services.
Develop intelligent workflows and AI agents leveraging Temporal, enabling robust orchestration and automation.
Move and transform data using Python and DBT, supporting analytics and operational pipelines.
Contribute to full-stack development of internal websites using Next.js (frontend), Elysia (API layer), and Azure SQL Server (database).
Implement CI/CD pipelines using GitHub Actions, with a focus on automated testing, secure deployments, and environment consistency.
Deploy and manage solutions in Azure, including provisioning and maintaining infrastructure components such as App Services, Azure Functions, Storage Accounts, and SQL databases.
Monitor and troubleshoot production systems using SigNoz, ensuring observability across services with metrics, traces, and logs to maintain performance and reliability.
Write clean, testable code and contribute to unit, integration, and end-to-end test suites.
Collaborate in code reviews, sprint planning, and backlog grooming to ensure alignment and quality across the team.
Innovation & Strategy
Stay current with emerging technologies and frameworks, especially in the areas of agentic AI, orchestration, and scalable infrastructure.
Propose improvements to internal platforms based on performance metrics, developer experience, and business needs.
Contribute to technical discussions around design patterns, tooling, and long-term platform evolution.
Help evaluate open-source tools and third-party services that could accelerate development or improve reliability.
Delivery & Collaboration
Participate in agile ceremonies including sprint planning, standups, and retrospectives.
Collaborate closely with product managers, designers, and other engineers to translate requirements into working solutions.
Communicate progress, blockers, and technical decisions clearly and proactively.
Take ownership of assigned features and enhancements from ideation through deployment and support.
Leadership
Demonstrate ownership and accountability in your work, contributing to a culture of reliability and continuous improvement.
Share knowledge through documentation, pairing, and informal mentoring of junior team members.
Engage in code reviews to uphold quality standards and foster team learning.
Actively participate in team discussions and help shape a collaborative, inclusive engineering culture.
Qualifications
2-4 years of experience in software engineering, ideally in a product-focused or platform engineering environment.
Proficiency in TypeScript and Python, with hands-on experience in full-stack development.
Experience building APIs and backend services using Bun, Elysia, or similar high-performance frameworks (e.g., Fastify, Express, Flask).
Familiarity with Next.js for frontend development and Azure SQL Server for relational data storage.
Experience with workflow orchestration tools such as Temporal, Airflow, or Prefect, especially for building intelligent agents or automation pipelines.
Proficiency in data transformation using DBT, with a solid understanding of analytics engineering principles.
Strong understanding of CI/CD pipelines using GitHub Actions, including automated testing, environment management, and secure deployments.
Exposure to observability platforms such as SigNoz, Grafana, Prometheus, or OpenTelemetry, with a focus on metrics, tracing, and log aggregation.
Solid grasp of software testing practices and version control (Git).
Excellent communication skills, a collaborative mindset, and a willingness to learn and grow within a team.
Why Join Us?
Build impactful internal products that shape the future of Impower's consulting capabilities.
Work with cutting-edge technologies in a collaborative, innovation-driven environment.
Enjoy autonomy, growth opportunities, and a culture that values excellence and people.
Senior Software Engineer
Columbus, OH jobs
Job Title: Spark 3 Developer
Who We Are:
Vernovis is a Total Talent Solutions company specializing in Technology, Cybersecurity, Finance & Accounting functions. At Vernovis, we help professionals achieve their career goals by matching them with innovative projects and dynamic contract opportunities across Ohio and the Midwest.
Client Overview:
Vernovis is partnering with a leading organization in scientific data management and innovation to modernize its big data platform. This initiative involves transitioning legacy systems, such as Cascading, Hadoop, and MapReduce, to Spark 3, optimizing for scalability and efficiency. As part of this well-established organization, your work will contribute to transforming how big data environments are managed and processed.
What You'll Do:
Legacy Workflow Migration: Lead the conversion of existing Cascading, Hadoop, and MapReduce workflows to Spark 3, ensuring seamless transitions.
Performance Optimization: Utilize Spark 3 features like Adaptive Query Execution (AQE) and Dynamic Partition Pruning to optimize data pipelines.
Collaboration: Work closely with infrastructure teams and stakeholders to ensure alignment with modernization initiatives.
Big Data Ecosystem Integration: Develop solutions that integrate with platforms like Hadoop, Hive, Kafka, and cloud environments (AWS, Azure).
Support Modernization Goals: Contribute to key organizational initiatives focused on next-generation data optimization and modernization.
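As a hedged illustration of the Spark 3 features named above, Adaptive Query Execution (AQE) and Dynamic Partition Pruning are switched on through standard Spark 3.x configuration properties. A minimal sketch of a session setup follows; the application name is a hypothetical placeholder, not part of the client's actual stack:

```python
# Sketch: enabling Spark 3 AQE and Dynamic Partition Pruning via
# standard Spark 3.x configuration keys. The app name "legacy-migration"
# is a hypothetical placeholder.

SPARK3_TUNING = {
    # AQE re-optimizes query plans at runtime using shuffle statistics
    "spark.sql.adaptive.enabled": "true",
    # Let AQE coalesce small shuffle partitions after a stage completes
    "spark.sql.adaptive.coalescePartitions.enabled": "true",
    # Prune fact-table partitions using filters applied to the dimension side
    "spark.sql.optimizer.dynamicPartitionPruning.enabled": "true",
}

def build_session(app_name: str = "legacy-migration"):
    """Create a SparkSession with the tuning options applied."""
    from pyspark.sql import SparkSession  # deferred import: requires pyspark
    builder = SparkSession.builder.appName(app_name)
    for key, value in SPARK3_TUNING.items():
        builder = builder.config(key, value)
    return builder.getOrCreate()
```

In practice, workflows migrated from Cascading or raw MapReduce often benefit from AQE with no code changes, since plan decisions move from compile time to runtime.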
What Experience You'll Have:
Spark 3 Expertise: 3+ years of experience with Apache Spark, including Spark 3.x development and optimization.
Migration Experience: Proven experience transitioning from Cascading, Hadoop, or MapReduce to Spark 3.
Programming Skills: Proficiency in Scala, Python, or Java.
Big Data Ecosystem: Strong knowledge of Hadoop, Hive, and Kafka.
Performance Tuning: Advanced skills in profiling, troubleshooting, and optimizing Spark jobs.
Cloud Platforms: Familiarity with AWS (EMR, Glue, S3) or Azure (Databricks, Data Lake).
The Vernovis Difference:
Vernovis offers Health, Dental, Vision, Voluntary Short & Long-Term Disability, Voluntary Life Insurance, and 401(k).
Vernovis does not accept inquiries from Corp to Corp recruiting companies. Applicants must be currently authorized to work in the United States on a full-time basis and not violate any immigration or discrimination laws.
Vernovis provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Sr. Full Stack Developer
Boston, MA jobs
Senior Developer (Full Stack)
100% Remote
6-month contract (potential for extension)
As the Senior Developer (Full Stack), you will be responsible for modernizing legacy applications and developing cloud-native solutions for the Executive Office of Education (EOE). You will design and maintain both front-end and back-end components using Node.js, Angular, and TypeScript, while supporting older Java and .NET systems during their transition. This role involves collaborating with cross-functional teams to analyze existing systems, build scalable APIs, and implement secure, high-performing applications in an AWS environment. You will also mentor junior developers and ensure best practices in architecture, testing, and documentation.
Minimum Qualifications:
Strong experience in TypeScript, JavaScript, HTML, and CSS
Proficiency with Angular for front-end development and Node.js/Express.js for back-end services
Experience with Java and/or .NET for maintaining and refactoring legacy systems
Familiarity with databases such as Postgres, Snowflake, Oracle, and SQL Server
Knowledge of AWS services and cloud-native development
Nice to Have:
Exposure to CI/CD pipelines and DevOps tools (e.g., GitHub Actions, Jenkins)
Experience with ORM tools like Sequelize or Hibernate
Responsibilities:
Design, develop, and maintain full-stack web applications using Node.js and Angular
Assess and refactor legacy applications into modern architectures
Build RESTful APIs and integrate with internal/external services
Collaborate with teams and mentor junior developers on modern frameworks
Write unit/integration tests and perform code reviews
What's In It For You:
Weekly Paychecks
Opportunity to lead modernization initiatives in a fully AWS-implemented environment
Collaborative team culture with cutting-edge technologies
Senior Data Engineer
New York, NY jobs
Job Description
Senior Data Engineer $150k - $170k
We are a leading cloud-based mobile patient intake and registration system. Our platform allows patients to complete their paperwork from the comfort of their own homes using their smartphones or computers, minimizing in-person contact and streamlining the check-in process. With our fully customizable patient scheduling, intake, and payment platform, you can maximize efficiency and minimize waiting room activity. Our patient engagement solution also syncs seamlessly with your EMR system to keep records updated in real-time.
Role Description
This is a full-time position for a Senior Data Engineer. As a Senior Data Engineer, you will be responsible for the day-to-day tasks associated with data engineering, including data modeling, ETL (Extract Transform Load), data warehousing, and data analytics. This is a hybrid role, with the majority of work located in the New York City office but with flexibility for some remote work.
Qualifications
Data Engineering, Data Modeling, and ETL (Extract Transform Load) skills
Data Warehousing and Data Analytics skills
Experience with cloud-based data solutions
Strong problem-solving and analytical skills
Proficiency in programming languages such as Python, PySpark, or Java
Experience with SQL and database management systems
Knowledge of healthcare data requirements and regulations is a plus
Bachelor's degree in Computer Science, Engineering, or related field
If interested, please send your resume to: rick@ingenium.agency
Senior Cloud Data Engineer
Remote
National Debt Relief (NDR) is seeking an experienced Senior Cloud Data Engineer to join our Data Engineering team. In this role, you will own the orchestration, automation, and optimization of data workflows across our existing Snowflake-based enterprise data platform. Reporting to the Director of Data Engineering, you will be responsible for designing and delivering scalable, production-ready data systems by extending and managing the infrastructure, pipelines, and monitoring frameworks that power our analytics ecosystem.
The ideal candidate has hands-on experience with Python-based data pipelines, deep experience with modern orchestration tools like Dagster, and proficiency with SQL and modern cloud data warehouses. You will operate at the intersection of data engineering and platform operations, ensuring reliable, automated, and governed data workflows at scale. This role requires a high degree of ownership, strong problem-solving skills, and clear communication with leadership and technical teams alike.
Responsibilities
Contribute to the design and orchestration of data pipelines in Dagster to improve ingestion, transformation, and data quality workflows across the enterprise data platform.
Develop and maintain Python-based ingestion pipelines where there isn't already a connector from Fivetran, integrating data from APIs and third-party systems.
Manage Snowflake infrastructure using IaC (Terraform or similar), while adhering to Data Engineering best practices.
Design, maintain, and optimize dbt transformation workflows to support curated and trusted data models for analytics and operations.
Focus on optimizing Snowflake performance and reducing compute spend through warehouse tuning, efficient query design, and resource utilization monitoring.
Respond to morning load failures to minimize impact to the business (East Coast working hours).
Implement robust data security and access controls within Snowflake, ensuring compliance with governance and privacy standards.
Develop and maintain CI/CD workflows for data pipelines, including automated testing, deployment, and version control practices.
Implement observability frameworks for data pipelines, including freshness checks, data contract enforcement, and automated alerting for anomalies.
Document system architectures, workflows, and configurations to support governance, reproducibility, and transparency.
Drive consistent, visible deliverables that demonstrate progress and impact, ensuring projects remain on track.
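The observability responsibilities above reduce, at their simplest, to a rule: compare each asset's most recent load timestamp against an agreed staleness window and alert on violations. A minimal stdlib-only sketch; the asset names and SLA values are illustrative assumptions, not actual data contracts:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def is_fresh(last_loaded_at: datetime, max_staleness: timedelta,
             now: Optional[datetime] = None) -> bool:
    """Return True if the most recent load falls within the SLA window."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) <= max_staleness

# Illustrative data contract: each asset name maps to its freshness SLA.
FRESHNESS_SLAS = {
    "curated.payments": timedelta(hours=4),
    "curated.enrollments": timedelta(hours=24),
}

def stale_assets(load_times: dict, now: datetime) -> list:
    """Names of assets whose last load violates their SLA (alert on these)."""
    return [name for name, sla in FRESHNESS_SLAS.items()
            if not is_fresh(load_times[name], sla, now)]
```

In an orchestrator such as Dagster, the same check would typically be expressed declaratively (e.g., as a freshness policy on an asset) rather than hand-rolled, but the underlying comparison is the one shown here.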
Qualifications
Education/Experience
Bachelor's degree required; a degree in Computer Science, Data Engineering, or a related field, or an advanced degree, is preferred.
7 years of experience in data engineering or data warehouse development, with a focus on cloud environments.
Required Skills/Abilities
Expertise in SQL and dbt for building and maintaining curated datasets and data transformation pipelines.
Hands-on expertise with Snowflake, including experience managing infrastructure with IaC tools (Terraform or equivalent).
Demonstrated experience with Dagster (or Airflow, with a strong desire to work in Dagster) for managing event-driven pipelines and orchestrated assets.
Proficiency in Python for developing pipelines, APIs, and automation solutions.
Proven track record of implementing CI/CD workflows and automated testing for data pipelines.
Experience designing and implementing observability frameworks for data freshness, quality, and reliability.
Proactive ownership mindset with the ability to work independently and deliver results with minimal oversight.
Clear, timely, and proactive communication, including experience collaborating with leadership stakeholders.
Ability to manage multiple priorities and projects, ensuring progress stays visible and deliverables are met.
Strong troubleshooting and problem-solving skills, with attention to detail when working with sensitive systems and processes.
Strong collaboration and communication skills to partner effectively across data engineering, analytics, and product teams.
Self-starter with the ability to define and establish data orchestration standards in a growing environment.
Preferred:
Experience in financial services or related industries.
Expertise deploying and maintaining orchestration systems at scale (Dagster, Airflow).
National Debt Relief Role Qualifications:
Computer competency and ability to work with a computer.
Prioritize multiple tasks and projects simultaneously.
Exceptional written and verbal communication skills.
Punctuality expected, ready to report to work on a consistent basis.
Attain and maintain high performance expectations on a monthly basis.
Work in a fast-paced, high-volume setting.
Use and navigate multiple computer systems with exceptional multi-tasking skills.
Remain calm and professional during difficult discussions.
Take constructive feedback.
Compensation Information
Our salary ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for each position across the US. Within the range, individual pay is determined by work location, job-related skills, experience, and relevant education or training. This good faith pay range is provided in compliance with NYC law and the laws of other jurisdictions that may require a salary range in job postings. The salary for this position is $141,500 - $162,500.
About National Debt Relief
National Debt Relief was founded in 2009 with the goal of helping an expanding number of consumers deal with overwhelming debt. We are one of the most-trusted and best-rated consumer debt relief providers in the United States. As a leading debt settlement organization, we have helped over 450,000 people settle over $10 billion of debt, while empowering them to lead a healthier financial lifestyle and feel free to live their best life. At National Debt Relief, we treat our clients like real people. Our purpose is to elevate, empower, and transform their lives.
Rated A+ by the Better Business Bureau, our goal is to help individuals and families get out of debt with the least possible cost through conducting financial consultations, educating the consumer and recommending the appropriate solution. We become our clients' number one advocate to help them reestablish financial stability as quickly as possible.
Benefits
National Debt Relief is a team-oriented environment full of rewards and growth opportunities for our employees. We are dedicated to our employee's success and growth within the company, through our employee mentorship and leadership programs.
Our extensive benefits package includes:
Generous Medical, Dental, and Vision Benefits
401(k) with Company Match
Paid Holidays, Volunteer Time Off, Sick Days, and Vacation
12 weeks Paid Parental Leave
Pre-tax Transit Benefits
No-Cost Life Insurance Benefits
Voluntary Benefits Options
ASPCA Pet Health Insurance Discount
Access to your earned wages at any time before payday
National Debt Relief is a certified Great Place to Work!
National Debt Relief is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other status protected by law.
For information about our Employee Privacy Policy, please see here
For information about our Applicant Terms, please see here
#LI-REMOTE
(P) Data Analytics Support - REMOTE
Orlando, FL jobs
JOB TITLE: (P) Data Analytics Support - REMOTE
PAY RATE: Up to $43/hour
We are a national aerospace and defense staffing agency seeking highly qualified candidates for a position with a top-tier client.
Job Details:
Job Type: Contract (12 months with potential for extension)
Industry: Aerospace / Defense / Aviation
Benefits: Medical, dental, and vision (Cigna)
Perks: Bonus potential + Priority access via Tier 1 supplier
Openings Nationwide: Thousands of opportunities across the U.S.
Qualifying Questions:
Are you a U.S. person as defined under ITAR regulations?
Do you meet the educational and experience requirements for this role?
Can you commute to the job location or relocate if necessary?
Summary:
Create clear, concise, and engaging communications to inform stakeholders on data ecosystem deployments
Develop visually engaging technical training materials and user resources
Engage stakeholders to gather feedback, address concerns, and ensure smooth implementation
Build project roadmaps and deployment plans
Support development of adoption metrics to track success
Requirements:
Strong written and verbal communication skills across multiple platforms (email, Teams, etc.)
Ability to develop visually compelling materials independently or in collaboration with SMEs and leadership
Proven stakeholder management and project tracking experience
Familiarity with data and analytics organizations, including 1LMX tools and data governance
Proficiency in Microsoft Office, Jira, Confluence, and ability to learn new tools quickly
Comfortable working in a fast-paced, evolving environment
Experience with Tableau or other data visualization tools
Familiarity with process automation
Must be a U.S. Person (as defined by ITAR).
About Us:
The Structures Company is a premier national aerospace and defense staffing agency specializing in contract, contract-to-hire, and direct hire placements. We deliver expert workforce solutions across engineering, IT, production, maintenance, and support roles.
As trusted partners to major aerospace OEMs and Tier 1 suppliers, we connect professionals with opportunities to grow and excel in the aviation and aerospace industries.
Eligibility Requirements:
Must be a U.S. Citizen, lawful permanent resident, or protected individual under 8 U.S.C. 1324b(a)(3) to comply with ITAR regulations.
Keywords: aerospace, aviation, engineering, maintenance, aircraft design, defense
Take your career to new heights. Apply today!
Data Scientist (or AI Engineer) - Hybrid
Tampa, FL jobs
Job Title: Data Scientist (or AI Engineer)
Workplace: Hybrid; 2-3 days per week onsite at MacDill AFB, FL
Clearance: Top Secret (TS)
Elder Research is seeking mid to senior-level Data Scientists (AI Engineers) to support a U.S. national security client at MacDill AFB in Tampa, FL. In this mission-focused role, you will apply advanced data science and AI/ML techniques to enable intelligence analysts to uncover hidden patterns, enhance decision-making, and drive intelligence innovation in support of national security.
This hybrid role offers the opportunity to work at the cutting edge of analytics and defense, directly impacting military operations across the U.S. Intelligence and Defense community. Our team integrates expertise in data science, AI/ML, and intelligence operations to deliver data-driven solutions for the U.S. National Security enterprise. The work directly contributes to decision-making, mission readiness, and the ability of operators to succeed in a complex global battlespace.
Key Responsibilities:
As a Data Scientist on this program, you will:
Lead and conduct multifaceted analytic studies on large, diverse data sets.
Develop and deploy AI/ML models to enrich data and provide utility to intelligence analysts.
Perform complex data assessments to determine the operational relevance of proposed data sets for answering priority intelligence requirements.
Build and maintain data pipelines to ingest, transform, and structure both structured and unstructured data.
Discover links, patterns, and connections in disparate datasets, providing analysts with actionable context.
Experiment with exploratory mathematical, statistical, and computational techniques to identify new insights.
Provide solutions to command-level data challenges through rigorous analysis and innovative applications.
Support the development of tailored data environments and tools for intelligence analysts.
Requirements:
Education: Bachelor's degree in a technical field (e.g., Engineering, Mathematics, Statistics, Physics, Computer Science, IT, or related discipline).
Years of Experience:
3-6 years; mid-level data scientist
6+ years; senior-level data scientist
Clearance: active Top Secret (TS)
AI/ML Expertise:
Demonstrated experience applying Artificial Intelligence (AI) or Machine Learning (ML) to real-world problems.
Hands-on experience training, fine-tuning, and configuring AI/ML models and deploying them into production environments.
Proficiency in at least one AI/ML branch (e.g., Natural Language Processing, computer vision, generative AI, agentic AI).
Programming & Tools:
Strong programming skills in Python, R, Java, Rust, or similar languages.
Experience with AI/ML Ops, SQL/NoSQL databases, and frameworks such as SQLAlchemy, Flask, Streamlit, Dash, React, and Spark.
Familiarity with APIs, CI/CD pipelines, and web technologies (JavaScript, HTML/CSS).
Analytical & Research Skills:
Ability to interpret and analyze structured and unstructured data using exploratory mathematical and statistical techniques.
Experience cleaning, transforming, and organizing data to support advanced analytics.
Ability to experiment with datasets, derive insights, and provide innovative solutions to complex mission challenges.
Collaboration & Communication:
Ability to work independently as well as within cross-functional teams.
Strong communication, problem-solving, and critical-thinking skills.
Capable of coordinating research and analytic activities with diverse stakeholders.
Preferred Qualifications:
Active TS/SCI clearance.
Experience supporting the intelligence domain, particularly Intelligence, Surveillance, and Reconnaissance (ISR).
Previous work supporting Special Operations Forces (SOF) missions or U.S. national security customers.
Demonstrated expertise across multiple AI/ML disciplines (e.g., NLP, computer vision, generative AI, agentic AI).
Why apply to this position at Elder Research?
Competitive Salary and Benefits
Important Work / Make a Difference supporting U.S. national security.
Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
People-Focused Culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement.
Company Stock Ownership: all employees are provided with shares of the company each year based on company value and profits.
About Elder Research, Inc
People Centered. Data Driven
Elder Research is a fast-growing consulting firm specializing in predictive analytics. Having been in the data mining business for almost 30 years, we pride ourselves on our ability to find creative, cutting-edge solutions to real-world problems. We work hard to provide the best value to our clients and allow each person to contribute their ideas and put their skills to use immediately.
Our team members are passionate, curious, life-long learners. We value humility, servant-leadership, teamwork, and integrity. We seek to serve our clients and our teammates to the best of our abilities. In keeping with our entrepreneurial spirit, we want candidates who are self-motivated with an innate curiosity and strong teamwork.
Elder Research believes in continuous learning and community - each week the entire company attends a Tech Talk and each office location provides lunch. Elder Research provides a supportive work environment with established parental, bereavement, and PTO policies. By prioritizing a healthy work-life balance - with reasonable hours, solid pay, low travel, and extremely flexible time off - Elder Research enables and encourages its employees to serve others and enjoy their lives.
Elder Research, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Elder Research is a Government contractor and many of our positions require US Citizenship.
Data Engineer (Hybrid)
Tampa, FL jobs
Job Title: Data Engineer
Workplace: Hybrid, 2-3 days per week onsite at MacDill AFB, FL
Clearance: Top Secret (TS)
Elder Research is seeking Data Engineers to support a U.S. national security client at MacDill AFB in Tampa, FL. In this mission-focused role, you will apply advanced data engineering techniques to enable intelligence analysts to uncover hidden patterns, enhance decision-making, and drive intelligence innovation in support of national security.
This hybrid role offers the opportunity to work at the cutting edge of analytics and defense, directly impacting military operations across the U.S. Intelligence and Defense community. Our team integrates expertise in data science, AI/ML, and intelligence operations to deliver data-driven solutions for the U.S. National Security enterprise. The work directly contributes to decision-making, mission readiness, and the ability of operators to succeed in a complex global battlespace.
Position Requirements:
Education: Bachelor's degree in a technical field (e.g., Engineering, Mathematics, Statistics, Physics, Computer Science, IT, or related discipline).
Clearance: active Top Secret (TS)
Years of Experience: 3+ years
Experience with:
Python, SQL, NoSQL, Cypher, PostgreSQL
SQLAlchemy, Swagger, Spark, Hadoop, Kafka, Hive, R
Apache Storm, Neo4j, MongoDB
Cloud platforms (AWS, Azure, GCP, or similar)
Ability to work independently as well as within cross-functional teams.
Strong communication, problem-solving, and critical-thinking skills.
Preferred Skills and Qualifications:
Active TS/SCI clearance.
Experience supporting the intelligence domain, particularly Intelligence, Surveillance, and Reconnaissance (ISR).
Previous work supporting Special Operations Forces (SOF) missions or U.S. national security customers.
Why apply to this position at Elder Research?
Competitive Salary and Benefits
Important Work - Make a Difference supporting U.S. national security.
Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
People-Focused Culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement.
Company Stock Ownership: all employees are provided with shares of the company each year based on company value and profits.
About Elder Research, Inc
People Centered. Data Driven
Elder Research is a fast-growing consulting firm specializing in predictive analytics. Having been in the data mining business for almost 30 years, we pride ourselves on our ability to find creative, cutting-edge solutions to real-world problems. We work hard to provide the best value to our clients and allow each person to contribute their ideas and put their skills to use immediately.
Our team members are passionate, curious, life-long learners. We value humility, servant-leadership, teamwork, and integrity. We seek to serve our clients and our teammates to the best of our abilities. In keeping with our entrepreneurial spirit, we want candidates who are self-motivated with an innate curiosity and strong teamwork.
Elder Research believes in continuous learning and community - each week the entire company attends a Tech Talk and each office location provides lunch. Elder Research provides a supportive work environment with established parental, bereavement, and PTO policies. By prioritizing a healthy work-life balance - with reasonable hours, solid pay, low travel, and extremely flexible time off - Elder Research enables and encourages its employees to serve others and enjoy their lives.
Elder Research, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Elder Research is a Government contractor and many of our positions require US Citizenship.
Senior Data Engineer
Blue Ash, OH jobs
Job Description
The Engineer is responsible for staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks. The Engineer has overall responsibility for the technical design process: leading and participating in the application technical design process and completing estimates and work plans for design, development, implementation, and rollout tasks. The Engineer also communicates with the appropriate teams to ensure that assignments are delivered with the highest quality and in accordance with standards, and strives to continuously improve software delivery processes and practices. Role models and demonstrates the company's core values of respect, honesty, integrity, diversity, inclusion, and safety of others.
Current tools and technologies include:
Databricks and Netezza
Key Responsibilities
Lead and participate in the design and implementation of large and/or architecturally significant applications.
Champion company standards and best practices. Work to continuously improve software delivery processes and practices.
Build partnerships across the application, business and infrastructure teams.
Set up the new customer data platform, migrating from Netezza to Databricks.
Complete estimates and work plans independently as appropriate for design, development, implementation and rollout tasks.
Communicate with the appropriate teams to ensure that assignments are managed appropriately and that completed assignments are of the highest quality.
Support and maintain applications utilizing required tools and technologies.
May direct the day-to-day work activities of other team members.
Must be able to perform the essential functions of this position with or without reasonable accommodation.
Work quickly with the team to implement the new platform.
Be onsite with development team when necessary.
Behaviors/Skills:
Puts the Customer First - Anticipates customer needs, champions for the customer, acts with customers in mind, exceeds customers' expectations, gains customers' trust and respect.
Communicates effectively and candidly - Communicates clearly and directly, is approachable, relates well to others, engages people and helps them understand change, provides and seeks feedback, articulates clearly, actively listens.
Achieves results through teamwork - Is open to diverse ideas, works inclusively and collaboratively, holds self and others accountable, involves others to accomplish individual and team goals.
Note to Vendors
Length of Contract: 9 months
Top Skills: Databricks, Netezza
Soft Skills Needed: collaborating well with others, working in a team dynamic
Project this person will be supporting: staying on track with key milestones in Customer Platform / Customer Data Acceleration; work will be on the new Customer Platform Analytics system in Databricks that will replace Netezza
Team details (size, dynamics, locations): most of the team is located in Cincinnati, working onsite at the BTD
Work Location (in office, hybrid, remote): onsite at BTD when necessary, approximately 2-3 days a week
Is travel required: No
Max Rate (if applicable): best market rate
Required Working Hours: 8-5 EST
Interview process and when will it start: starting with one interview; process may change
Prescreening Details: standard questions. Scores will carry over.
When do you want this person to start: looking to hire quickly; the team is looking to move fast.