Data Scientist
Data scientist job in Wilmington, DE
Role: Data Scientist
Contract: W2 only (no C2C)
Experience: 6+ years required for this role
This person would be helping to support an expanding scope of work involving GenAI and contract intelligence. The role sits within the PBM (pharmacy benefit management) underwriting group.
Focus: Core data science (not data engineering or MLOps)
The team does not want a heavy ML-engineering profile; they want core data science.
Key Responsibilities
· Apply traditional machine learning (e.g., decision trees, forecasting)
· Work with GenAI to extract insights from contract documents
· Use Python extensively for data processing and modeling
· Collaborate and communicate effectively on nuanced, domain-specific language
Required Skills
- Strong Python/Pandas
- SQL (doesn't need to be as strong here)
- GenAI (prompt engineering experience is a plus)
- ML
- Statistical experience
- Core data science skills (random forests, forecasting, etc.)
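The "core data science, not ML engineering" emphasis above centers on classic techniques like decision trees. As a minimal illustration, the sketch below implements the split criterion at the heart of tree learning; the data and the `best_threshold` helper are invented for illustration and are not from the posting.

```python
# Toy sketch (pure Python, invented data): the split search at the heart of
# decision-tree learning -- pick the threshold on one feature that best
# separates two classes.
def best_threshold(xs, ys):
    """Return (threshold, errors) minimizing misclassifications for x >= t -> 1."""
    best = (None, len(ys) + 1)
    for t in sorted(set(xs)):
        # Predict class 1 when x >= t; count disagreements with the labels.
        errs = sum((x >= t) != y for x, y in zip(xs, ys))
        if errs < best[1]:
            best = (t, errs)
    return best

xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
print(best_threshold(xs, ys))  # (10, 0)
```

A real tree recurses this search over features and subsets; libraries like scikit-learn do this for you.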
Data Engineer
Data engineer job in Philadelphia, PA
Job Title: Data Engineer
Experience: 5+ years
We are seeking an experienced Data Engineer with strong expertise in PySpark and data pipeline operations. This role focuses heavily on performance tuning Spark applications, managing large-scale data pipelines, and ensuring high operational stability. The ideal candidate is a strong technical problem-solver, highly collaborative, and proactive in automation and process improvements.
Key Responsibilities:
Data Pipeline Management & Support
Operate and support Business-as-Usual (BAU) data pipelines, ensuring stability, SLA adherence, and timely incident resolution.
Identify and implement opportunities for optimization and automation across pipelines and operational workflows.
Spark Development & Performance Tuning
Design, develop, and optimize PySpark jobs for efficient large-scale data processing.
Diagnose and resolve complex Spark performance issues such as data skew, shuffle spill, executor OOM errors, slow-running stages, and partition imbalance.
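One standard remedy for the data skew listed above is "key salting": splitting a hot key across N sub-keys so its records spread over N tasks instead of overwhelming one executor. The sketch below illustrates the idea in plain Python (no Spark dependency); the hot-key data and bucket count are invented.

```python
# Illustrative sketch (plain Python, not actual Spark code): the key-salting
# idea behind Spark skew mitigation. One hot key would otherwise hash every
# record to a single partition.
import random
from collections import Counter

random.seed(0)
N_SALTS = 4

# 1,000 records share one hot key.
records = [("hot_key", i) for i in range(1000)]

# Salt: append a random bucket id to the key before partitioning.
salted = [((key, random.randrange(N_SALTS)), value) for key, value in records]

# Records now spread across N_SALTS distinct partition keys.
per_partition = Counter(key for key, _ in salted)
assert len(per_partition) == N_SALTS
print(sorted(per_partition.values()))
```

In real PySpark the same effect comes from adding a salt column before a join or groupBy (and replicating the small side across salts), or from enabling adaptive skew-join handling.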
Platform & Tool Management
Use Databricks for Spark job orchestration, workflow automation, and cluster configuration.
Debug and manage Spark on Kubernetes, addressing pod crashes, OOM kills, resource tuning, and scheduling problems.
Work with MinIO/S3 storage for bucket management, permissions, and large-volume file ingestion and retrieval.
Collaboration & Communication
Partner with onshore business stakeholders to clarify requirements and convert them into well-defined technical tasks.
Provide daily coordination and technical oversight to offshore engineering teams.
Participate actively in design discussions and technical reviews.
Documentation & Operational Excellence
Maintain accurate and detailed documentation, runbooks, and troubleshooting guides.
Contribute to process improvements that enhance operational stability and engineering efficiency.
Required Skills & Qualifications:
Primary Skills (Must-Have)
PySpark: Advanced proficiency in transformations, performance tuning, and Spark internals.
SQL: Strong analytical query design, performance tuning, and foundational data modeling (relational & dimensional).
Python: Ability to write maintainable, production-grade code with a focus on modularity, automation, and reusability.
Secondary Skills (Highly Desirable)
Kubernetes: Experience with Spark-on-K8s, including pod diagnostics, resource configuration, and log/monitoring tools.
Databricks: Hands-on experience with cluster management, workflow creation, Delta Lake optimization, and job monitoring.
MinIO / S3: Familiarity with bucket configuration, policies, and efficient ingestion patterns.
DevOps: Experience with Git, CI/CD, and cloud environments (Azure preferred).
Data Scientist
Data scientist job in Camden, NJ
Title: Data Scientist
Duration: Direct Hire
Schedule: Hybrid (Mon/Fri WFH, onsite Tues-Thurs)
Interview Process: 2 rounds, virtual (2nd/final round is a case study)
Salary Range: $95-120k/yr (with benefits)
Must haves:
Minimum 1 year of professional/post-grad data scientist experience, with knowledge across areas such as machine learning, NLP, and LLMs
Proficiency in Python and SQL for data manipulation and pipeline development
Strong communication skills for stakeholder engagement
Bachelor's Degree
Pluses:
Master's Degree
Azure experience (and/or other MS tools)
Experience working with healthcare data, preferably from Epic
Strong skills in data visualization, dashboard design, and interpreting complex datasets
Day to Day:
We are seeking a Data Scientist to join our client's analytics team. This role focuses on leveraging advanced analytics techniques to drive clinical and business decision-making. You will work with healthcare data to build predictive models, apply machine learning and NLP methods, and optimize data pipelines. The ideal candidate combines strong technical skills with the ability to communicate insights effectively to stakeholders.
Key Responsibilities
Develop and implement machine learning models for predictive analytics and clinical decision support.
Apply NLP and LLM techniques to extract insights from structured and unstructured data.
Build and optimize data pipelines using Python and SQL for ETL processes.
Preprocess and clean datasets to support analytics initiatives.
Collaborate with stakeholders to understand data needs and deliver actionable insights.
Interpret complex datasets and provide clear, data-driven recommendations.
Data Modeler
Data modeler job in Philadelphia, PA
Philadelphia, PA
Hybrid / Remote
Brooksource is seeking an experienced Data Modeler to support an enterprise data warehousing team responsible for designing and implementing information solutions across large operational and analytical systems. You'll work closely with data stewards, architects, and DBAs to understand business needs and translate them into high-quality logical and physical data models that align with enterprise standards.
Key Responsibilities
Build and maintain logical and physical data models for the Active Enterprise Data Warehouse (AEDW), operational systems, and data exchange processes.
Collaborate with data stewards and architects to capture and refine business requirements and translate them into scalable data structures.
Ensure physical models accurately implement approved logical models.
Partner with DBAs on schema design, change management, and database optimization.
Assess and improve existing data structures for performance, consistency, and scalability.
Document data definitions, lineage, relationships, and standards using ERwin or similar tools.
Participate in design reviews, data governance work, and data quality initiatives.
Support impact analysis for enhancements, new development, and production changes.
Adhere to enterprise modeling standards, naming conventions, and best practices.
Deliver high-quality modeling artifacts with minimal supervision.
Required Skills & Experience
5+ years as a Data Modeler, Data Architect, or similar role.
Strong expertise with ERwin or other modeling tools.
Experience supporting EDW, ODS, or large analytics environments.
Proficiency developing conceptual, logical, and physical data models.
Strong understanding of relational design, dimensional modeling, and normalization.
Hands-on experience with Oracle, SQL Server, PostgreSQL, or comparable databases.
Ability to translate complex business requirements into clear technical solutions.
Familiarity with data governance, metadata management, and data quality concepts.
Strong communication skills and ability to collaborate across technical and business teams.
Preferred Skills
Experience in healthcare or insurance data environments.
Understanding of ETL/ELT concepts and how data models impact integration workflows.
Exposure to cloud data platforms (AWS, Azure, GCP) or modern modeling approaches.
Knowledge of enterprise architecture concepts.
About the Team
You'll join a collaborative, fast-moving data warehousing team focused on building reliable, scalable information systems that support enterprise decision-making. This role is key in aligning business needs with the data structures that power core operations and analytics.
Senior Data Engineer
Data engineer job in Philadelphia, PA
We are seeking a passionate and skilled Senior Data Engineer to join our dynamic team in Philadelphia, PA. In this role, you will lead the design and implementation of advanced data pipelines for Business Intelligence (BI) and reporting. Your expertise will transform complex data into actionable insights, driving significant business value for our clients.
Key Responsibilities:
Design and implement scalable and efficient data pipelines for BI and reporting.
Define and manage key business metrics, build automated dashboards, and develop analytic self-service capabilities.
Write comprehensive technical documentation to outline data solutions and architectures.
Lead requirements gathering, solution design, and implementation for data projects.
Develop and maintain ETL frameworks for large real-world data (RWD) assets.
Mentor and guide technical teams, fostering a culture of innovation.
Stay updated with new technologies and solve complex data problems.
Facilitate the deployment and integration of AI models, ensuring data quality and compatibility with existing analytics infrastructure.
Collaborate with cross-functional stakeholders to understand data needs and deliver impactful analytics and reports.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
4+ years of SQL experience.
Experience with data modeling, warehousing, and building ETL pipelines.
Proficiency in at least one modern scripting or programming language (e.g., Python, Java, Scala, NodeJS).
Experience working directly with business stakeholders to align data solutions with business needs.
Working knowledge of Snowflake as a data warehousing solution.
Experience with workflow orchestration tools like Apache Airflow.
Knowledge of data transformation tools and frameworks such as dbt (Data Build Tool), PySpark, or Snowpark.
Experience with open-source table formats (e.g., Apache Iceberg, Delta, Hudi).
Familiarity with container technologies like Docker and Kubernetes.
Experience with on-premises and cloud MDM deployments.
Preferred Qualifications:
Proficiency with data visualization tools (e.g., Tableau, Power BI, Quicksight).
Certifications in Snowflake or Azure Data Engineering.
Experience with Agile methodologies and project management tools (e.g., Jira).
Experience deploying and managing data solutions within Azure AI, Azure ML, or similar environments.
Familiarity with DevOps practices, particularly CI/CD for data solutions.
Knowledge of emerging data architectures, including Data Mesh, Data Fabric, Multimodal Data Management, and AI/ML integration.
Familiarity with ETL tools like Informatica and Matillion.
Previous experience in professional services or consultancy environments.
Experience in technical pre-sales, solution demos, and proposal development.
Senior Data Engineer
Data engineer job in Philadelphia, PA
Full-time Perm
Remote - EAST COAST ONLY
Role open to US Citizens and Green Card Holders only
We're looking for a Senior Data Engineer to lead the design, build, and optimization of modern data pipelines and cloud-native data infrastructure. This role is ideal for someone who thrives on solving complex data challenges, improving systems at scale, and collaborating across technical and business teams to deliver high-impact solutions.
What You'll Do
Architect, develop, and maintain scalable, secure data infrastructure supporting analytics, reporting, and operational workflows.
Design and optimize ETL/ELT pipelines to integrate data from diverse internal and external sources.
Prepare and transform structured and unstructured data to support modeling, reporting, and advanced analysis.
Improve data quality, reliability, and performance across platforms and workflows.
Monitor pipelines, troubleshoot discrepancies, and ensure accuracy and timely data delivery.
Identify architectural bottlenecks and drive long-term scalability improvements.
Collaborate with Product, BI, Finance, and engineering teams to build end-to-end data solutions.
Prototype algorithms, transformations, and automation tools to accelerate insights.
Lead cloud-native workflow design, including logging, monitoring, and storage best practices.
Create and maintain high-quality technical documentation.
Contribute to Agile ceremonies, engineering best practices, and continuous improvement initiatives.
Mentor teammates and guide adoption of data platform tools and patterns.
Participate in on-call rotation to maintain platform stability and availability.
What You Bring
Bachelor's degree in Computer Science or related technical field.
4+ years of advanced SQL experience (Oracle, PostgreSQL, etc.).
4+ years working with Java or Groovy.
3+ years integrating with SOAP or REST APIs.
2+ years with DBT and data modeling.
Strong understanding of modern data architectures, distributed systems, and performance optimization.
Experience with Snowflake or similar cloud data platforms (preferred).
Hands-on experience with Git, Jenkins, CI/CD, and automation/testing practices.
Solid grasp of cloud concepts and cloud-native engineering.
Excellent problem-solving, communication, and cross-team collaboration skills.
Ability to lead projects, own solutions end-to-end, and influence technical direction.
Proactive mindset with strong analytical and consultative abilities.
Azure Data Architect
Data engineer job in Malvern, PA
Key Responsibilities
1. Design and Build Data Solutions
Architect and implement modern data platforms using Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Data Lake Storage, and Cosmos DB.
Develop and maintain data pipelines for ingestion, transformation, and storage.
Design data models and schemas to support business intelligence and advanced analytics.
2. Data Governance and Compliance
Implement data security, privacy, and compliance measures.
Establish governance frameworks for data quality and lifecycle management.
3. Collaboration and Leadership
Partner with business stakeholders, data scientists, and engineering teams to align architecture with business objectives.
Provide technical leadership, mentoring, and enforce best practices for data engineering teams.
4. Performance and Optimization
Monitor system performance and troubleshoot issues.
Optimize architecture for cost efficiency, scalability, and high availability.
5. Innovation and Enablement
Stay updated on emerging Azure technologies and industry trends.
Conduct workshops and enablement sessions to drive adoption of Azure data solutions.
Required Skills and Experience
Technical Expertise
Strong proficiency in Azure services: Data Factory, Synapse, Databricks, Data Lake, Power BI.
Hands-on experience with data modeling, ETL design, and data warehousing.
Knowledge of SQL, NoSQL, PySpark, and BI tools.
Architecture and Strategy
7+ years in data architecture roles; 3+ years with Azure data solutions.
Familiarity with Lakehouse architecture, Delta/Parquet formats, and data governance tools.
Preferred Qualifications
Experience in regulated industries (e.g., Financial Services).
Knowledge of Microsoft Fabric, Generative AI, and RAG-based architectures.
Education & Certifications
Bachelor's or Master's degree in Computer Science, Information Systems, or related fields.
Certifications such as Microsoft Certified: Azure Solutions Architect Expert or Azure Data Engineer Associate are highly desirable.
Time-Series Data Engineer
Data engineer job in Doylestown, PA
Local Candidates Only - No Sponsorship
A growing technology company in the Warrington, PA area is seeking a Data Engineer to join its analytics and machine learning team. This is a hands-on, engineering-focused role working with real operational time-series data, not a dashboard or BI-heavy position. We're looking for someone who's naturally curious, self-driven, and enjoys taking ownership. If you like solving real-world problems, building clean and reliable data systems, and contributing ideas that actually get implemented, you'll enjoy this environment.
About the Role
You will work directly with internal engineering teams to build and support production data pipelines, deploy Python-based analytics and ML components, and work with high-volume time-series data from complex systems. This is a hybrid position requiring regular on-site collaboration.
What You'll Do
● Build and maintain data pipelines for time-series and operational datasets
● Deploy Python and SQL-based data processing components using cloud resources
● Troubleshoot issues, optimize performance, and support new customer implementations
● Document deployment workflows and data behaviors
● Work with engineering/domain specialists to identify opportunities for improvement
● Proactively correct inefficiencies; if something can work better, you take the initiative
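As one concrete flavor of the time-series troubleshooting described above, the sketch below detects gaps in a sensor feed against its expected cadence. The timestamps and the five-minute cadence are invented for illustration.

```python
# Hedged sketch (invented readings): gap detection in an operational
# time-series -- flag any interval longer than the expected sensor cadence.
from datetime import datetime, timedelta

expected = timedelta(minutes=5)
stamps = [datetime(2024, 1, 1, 0, m) for m in (0, 5, 10, 25, 30)]

# Compare each consecutive pair of timestamps against the cadence.
gaps = [(a, b) for a, b in zip(stamps, stamps[1:]) if b - a > expected]
print(gaps)  # one gap: between 00:10 and 00:25
```

The same pattern scales up with a time-series library, but the comparison logic is identical.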
Required Qualifications
● 2+ years of professional experience in data engineering, data science, ML engineering, or a related field
● Strong Python and SQL skills
● Experience with time-series data or operational/industrial datasets (preferred)
● Exposure to cloud environments; Azure experience is a plus but not required
● Ability to think independently, problem-solve, and build solutions with minimal oversight
● Strong communication skills and attention to detail
Local + Work Authorization Requirements (Strict)
● Must currently live within daily commuting distance of Warrington, PA (Philadelphia suburbs / Montgomery County / Bucks County / surrounding PA/NJ areas)
● No relocation, no remote-only applicants
● No sponsorship-must be authorized to work in the U.S. now and in the future
These requirements are firm and help ensure strong team collaboration.
What's Offered
● Competitive salary + bonus potential
● Health insurance and paid time off
● Hybrid work flexibility
● Opportunity to grow, innovate, and have a direct impact on meaningful technical work
● Supportive, engineering-first culture
If This Sounds Like You
We'd love to hear from local candidates who are excited about Python, data engineering, and solving real-world problems with time-series data.
Work Authorization:
Applicants must have valid, independent authorization to work in the United States. This position does not offer, support, or accept any form of sponsorship-whether employer, third-party, future, contingent, transfer, or otherwise. Candidates must be able to work for any employer in the U.S. without current or future sponsorship of any kind. Work authorization will be verified, and misrepresentation will result in immediate removal from consideration.
Sr. Cloud Data Engineer
Data engineer job in Malvern, PA
Job Title: Sr. Cloud Data Engineer
Duration: 12 months+ Contract
Contract Description:
Responsibilities:
Maintain and optimize AWS-based data pipelines to ensure timely and reliable data delivery.
Develop and troubleshoot workflows using AWS Glue, PySpark, Step Functions, and DynamoDB.
Collaborate on code management and CI/CD processes using Bitbucket, GitHub, and Bamboo.
Participate in code reviews and repository management to uphold coding standards.
Provide technical guidance and mentorship to junior engineers and assist in team coordination.
Qualifications:
9-10 years of experience in data engineering with strong hands-on AWS expertise.
Proficient in AWS Glue, PySpark, Step Functions, and DynamoDB.
Skilled in managing code repositories and CI/CD pipelines (Bitbucket, GitHub, Bamboo).
Experience in team coordination or mentoring roles.
Familiarity with Wealth Asset Management, especially personal portfolio performance, is a plus.
Cloud Security Principal Engineer
Cloud security engineer job in Philadelphia, PA
Pride Health is hiring a Cloud Security Principal Engineer to support our client's medical facility based in Pennsylvania.
This is a 6-month contract with the possibility of an extension, competitive pay and benefits, and a great way to start working with a top-tier healthcare organization.
Job Title: Cloud Security Principal Engineer
Location: Philadelphia, PA 19104 (Hybrid)
Pay Range: $75.00/hr. - $80.00/hr.
Shift: Day Shift
Duration: 6 months + Possible extension
Job Duties:
Proven experience in securing a multi-cloud environment.
Proven experience with Identity and access management in the cloud.
Proven experience with all security service lines in a cloud environment and the supporting security tools and processes to be successful.
Demonstrate collaboration with internal stakeholders, vendors, and supporting teams to design, implement, and maintain security technologies across the network, endpoint, identity, and cloud infrastructure.
Drive continuous improvement and coverage of cloud security controls by validating alerts, triaging escalations, and working with the MSP to fine-tune detection and prevention capabilities.
Lead or support the development of incident response plans, engineering runbooks, tabletop exercises, and system hardening guides.
Ensure alignment of security architectures with policies, standards, and external frameworks such as NIST SP 800-53, HIPAA, PCI-DSS, CISA ZTMM, CIS Benchmarks, the Microsoft CAF Secure Methodology, AWS CAF, the AWS Well-Architected Framework, and Google CAF.
Participate in design and governance forums to provide security input into infrastructure, DevSecOps, and cloud-native application strategies.
Assist with audits, compliance assessments, risk remediation plans, and evidence collection with internal compliance and external third-party stakeholders.
Required
Bachelor's Degree
At least twelve (12) years of industry-related experience, including experience in one to two IT disciplines (such as technical architecture, network management, application development, middleware, information analysis, database management or operations) in a multitier environment.
At least six (6) years' experience with information security, regulatory compliance and risk management concepts.
At least three (3) years' experience with Identity and Access Management, user provisioning, Role Based Access Control, or control self-assessment methodologies and security awareness training.
Experience with Cloud and/or Virtualization technologies.
As a certified minority-owned business, Pride Global and its affiliates - including Russell Tobin, Pride Health, and Pride Now - are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics.
Pride Global offers eligible employees comprehensive healthcare coverage (medical, dental, and vision plans), supplemental coverage (accident insurance, critical illness insurance, and hospital indemnity), a 401(k) retirement savings plan, life & disability insurance, an employee assistance program, identity theft protection, legal support, auto and home insurance, pet insurance, and employee discounts with some preferred vendors.
SRE/DevOps w/ HashiCorp & Clojure Exp
SRE/DevOps job in Philadelphia, PA
Locals Only! SRE/DevOps w/ HashiCorp & Clojure Exp | Philadelphia, PA: 100% Onsite! | 12+ Months
MUST: HashiCorp, Clojure
Role: Lead SRE initiatives, automating and monitoring cloud infrastructure to ensure reliable, scalable, and secure systems for eCommerce.
Required: Must Have:
AWS, Terraform, HashiCorp Stack (Nomad, Vault, Consul)
Programming in Python/Clojure
Automation, monitoring, and log centralization (Splunk)
Experience leading large-scale cloud infrastructure
Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support.
Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Lead Software Engineer
Software engineer job in Philadelphia, PA
Lead Software Engineer - Prominent Financial Services Firm
Our client is a global financial services group which operates across more than 30 markets worldwide, providing a diverse range of banking, financial, advisory, and investment solutions to corporate, institutional, and retail clients. With a heritage spanning over five decades, the organization has built a reputation for specialist expertise in areas including asset management, commodities and global markets, corporate advisory, and infrastructure investment. The firm combines the capabilities of a major international financial institution with a focus on innovation and deep sector knowledge, serving clients across various industries while maintaining a strong commitment to sustainable and responsible business practices.
This position combines direct technical contributions with leadership of your engineering team. You'll guide and develop a small group of engineers through consistent one-on-one meetings, continuous feedback, and personalized career development strategies. Your leadership will ensure successful delivery by orchestrating sprint planning, prioritizing team workloads, and clearing roadblocks that impede progress on quality software releases. Working in close partnership with Product Managers and Business Analysts, you'll establish rigorous standards for code excellence and operational stability while taking full accountability for your team's performance against demanding Service Level Objectives (SLOs) covering availability, performance, and security.
Drawing on your technical background, you'll architect and design robust, scalable microservices and systems tailored to financial industry standards. Beyond planning, you'll write code directly, particularly for intricate and mission-critical features, and perform comprehensive code reviews to uphold quality benchmarks. Your responsibilities extend to defining the technical vision and roadmap, making strategic decisions about system architecture and technology selection, and acting as the senior technical resource who guides the team through complex production challenges and troubleshooting scenarios.
Key Responsibilities / Requirements:
7+ years experience as a full-stack developer with expertise in Python
Frontend development experience using React and TypeScript (candidates with Angular or JavaScript backgrounds welcomed if willing to work on UI; AI-assisted development tools available)
Cloud infrastructure experience with AWS as the primary platform; familiarity with Google Cloud for data warehousing solutions (BigQuery)
Proficiency with Airflow for workflow orchestration and automation
Advanced SQL and DBT competency required - data fluency is essential for all team members
Effective at engaging directly with business stakeholders and collaborating with Business Analysts
Background integrating third-party platforms including Aladdin (public markets), Sentry (private markets), and Titan
Asset/liability modeling experience within reinsurance contexts is advantageous
Financial services background preferred, particularly with capital markets systems or portfolio management tools
Knowledge across various asset classes is desirable though not mandatory
Qualifications:
Technical contributor who can simultaneously manage small engineering teams while establishing design standards, architectural patterns, and security protocols
Bachelor's degree
Exceptional communication and negotiation skills
Ability to work in a fast-paced, high-pressure environment
About SEQ Technology:
SEQ is a leading provider of IT Solutions, contract staffing, and permanent employment services. Our firm excels in networking with hard-to-reach candidates for positions ranging from entry-level to Subject Matter Experts and C-level executives. We have extensive expertise in serving buy side financial firms, hedge funds, large sell side financial institutions, and private equity.
SEQ Technology is an Equal Opportunity Employer.
We are committed to fostering a diverse and inclusive workplace that values and respects individuals from all backgrounds. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other protected characteristic. We actively encourage applications from members of underrepresented groups and are dedicated to building a team that reflects the diversity of the communities we serve.
Identity and Access Management - Software Engineering Lead
Software engineering lead job in Philadelphia, PA
About the role - As Engineering Lead for NeoID, Elsevier's next-generation Identity and Access Management (IAM) platform, you'll leverage your deep security expertise to architect, build, and evolve the authentication and authorization backbone for Elsevier's global products. You'll also lead and manage a team of 5 engineers, fostering their growth and ensuring delivery excellence. You'll have the opportunity to work with industry standard protocols such as OAuth2, OIDC and SAML, as well as healthcare's SMART on FHIR and EHR integrations.
About the team - This diverse team of engineers is entrusted with building NeoID, a brand-new cybersecurity product that will provide authorization and authentication for all Elsevier products, and with evolving the authentication and authorization backbone for Elsevier's global products.
Qualifications
Current and extensive experience with at least one major IAM platform (KeyCloak, Auth0, Okta, or similar) - KeyCloak and Auth0 experience are strong pluses. Only candidates with this experience will be considered for this critical role.
Possess an in-depth security mindset, with proven experience designing and implementing secure authentication and authorization systems
Have an extensive understanding of OAuth2, OIDC and SAML protocols, including relevant RFCs and enterprise/server-side implementations
Familiarity with healthcare identity protocols, including SMART on FHIR and EHR integrations
Have current hands-on experience with AWS cloud services and infrastructure management. Proficiency in Infrastructure as Code (IaC) tools, especially Terraform
Strong networking skills, including network security, protocols, and troubleshooting
Familiarity with software development methodologies (Agile, Waterfall, etc.)
Current experience as a people manager, ideally of software and security professionals.
Experience with Java/J2EE, JavaScript, and related technologies, or willingness to learn and deepen expertise
Knowledge of data modeling, optimization, and secure data handling best practices
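The OAuth2/OIDC work described above revolves around tokens, most commonly JWTs. As a minimal stdlib sketch, the code below decodes (but deliberately does not verify) a JWT payload; the token is hand-built and the claims are invented. A real IAM system must verify the signature with a proper JOSE library before trusting any claim.

```python
# Illustrative sketch: decoding (NOT verifying!) a JWT's payload with the
# stdlib, to show the three-part header.payload.signature structure that
# OAuth2/OIDC bearer tokens use. The signature below is fake.
import base64
import json

def b64url_decode(seg: str) -> bytes:
    # JWT segments are base64url without padding; restore it before decoding.
    pad = "=" * (-len(seg) % 4)
    return base64.urlsafe_b64decode(seg + pad)

# Hand-built token with invented claims (issuer URL is hypothetical).
header = base64.urlsafe_b64encode(json.dumps({"alg": "RS256"}).encode()).rstrip(b"=")
payload = base64.urlsafe_b64encode(
    json.dumps({"sub": "user1", "iss": "https://issuer.example"}).encode()
).rstrip(b"=")
token = b".".join([header, payload, b"sig"]).decode()

claims = json.loads(b64url_decode(token.split(".")[1]))
print(claims["sub"])  # user1
```

Platforms like KeyCloak, Auth0, and Okta issue and validate these tokens for you; the structure is what an engineer debugs when an integration fails.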
Accountabilities
Leading the design and implementation of secure, scalable IAM solutions, with a focus on OAuth2/OIDC and healthcare protocols such as SMART on FHIR and EHR integrations
Managing, mentoring and supporting a team of 5 engineers, fostering a culture of security, innovation, and technical excellence
Collaborating with product managers and stakeholders to define requirements and strategic direction for the platform, including healthcare and life sciences use cases
Writing and reviewing code, performing code reviews, and ensuring adherence to security and engineering best practices
Troubleshooting and resolving complex technical issues, providing expert guidance on IAM, security, and healthcare protocol topics
Contributing to architectural decisions and long-term platform strategy
Staying current with industry trends, emerging technologies, and evolving security threats in the IAM and healthcare space
Prompt Engineer
Data engineer job in Wilmington, DE
Experience in the following:
Project Task Estimation
Resource Scheduling
Risk and Issue Management
Waterfall and Agile methods
Work breakdown and critical path method
Stakeholder Management
Critical Thinking (anticipates problems and establishes methods to mitigate project/program risks)
Benefits Management
Budget/Financial Management including Business Case completion
Executive communications for senior management
Prompt Engineering and Refinement
- Design, test, and optimize prompts to improve LLM accuracy and relevance.
- Analyze model responses to identify areas for prompt improvement.
- Collaborate with internal teams to develop prompt templates and best practices.
- Maintain a repository of effective prompts and document changes for future reference.
Documentation and Reporting
- Prepare clear documentation of processes, findings, and recommendations.
- Assist in drafting user guides, FAQs, and training materials for regulatory reviewers.
Collaboration and Communication
- Work closely with internal teams (AI/ML, compliance, product) to align on requirements and priorities.
- Participate in regular project meetings and status updates.
MLOps Engineer
Data engineer job in Philadelphia, PA
Role: MLOps Lead
Duration: Long Term
Skills:
4 - 7 years of experience in DevOps, MLOps, platform engineering, or cloud infrastructure.
Strong skills in containerization (Docker, Kubernetes), API hosting, and cloud-native services.
Experience with vector DBs (e.g., FAISS, Pinecone, Weaviate) and model hosting stacks.
Familiarity with logging frameworks, APM tools, tracing layers, and prompt/versioning logs.
Bonus: exposure to LangChain, LangGraph, LLM APIs, and retrieval-based architectures.
Responsibilities :
Set up and manage runtime environments for LLMs, vector DBs, and orchestration flows (e.g., LangGraph).
Support deployments in cloud, hybrid, and client-hosted environments.
Containerize systems for deployment (Docker, Kubernetes, etc.) and manage inference scaling.
Integrate observability tooling: prompt tracing, version logs, eval hooks, error pipelines.
Collaborate on RAG stack deployments (retriever, ranker, vector DB, toolchains).
Support CI/CD, secrets management, error triage, and environment configuration.
Contribute to platform-level IP, including reusable scaffolding and infrastructure accelerators.
Ensure systems are compliant with governance expectations and auditable (esp. in insurance contexts).
Preferred Attributes :
Systems thinker with strong debugging skills.
Able to work across cloud, on-prem, and hybrid client environments.
Comfortable partnering with architects and engineers to ensure smooth delivery.
Proactive about observability, compliance, and runtime reliability.
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
Our culture - Our fuel
At ValueMomentum, we believe in making employees win by nurturing them from within, collaborating and looking out for each other.
People first - We make employees win.
Nurture leaders - We nurture from within.
Enjoy wins - Celebrating wins and creating leaders.
Collaboration - A culture of collaboration and people-centricity.
Diversity - Committed to diversity, equity, and inclusion.
Fun - Help people have fun at work.
Software Engineer
Data engineer job in Wilmington, DE
Are you an experienced Software Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Software Engineer to work at their office in Wilmington, DE.
The Software Engineer will join our Research Custom Apps unit at the client to work on various projects involving front-end, back-end, database, and reporting technologies. They will primarily work with the team lead on work assignments and collaborate with business users and other IT technical teams to gather requirements or additional information. This person will be responsible for developing, testing, documenting, and supporting web and mobile applications using a range of tools and platforms. You will also be required to provide on-call support as needed.
Qualifications:
Front-end: React and React Native.
Languages: .NET, C#
OS: Linux, Unix, Windows
Back-end: REST API, Microservices, Node.js
Databases: MSSQL, Postgres
Tools: ADO, Visual Studio, JetBrains Suite, Play Store apps
Documentation: Functional, Technical specifications and support documentation
Bachelor's degree in Computer Science/Engineering or a related field, or an Associate's degree in Computer Science/Engineering or a related field with an additional two (2) years of experience as described below:
Working knowledge and experience in Software Engineering
Preferred:
Front-end: Angular, Vue
Languages: JavaScript, TypeScript
Databases: SQL, NoSQL (MongoDB or CouchDB), MS Access
Tools: SSMS, SSIS
Reporting: SSRS, Power BI, and Tableau
Platforms: Azure, GCP
OS: Linux, Unix, Windows
General: Microsoft Office, SharePoint
Software Engineer
Data engineer job in Malvern, PA
Day-to-Day Responsibilities:
Develop and deploy full-stack applications using AWS services (Lambda, S3, DynamoDB, ECS, Glue, Step Functions, and more).
Design, build, and maintain REST and GraphQL APIs and microservices using Python, Java, JavaScript, and Go.
Apply DevOps principles with CI/CD pipelines using Bamboo, Bitbucket, Git, and JIRA.
Monitor product health and troubleshoot production issues with tools like Honeycomb, Splunk, and CloudWatch.
Collaborate with stakeholders to gather requirements, present demos, and coordinate tasks across teams.
Resolve complex technical challenges and recommend enterprise-wide improvements.
Must-Haves:
Minimum 5 years of related experience in software development.
Proficient in AWS services, full-stack development, and microservices.
Experience with Python, Java, JavaScript, and Go.
Strong DevOps experience and familiarity with CI/CD pipelines.
Ability to learn new business domains and applications quickly.
Nice-to-Haves:
Experience with monitoring/observability tools like Honeycomb, Splunk, CloudWatch.
Familiarity with serverless and large-scale cloud architectures.
Agile or Scrum experience.
Strong communication and stakeholder collaboration skills.
Sr AWS Developer
Data engineer job in Malvern, PA
Role: AWS Developer
Project: GIDS Investment Product Valuations Taxonomy
Domain: GIDS Investment (“investment product master,” “security master,” “reference data,” “pricing and valuations,” “NAV calculation,” “fund accounting,” “portfolio valuations,” or “taxonomy/ontology” experience in financial services)
Prior work with asset managers, custodians, or banks, where you modeled funds, ETFs, accounts, benchmarks, or instrument hierarchies and exposed them via services or data platforms.
Strong hands-on experience in designing, developing, and deploying scalable full-stack applications using AWS technologies (S3, DynamoDB, Postgres, Lambda, CloudFormation, EventBridge, IAM, Glue, Athena).
· Experienced in API design (REST, GraphQL/Super Graph), and microservices.
· Experienced in creating REST APIs using Python, JavaScript, and Go.
· Experienced in development using DevOps principles, tools, and continuous delivery pipelines, including Bamboo, Bitbucket, JIRA, and Git.
· Provides technical expertise and completes complex development, design, implementation, architecture design specification, and maintenance activities.
· Monitor product health in test and production environments using Honeycomb, Splunk, and AWS CloudWatch.
· Responsible for promoting complex code into the development, test, and production environments.
· Resolves highly complex, elevated issues and recommend enterprise-wide improvements and solutions.
· Actively work with the business and stakeholders on requirements, give demos after development, and reach out to different teams as needed to accomplish delivery.
Microsoft Dynamics 365 CE Data Migration Consultant
Data engineer job in Middletown, PA
Data-Core Systems, Inc. is a provider of information technology, consulting, and business process services. We offer breakthrough tech solutions and have worked with companies, hospitals, universities, and government organizations. A proven partner with a passion for client satisfaction, we combine technology innovation, business process expertise, and a global, collaborative workforce that exemplifies the future of work. For more information about Data-Core Systems, Inc., please visit *****************************
Our client operates a roadway system, and as part of their digital transformation they are implementing a solution based on SAP BRIM & Microsoft Dynamics CE.
Data-Core Systems Inc. is seeking a Microsoft Dynamics 365 CE Data Migration Consultant to be a part of our Consulting team. You will be responsible for planning, designing, and executing the migration of customer, account, vehicle, financial, and transaction data from a variety of source systems, including legacy CRMs, ERPs, SQL databases, flat files, Excel, cloud platforms, and tolling systems, into Microsoft Dynamics 365 Customer Engagement (CE). This role involves understanding complex data models, extracting structured and unstructured data, transforming and mapping it to Dynamics CE entities, and ensuring data quality, integrity, and reconciliation throughout the migration lifecycle.
Roles & Responsibilities:
Analyze source system data structures, including customer profiles, accounts, vehicles, transponders, payment methods, transactions, violations, invoices, and billing records
Identify critical data relationships, parent/child hierarchies, and foreign key dependencies
Develop detailed data mapping and transformation documentation from source systems to Dynamics 365 CE entities (standard and custom)
Build, test, and execute ETL pipelines using tools such as SSIS/KingswaySoft, Azure Data Factory, Power Platform Dataflows, or custom .NET utilities
Perform data cleansing, normalization, deduplication, and standardization to meet Dynamics CE data model requirements
Execute multiple migration cycles, including test loads, validation, and final production migration
Ensure referential integrity, high data quality, and accuracy of historical data
Generate reconciliation reports, resolve data inconsistencies, and troubleshoot migration errors
Document migration strategies, execution runbooks, and transformation rules for future reference
Required Skills & Experience:
8-12 years of proven experience migrating data from tolling systems, transportation platforms, legacy CRMs, or other high-volume transactional systems
Strong SQL skills for complex queries, stored procedures, data transformation, and data validation
Hands-on experience with Microsoft Dynamics 365 CE / CRM data model, entities, and relationships
Proficiency with ETL/migration tools: SSIS with KingswaySoft, Azure Data Factory, Power Platform Dataflows, Custom C#/.NET migration scripts
Experience with large-scale migrations involving millions of records
Strong understanding of relational data structures such as Customer, Account, Vehicle, Transponder, and Transaction
Ability to analyze large datasets, identify anomalies, and resolve inconsistencies
Bachelor's degree in Engineering or Technology from a recognized university
Preferred Skills & Experience:
Experience with financial transactions, billing data, or violation/enforcement records.
Experience in enterprise-scale Dynamics 365 CE migrations.
Familiarity with data governance, security, and compliance requirements for financial or transportation data.
Knowledge of historical data migration and archival strategies.
We are an equal opportunity employer.
Data Engineer
Data engineer job in Philadelphia, PA
Data Engineer - Job Opportunity
Full time Permanent
Remote - East coast only
Please note this role is open to US citizens or Green Card holders only
We're looking for a Data Engineer to help build and enhance scalable data systems that power analytics, reporting, and business decision-making. This role is ideal for someone who enjoys solving complex technical challenges, optimizing data workflows, and collaborating across teams to deliver reliable, high-quality data solutions.
What You'll Do
Develop and maintain scalable data infrastructure, cloud-native workflows, and ETL/ELT pipelines supporting analytics and operational workloads.
Transform, model, and organize data from multiple sources to enable accurate reporting and data-driven insights.
Improve data quality and system performance by identifying issues, optimizing architecture, and enhancing reliability and scalability.
Monitor pipelines, troubleshoot discrepancies, and resolve data or platform issues, including participating in on-call support when needed.
Prototype analytical tools, automation solutions, and algorithms to support complex analysis and drive operational efficiency.
Collaborate closely with BI, Finance, and cross-functional teams to deliver robust and scalable data products.
Create and maintain clear, detailed documentation (configurations, specifications, test scripts, and project tracking).
Contribute to Agile development processes, engineering excellence, and continuous improvement initiatives.
What You Bring
Bachelor's degree in Computer Science or a related technical field.
2-4 years of hands-on SQL experience (Oracle, PostgreSQL, etc.).
2-4 years of experience with Java or Groovy.
2+ years working with orchestration and ingestion tools (e.g., Airflow, Airbyte).
2+ years integrating with APIs (SOAP, REST).
Experience with cloud data warehouses and modern ELT/ETL frameworks (e.g., Snowflake, Redshift, dbt) is a plus.
Comfortable working in an Agile environment.
Practical knowledge of version control and CI/CD workflows.
Experience with automation, including unit and integration testing.
Understanding of cloud storage solutions (e.g., S3, Blob Storage, Object Store).
Proactive mindset with strong analytical, logical-thinking, and consultative skills.
Ability to reason about design decisions and understand their broader technical impact.
Strong collaboration, adaptability, and prioritization abilities.
Excellent problem-solving and troubleshooting skills.