Role: Senior Data Engineer
Term: Contract
The Team
You will be part of a high-performing Data Engineering & Analytics team responsible for building scalable, cloud-native data platforms on AWS. The team partners closely with product, analytics, and business stakeholders to deliver reliable data pipelines that support advanced analytics, reporting, and data-driven decision-making.
This team emphasizes modern data engineering practices, reusable frameworks, performance optimization, and production-grade data solutions.
As a Senior Data Engineer, you will design, build, and maintain end-to-end data pipelines leveraging Talend Cloud (8.0), PySpark, and AWS services. You will play a key role in ingesting, transforming, and optimizing large-scale structured and unstructured datasets while ensuring scalability, performance, and data quality across the platform.
Key responsibilities include:
Designing and developing ETL/ELT workflows using Talend 8.0 on Cloud
Integrating data from APIs, flat files, and streaming sources
Ingesting and managing data in AWS S3-based data lakes
Developing PySpark jobs for large-scale data processing and transformations (see the sketch after this list)
Implementing Spark SQL for complex transformations and schema management
Building and supporting cloud-native data pipelines using AWS services such as Glue, Lambda, Athena, and EMR
Applying performance tuning and optimization techniques for Spark workloads
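For illustration only, here is a minimal PySpark/Spark SQL sketch of the kind of S3-based transformation these responsibilities describe. It does not represent this employer's actual pipelines; the bucket paths, dataset, and column names are invented.

```python
# Minimal sketch of an S3-based PySpark transformation (hypothetical paths/columns).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Ingest raw JSON landed in an S3 data lake (bucket/prefix are placeholders).
orders = spark.read.json("s3://example-data-lake/raw/orders/")
orders.createOrReplaceTempView("orders")

# Use Spark SQL for the transformation and schema shaping.
daily = spark.sql("""
    SELECT order_date,
           customer_id,
           SUM(order_total) AS total_spend,
           COUNT(*)         AS order_count
    FROM orders
    GROUP BY order_date, customer_id
""")

# Write a partitioned, columnar copy back to the curated zone for Athena/Glue readers.
(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-data-lake/curated/orders_daily/"))
```

On Glue or EMR, logic like this would typically run as a scheduled, parameterized job rather than an ad hoc script.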
What You Will Bring
10+ years of experience in data engineering, big data, or ETL development roles
Strong hands-on expertise with Talend ETL (Talend Cloud / Talend 8.x)
Advanced experience in PySpark and Spark SQL for large-scale data processing
Proficiency in Python, including building reusable data transformation modules
Solid experience with AWS data services, including:
S3 for data lake storage and lifecycle management
Glue for ETL/ELT orchestration
Lambda for event-driven processing
Athena for serverless analytics
EMR for Spark/PySpark workloads
Strong understanding of ETL/ELT patterns, data lakes, and distributed systems
Ability to optimize performance, ensure data quality, and build production-ready pipelines
Excellent collaboration skills and experience working with cross-functional teams
$81k-110k yearly est. 1d ago
Senior Data Engineer
United States Liability Insurance Group 4.4
Data engineer job in Wayne, PA
Notice: Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time. However, USLI reserves its right to provide employment-based immigrant visa assistance on a discretionary basis.
Senior Data Engineer
Location: Wayne, PA
Team: Application Development
Job Type: Information Technology
FT/PT Status: Full Time
About Us: At USLI, we're not just about insurance - we are committed to making a difference, both internally and externally. Our community is built upon five values: Caring, Attitude, Respect, Empathy and Energy. Our commitment to these values leads us to make better decisions and furthers our true sense of community. By joining our team, you'll be part of a vibrant organization that values innovation, collaboration and growth. Here, you'll have the opportunity to shape the future of insurance and make a meaningful impact.
Your Role: As a data engineer, you will lead the design, development and deployment of scalable data solutions across cloud and on-premises environments. You will oversee a high-performing team in building modern data products and platforms, applying core software engineering principles to data engineering challenges. This is a technical leadership role that requires adaptability, mentorship and hands-on expertise in data systems. You will help shape our data architecture, mentor others and ensure the delivery of high-quality, resilient solutions.
Key Responsibilities:
* Team leadership: Provide day-to-day oversight and technical direction for data engineering initiatives, offering guidance and support
* Data engineering: Lead the design and development of data pipelines using Apache Spark and Databricks, incorporating best practices in performance, scalability and maintainability (a brief illustrative sketch follows this list)
* Cloud and Lakehouse solutions: Build and optimize data solutions in Azure, leveraging cloud-native PaaS tools and open-source technologies, such as Delta Lake
* Collaboration: Partner closely with software engineers, the architecture team and product owners to deliver data solutions aligned with business goals
* Mentorship: Mentor engineers on data modeling, performance tuning and platform design while fostering a collaborative and growth-oriented team culture
* Innovation: Stay current on emerging trends in cloud data platforms, lakehouse architecture and open-source tools and offer recommendations to improve our stack and practice
* Problem-solving: Apply strong analytical and problem-solving skills to troubleshoot data pipeline issues across varied systems
* Professional development: Participate in and promote learning opportunities through the annual training and development plan, team knowledge sharing and mentorship
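Purely as a hedged illustration of the Spark/Databricks/Delta Lake work named above (not USLI's actual code; the landing path, schema, and table names are assumptions):

```python
# Hypothetical Databricks notebook cell: incremental load into a Delta table.
# Assumes a `spark` session is already provided by the Databricks runtime.
from pyspark.sql import functions as F

# Read newly landed policy records (placeholder path).
raw = spark.read.format("json").load("/mnt/landing/policies/")

# Light cleansing/conforming before persisting.
conformed = (raw
    .withColumn("effective_date", F.to_date("effective_date"))
    .dropDuplicates(["policy_id"]))

# Persist as a Delta table so downstream readers get ACID guarantees and time travel.
(conformed.write
    .format("delta")
    .mode("append")
    .saveAsTable("insurance.policies_bronze"))
```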
What You'll Bring:
* Technical skills:
* Experience building data assets, products and pipelines with Apache Spark and Databricks
* Familiarity with Azure cloud services and PaaS offerings (or equivalent experience with other cloud providers)
* Recent data engineering experience with Scala or Java (or willingness to learn) is a plus
* Programming skills in SQL and experience with relational database management systems (RDBMS)
* Familiarity with NoSQL databases
* Experience with open-source data lake storage technologies such as Delta Lake
* Knowledge of data lakehouse architectures
* Familiarity with one statically typed enterprise programming language (e.g., C#, Java) is a plus
* Adaptability: Ability to context-switch between multiple priorities and rapidly changing project scopes
* Communication: Strong written and verbal communication skills to clearly convey technical decisions and mentor others
* Leadership: A collaborative, hands-on approach to leadership and technical problem-solving
Qualifications:
* College degree in computer science, software engineering or a related field, or equivalent industrial or technical experience
* Fundamental knowledge of data structures and algorithms is required
* Ten or more years of data and/or software engineering experience
* Experience working in Agile/SCRUM environments and with modern DevOps practices
* 9 a.m. to 5 p.m. ET schedule, with some overtime as needed
* All positions are required to work on-site 75% of the time, unless indicated otherwise in the job description
What We Offer: One of the advantages of working at USLI is the competitive salary and benefits program we offer full-time and eligible part-time employees. Benefits include performance-based triannual bonuses, medical benefits paid at 100% for full-time employees and 80% for eligible part-time employees, a profit-sharing program, free lunch every day while onsite and more than 450 annual personal and professional development courses. Explore more company benefits.
Why USLI? At USLI, we are committed to fostering a vibrant and inclusive community that celebrates the rich diversity of all ethnicities, nationalities, abilities, genders, gender identities, sexual orientations, ages, religions, socioeconomic backgrounds and life experiences. We understand the importance of continuous learning, self-reflection, acknowledging our biases and expanding our perspectives beyond our own. We actively encourage open dialogue on diversity, equity, inclusion, and belonging to support a workplace where every individual feels valued, respected and empowered to contribute at their fullest potential. Join us in building a diverse and inclusive environment where our shared values drive us toward excellence.
$87k-116k yearly est. 60d+ ago
Data Engineer
Penn State University
Data engineer job in Parkesburg, PA
APPLICATION INSTRUCTIONS:
* CURRENT PENN STATE EMPLOYEE (faculty, staff, technical service, or student): please log in to Workday to complete the internal application process. Please do not apply here; apply internally through Workday.
* CURRENT PENN STATE STUDENT (not employed previously at the university) seeking employment with Penn State: please log in to Workday to complete the student application process. Please do not apply here; apply internally through Workday.
* If you are NOT a current employee or student, please click "Apply" and complete the application process for external applicants.
JOB DESCRIPTION AND POSITION REQUIREMENTS:
We are seeking a talented, experienced, and highly motivated Data Research Engineer to join the Computational Intelligence and Visualization Applications Department of the Applied Research Laboratory (ARL) at Penn State. You will assist in providing our customers with state-of-the-art visualization and decision-support software solutions.
ARL's purpose is to research and develop innovative solutions to challenging scientific, engineering, and technology problems in support of the Navy, the Department of Defense (DoD), and the Intel Community (IC).
ARL is an authorized DoD SkillBridge partner and welcomes all transitioning military members to apply.
You will:
* Assemble large, complex sets of data that meet research requirements
* Build required infrastructure for optimal extraction, transformation and loading of data from various data sources using cloud and SQL technologies
* Integrate analytical tools to utilize the data pipeline, solving research problems posed by the stakeholders and data science team
* Work with a team of engineers, faculty, students and customers to assist them with data-related technical and resource issues
* Execute tasking within an Agile development process
* Keep current with relevant emerging technologies and trends by attending conferences and workshops relevant to department and project goals
Additional responsibilities of the higher level include:
* Coordinate data engineering-related research and development activities between disciplines involving exploration of subject area, definition of scope and selection of problems for investigation and development of novel concepts and approaches
* Mentor and train employees in the development of data engineering-related technical, project, and business development skills
This job will be filled at the Intermediate Professional or Advanced Professional level, depending upon the successful candidate's education and experience. Minimally requires a Bachelor's Degree in an Engineering or Science discipline plus 2 years of related experience for the intermediate professional level. Additional education and/or experience required for higher level positions.
Required skills/experience areas include:
* Data flow automation using NiFi or similar applications
* PostgreSQL
Preferred skills/experience areas include:
* PostGIS
* Python
* FastAPI
* Security+ or similar level of certification
* Active TS/SCI clearance
* Master's Degree
The position will be located in either State College, PA or Reston, VA.
ARL at Penn State is an integral part of one of the leading research universities in the nation and serves as a University center of excellence in defense science, systems, and technologies with a focus in naval missions and related areas.
You will be subject to a government security investigation, and you must be a U.S. citizen to apply. Employment with the ARL will require successful completion of a pre-employment drug screen.
FOR FURTHER INFORMATION on ARL, visit our web site at ****************
The proposed salary range may be impacted by geographic differential.
The salary range for this position, including all possible grades is:
$86,300.00 - $164,000.00
Salary Structure - additional information on Penn State's job and salary structure.
CAMPUS SECURITY CRIME STATISTICS:
Pursuant to the Jeanne Clery Disclosure of Campus Security Policy and Campus Crime Statistics Act and the Pennsylvania Act of 1988, Penn State publishes a combined Annual Security and Annual Fire Safety Report (ASR). The ASR includes crime statistics and institutional policies concerning campus security, such as those concerning alcohol and drug use, crime prevention, the reporting of crimes, sexual assault, and other matters. The ASR is available for review here.
Employment with the University will require successful completion of background check(s) in accordance with University policies.
EEO IS THE LAW
Penn State is an equal opportunity employer and is committed to providing employment opportunities to all qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. If you are unable to use our online application process due to an impairment or disability, please contact ************.
$86.3k-164k yearly 60d+ ago
Fractional ML Engineer / Data Scientist
JPC Partners 4.1
Data engineer job in Exton, PA
JPC Partners is looking for a Fractional ML Engineer or Data Scientist to help our Cybersecurity client analyze and model data from real-time network traffic logs (primarily Zeek conn.log, DNS logs, and NetFlow). Our goal is to extract actionable insights and build lightweight detection models for anomalous behavior, segmentation policy validation, and traffic classification.
You'll be working with structured log data and should be comfortable designing and evaluating machine learning workflows that can scale or be embedded into lightweight data pipelines (e.g., Jupyter, Python, cloud-ready).
This is part-time/project-based, ideal for someone with a cybersecurity lens and ML fluency.
Responsibilities
Explore and model Zeek and/or NetFlow log data
Help improve existing pipeline logic (cleaning, enrichment, labeling)
Build and test supervised and unsupervised models for:
Traffic classification (e.g., system personality or app type)
Anomaly detection (e.g., port scanning, lateral movement)
Baseline behavior for network segmentation enforcement
Optionally develop output for visualization or SIEM dashboards
Example Use Cases
Classify device types based on observed connection patterns
Detect rogue internal services using legacy or high-risk ports
Map internal east-west traffic to segmentation policy gaps
Identify abnormal DNS behavior and data exfiltration attempts (a minimal anomaly-detection sketch follows this list)
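As a minimal, illustrative sketch of the unsupervised modeling this engagement describes (the rows below are toy values; real inputs would be engineered from Zeek conn.log or NetFlow fields):

```python
# Toy anomaly-detection sketch over flow-level features (hypothetical columns).
import pandas as pd
from sklearn.ensemble import IsolationForest

# In practice these rows would be derived from Zeek conn.log / NetFlow records.
flows = pd.DataFrame({
    "duration":   [0.2, 0.4, 0.3, 45.0, 0.5],
    "orig_bytes": [300, 450, 380, 9_000_000, 410],
    "resp_bytes": [1200, 900, 1000, 150, 950],
    "dest_port":  [443, 443, 80, 6667, 443],
})

model = IsolationForest(contamination=0.2, random_state=0)
flows["anomaly"] = model.fit_predict(flows)   # -1 flags outliers, 1 is normal

print(flows[flows["anomaly"] == -1])          # candidate anomalous flows for triage
```

In practice, features such as bytes per connection, distinct destination ports, and connection duration would be aggregated per host before fitting.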
Required Skills
3 - 5+ years of professional experience in Data Science, Machine Learning Engineering, or a related field
Demonstrated experience working with real-world datasets, model deployment, and production-grade ML workflows
Desired Skills
Python (Pandas, Scikit-learn, Jupyter), some SQL
Experience with GCP and implementing cloud-based ML systems
Experience with Zeek, NetFlow, or PCAP-derived data
Familiarity with cybersecurity principles (MITRE ATT&CK, segmentation, IDS logic)
Bonus: TensorFlow/PyTorch, Docker, experience integrating with SIEMs or cloud logging platforms
Experience with implementing LLMs in cloud production environments
$67k-92k yearly est. 60d+ ago
Data Scientist
Fulton Bank 4.7
Data engineer job in Lancaster, PA
Value Proposition
Our values define us and our culture inspires us to change lives for the better. Our employees are the heart and soul of our company, and every success we experience begins with them. Together we are committed to making a positive impact in our local communities. We champion a culture of continuous learning, work-life integration, and inclusion. We promote a digitally enabled work environment to continuously enhance the experience of our employees and customers.
Overview
This is a full-time career opportunity that can be remote.
The Data Scientist will work on a variety of data-driven initiatives or projects that will result in the development and deployment of data-driven solutions to business problems, strategy, and opportunities. Incumbents in this role will analyze data, ask and answer data-driven and key business questions, deploy appropriate solutions, and communicate results to stakeholders. The Data Scientist may be expected to design and develop machine learning models, based on specific assignments or team focus.
Responsibilities
Collaborate on Data Science projects with business stakeholders. Explore data and implement varied data-driven methods to impact enterprise-level strategy, facilitate decision making, and achieve key objectives. Apply advanced analytics and machine learning during implementation, as needed based on project scope.
Extract and transform data from multiple sources in preparation for prescriptive or predictive data analysis, modeling, and reporting. Deploy tools to integrate, report, and interpret large data sets with structured and unstructured data. When required, complete model risk management documents.
Apply supervised and unsupervised learning methods to glean meaningful information from various data.
Serve as a data subject matter expert, improve data literacy, and act as a data consultant for assigned areas of the business.
Collaborate with other data stakeholders for support and access to appropriate data and data sources.
Continuously seek out industry best practice and skills development to create new capabilities for data analytics and data science.
Additional Responsibilities
Project management
Utilize publicly available machine learning or deep learning tools such as TensorFlow, Keras, Scikit-learn, etc. in pursuit of business objectives
May mentor and support other data-related roles to provide data-driven tools and insights
Qualifications
Education
Bachelor's Degree or the equivalent experience. Specialty: applied math, statistics, computer science, engineering, econometrics, or related field. (Required)
Master's Degree or the equivalent experience. Specialty: applied math, statistics, computer science, engineering, econometrics, or related field. (Preferred)
Experience
5 or more years of experience with data mining, machine and/or deep learning algorithms, large datasets and complex relational data models (structured and/or unstructured). (Required)
5 or more years of programming with Python. (Preferred)
3 or more years of experience with TensorFlow, Scikit-learn, Keras, Torch and NLP libraries. (Preferred)
3 or more years of experience with NLP techniques such as tokenization, etc. (Preferred)
2 or more years of cloud computing experience. (Preferred)
Certifications
Knowledge, Skills, and Abilities
Working knowledge of various supervised and unsupervised learning methods such as logistic regression, bagging & boosting, clustering, neural nets, etc.; a toy sketch follows this list. (Required)
Working knowledge of machine learning/deep learning techniques, applications, and libraries. (Required)
Hands-on experience with AWS, GCP, or Azure. Able to utilize API endpoints. (Required)
Effective communications skills and ability to articulate recommendations to varied audiences (Required)
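Purely for illustration, here is a toy example of one supervised method named above, logistic regression with Scikit-learn on synthetic data (not a Fulton Bank model or dataset):

```python
# Toy supervised-learning sketch: logistic regression on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a real modeling dataset.
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

clf = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
probs = clf.predict_proba(X_test)[:, 1]

print(f"AUC: {roc_auc_score(y_test, probs):.3f}")  # simple holdout evaluation
```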
Other Duties as Assigned by Manager
This role may perform other job duties as assigned by the manager. Each employee of the Organization, regardless of position, is accountable for reading, understanding and acting on the contents of all Company-assigned and/or job related Compliance Programs, regulations and policies and procedures, as well as ensure that all Compliance Training assignments are completed by established due dates. This includes but is not limited to, understanding and identifying compliance risks impacting their department(s), ensuring compliance with applicable laws or regulations, and escalating compliance risks to the appropriate level of management.
Pay Transparency
To provide greater transparency to candidates, we share base salary ranges on all job postings regardless of state. We set standard salary ranges for our roles based on the position, function, and responsibilities, as benchmarked against similarly sized companies in our industry. Specific compensation offered will be determined based on a combination of factors including the candidate's knowledge, skills, depth of work experience, and relevant licenses/credentials. The salary range may vary based on geographic location.
The salary range for this position is $99,200.00 - $165,300.00 annually.
Additional Compensation Components
This job is eligible to receive equity in the form of restricted stock units. This job is eligible to participate in a short-term incentive compensation plan subject to individual and company performance.
Benefits
Additionally, as part of our Total Rewards program, Fulton Bank offers a comprehensive benefits package to those who qualify. This includes medical plans with prescription drug coverage; flexible spending account or health savings account depending on the medical plan chosen; dental and vision insurance; life insurance; 401(k) program with employer match and Employee Stock Purchase Plan; paid time off programs including holiday pay and paid volunteer time; disability insurance coverage and maternity and parental leave; adoption assistance; educational assistance and a robust wellness program with financial incentives. To learn more about your potential eligibility for these programs, please visit Benefits & Wellness | Fulton Bank.
EEO Statement
Fulton Bank (“Fulton”) is an equal opportunity employer and is committed to providing equal employment opportunity for all qualified persons. Fulton will recruit, hire, train and promote persons in all job titles, and ensure that all other personnel actions are administered, without regard to race, color, religion, creed, sexual orientation, national origin, citizenship, gender, gender identity, age, genetic information, marital status, disability, covered veteran status, or any other legally protected status.
Sponsorship Statement
As a condition of employment, individuals must be authorized to work in the United States without sponsorship for a work visa by Fulton Bank currently or in the future.
$99.2k-165.3k yearly 21d ago
FMA SQL Data Engineer
Nokia 4.6
Data engineer job in Allentown, PA
Job Family Description
As an Electrical FMA Data Engineer at Nokia in Allentown, PA, you will work in a collaborative, fast-paced environment where innovation is at the forefront. You will be part of a dynamic cross-functional team, engaging with manufacturing, development, field service, reliability, and software engineers to tackle complex optical line card failures. Your expertise will lead to impactful design improvements and enhanced product reliability. We prioritize employee development through mentoring and offer opportunities for process improvement initiatives. In addition to competitive compensation, we provide relocation support and sponsorship, along with benefits such as health insurance, retirement plans, and professional growth opportunities to ensure a fulfilling work-life balance. Join us to make a tangible impact on the networking solutions industry.
Must-Have:
· BS in Engineering, Computer Science, Math, Physics, or equivalent technical degree
· Experience in SQL server
· Strong analytical and critical thinking skills
· Statistical analysis experience
· Excellent oral and written communication skills
Nice-To-Have:
· Experience in an Opto/Optical environment
· Electrical Circuits, C#, Python
· Proficiency in database queries and statistical analysis
Your responsibilities
· Query large data sets from the SQL database (a brief Python/SQL sketch follows this list).
· Work with a large body of data, perform statistical analysis, and present data and proposals to technical audiences.
· Interact with hardware design, test, controls, reliability, and manufacturing to define failure root causes and failure trends.
· Drive improvements in product quality during pilot ramp and identify opportunities to increase product volumes and reduce costs during production.
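A hedged sketch of the query-and-summarize workflow described above; the connection string, table, and column names are placeholders rather than Nokia's actual schema:

```python
# Illustrative only: pull failure records from SQL Server and summarize failure counts.
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; a real one would point at the FMA SQL Server database.
engine = create_engine(
    "mssql+pyodbc://user:password@server/failure_db?driver=ODBC+Driver+17+for+SQL+Server"
)

query = """
    SELECT card_type, test_station, failure_code, COUNT(*) AS failures
    FROM line_card_tests
    WHERE test_date >= '2024-01-01'
    GROUP BY card_type, test_station, failure_code
"""
failures = pd.read_sql(query, engine)

# Simple statistical summary: which failure codes dominate per card type.
top_failures = (failures
    .sort_values("failures", ascending=False)
    .groupby("card_type")
    .head(3))
print(top_failures)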
$87k-117k yearly est. 1d ago
IT Data Engineer
The Hartford 4.5
Data engineer job in Wayne, PA
Application Dev Analyst - 87ID6E
We're determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals - and to help others accomplish theirs, too. Join our team as we help shape the future.
Overview of the role:
Hartford Funds is seeking a motivated Developer to join our IT Team and support our enterprise data warehouse. This position focuses on both data management and development, using low-code ETL tools (such as Talend), SQL, and Python for data ingestion and visualization. The ideal candidate will develop effective solutions while communicating and collaborating effectively with Business Analysts, Project Managers, and the Business Intelligence Team.
Responsibilities of the role:
+ Conduct data profiling, source-target mappings, (low-code) ETL and/or Python development, SQL tuning and optimization, testing and implementation (a short profiling sketch follows this list)
+ Provide business and technical analysis for initiatives focusing on Data Warehouse and Business Intelligence solutions
+ Influence and provide input on data warehouse design with experience-based recommendations for optimal structuring of data to support efficient processing
+ Develop detailed technical specifications and ETL documentation in collaboration with Data Warehouse Architect and Business Intelligence Developers
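As a minimal illustration of the data-profiling step listed above (the file name and columns are hypothetical; in practice the extract would come from the warehouse, e.g., Snowflake, via SQL):

```python
# Quick data-profiling pass over a staged extract before building source-to-target mappings.
import pandas as pd

df = pd.read_csv("staged_accounts.csv")   # placeholder extract

profile = pd.DataFrame({
    "dtype":        df.dtypes.astype(str),
    "null_pct":     df.isna().mean().round(3),
    "distinct":     df.nunique(),
    "sample_value": df.iloc[0] if len(df) else None,
})
print(profile)   # feeds source-to-target mapping and data-quality rules
```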
Requirements:
+ Bachelor's degree in computer science, information technology or another computer-based discipline
+ 3 years' experience developing solutions with a heavy focus on data and data movement
+ Advanced Python skills required
+ Databases: Strong experience with SQL-based databases (Snowflake is a plus) required
+ Platforms: Microsoft / Linux required
+ SQL: Expertise and fluency in SQL language required
+ Integration Patterns: experience with various integration patterns (e.g., Flat Files, Web Services, etc.) required
+ Knowledge of fundamental data modeling concepts (e.g., ER Diagrams, normalization, etc.) required
+ Experience working with low-code tools (such as ETL platforms)
+ Knowledge of data warehousing techniques (dimensional modeling / star schema, etc.) is required
+ Experience writing technical documentation is required
+ .Net / C# preferred
+ Snowflake / Talend preferred
+ Cloud platform experience (AWS/Azure) preferred
+ XML/XSLT experience is a plus
+ Must be comfortable/excited working in a fast-paced, high impact environment
+ Excellent troubleshooting and problem-solving skills; able to root cause and debug complex code in an efficient manner and with appropriate urgency
+ Ability to obtain a good understanding of the business context of a problem to ensure the solution is optimal in solving the business need
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$85,000 - $110,000
Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
Every day, a day to do right.
Showing up for people isn't just what we do. It's who we are - and have been for more than 200 years. We're devoted to finding innovative ways to serve our customers, communities and employees-continually asking ourselves what more we can do.
Is our policy language as simple and inclusive as it can be? Can we better help businesses navigate our ever-changing world? What else can we do to destigmatize mental health in the workplace? Can we make our communities more equitable?
That we can rise to the challenge of these questions is due in no small part to our company values that our employees have shaped and defined.
And while how we contribute looks different for each of us, it's these values that drive all of us to do more and to do better every day.
$85k-110k yearly 60d+ ago
Data Scientist
APR Supply Co
Data engineer job in Lebanon, PA
Job Description
APR Supply Co. is a century-strong leader in the HVAC and Plumbing distribution industry - and we're redefining how data powers our business. As part of our ongoing digital transformation, we're building a modern, AI-enabled data platform centered on Microsoft Fabric to unlock intelligent insights, automate operations, and elevate customer experience.
We're seeking an accomplished Data Scientist to help architect and advance this platform, integrating structured and unstructured data, enabling intelligent analytics, and driving the adoption of AI technologies across the enterprise. This is a new, high-impact position with strong executive support and tremendous opportunity to shape the future of data and AI at APR.
If you're a data innovator eager to build from the ground up, combining deep technical expertise with strategic influence, this is your chance to make a lasting mark.
The Opportunity
As APR's Data Scientist, you'll take the lead in developing our Microsoft Fabric-based data ecosystem, from data lake ingestion and transformation to the creation of advanced analytics and AI models.
You'll play a hands-on role designing Power BI dashboards and semantic models, while also pioneering the use of multimodal RAG (Retrieval-Augmented Generation), Copilot-style assistants, and AI-driven enrichment of text, image, and video data.
This is a greenfield opportunity to define data excellence at APR, turning raw information into real-time decision intelligence and next-generation automation.
What You'll Do
Develop and maintain our Microsoft Fabric data lake and pipelines (OneLake, Dataflows Gen2, Spark).
Build high-performance Power BI models and dashboards that enable enterprise-wide analytics.
Integrate structured and unstructured data (text, images, documents, video) into a unified data ecosystem.
Contribute to AI-driven solutions, including Retrieval-Augmented Generation (RAG) and Copilot-style tools.
Collaborate with cross-functional teams to turn raw data into strategic business insights.
What You Bring
5+ years of experience in data science, analytics, or BI development.
Strong hands-on skills with Power BI, SQL, and Python.
Experience with Microsoft Fabric, Azure Data Factory, or Synapse Pipelines.
Familiarity with AI, NLP, or unstructured data processing.
Excellent communication skills and the ability to partner with technical and business leaders.
Why Join APR Supply Co.?
Build the Future: Lead the design of our next-generation AI and data platform.
Innovate with Purpose: Apply advanced analytics and AI to solve real business challenges.
High Visibility: Work directly with executive leadership to influence enterprise strategy.
Career Advancement: Grow toward leadership in AI engineering, data architecture, or product analytics.
Join us in shaping the data future of a century-old company transforming through innovation.
$72k-101k yearly est. 25d ago
Lead Data Engineer - AI/ML
Stanford Health Care 4.6
Data engineer job in Palo Alto, CA
If you're ready to be part of our legacy of hope and innovation, we encourage you to take the first step and explore our current job openings. Your best is waiting to be discovered.
Day - 08 Hour (United States of America)
The Lead Data Engineer will be part of a team building Stanford Health Care's (SHC) solutions incorporating Artificial Intelligence, including providing health care solutions in the areas of patient care, medical research and administrative services. This group is designed to bring Artificial Intelligence (AI) and other emerging machine learning (ML) based innovations in data science into healthcare and will partner closely with individuals across clinical specialties and operations areas to deploy algorithms that can lead to better patient outcomes.
Reporting to the Data Science Director and working closely with Stanford Medicine's inaugural Chief Data Scientist, this role will be responsible for building, scaling and maintaining the compute frameworks, analysis tooling, model implementations and agentic solutions that form our core AI platform.
This is a Stanford Health Care job.
A Brief Overview
The Lead Data Engineer will be part of a team building Stanford Health Care's (SHC) solutions incorporating Artificial Intelligence, including providing health care solutions in the areas of patient care, medical research and administrative services. This group is designed to bring Artificial Intelligence (AI), predictive algorithms and other emerging machine learning (ML) based innovations in data science into healthcare and will partner closely with individuals across clinical specialties and operations areas to deploy algorithms that can lead to better patient outcomes.
Reporting to the Data Science Director and working closely with Stanford Medicine's inaugural Chief Data Scientist, this role will be responsible for maintaining compute frameworks, analysis tooling, and/or model implementations used or created by the team. The individual will design, implement, and support data processing software and infrastructure.
Locations
Stanford Health Care
What you will do
Build end-to-end data pipelines and infrastructure for ML models used by the Data Science team and others at SHC.
Understand the requirements of data processing and analysis pipelines and make appropriate technical design and interface decisions. Elucidating these requirements will require training, developing, and validating researcher-built or vendor-provided machine learning algorithms on hospital data as well as working with other members of the data science team.
Understand data flows among the SHC applications and use this knowledge to make recommendations and design decisions for languages, tools, and platforms used in software and data projects.
Troubleshoot and debug environment and infrastructure problems found in production and non-production environments for projects by the Data Science Team.
Work with other groups at SHC and the Technology and Digital Solutions (TDS) group to ensure servers and system maintenance based on updates, system requirements, data usage, and security requirements.
Education Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or related, or equivalent working experience
Experience Qualifications
5+ years of experience in building data infrastructure for analytics teams, including the ability to write code in SQL, R, or Python for processing large datasets in distributed cloud environments
Required Knowledge, Skills and Abilities
Experience with cloud deployment strategies and CI/CD.
Experience building and working with data infrastructure in a SaaS environment.
Experience overseeing, developing or implementing machine learning operations (MLOps) processes.
Experience mentoring junior engineers and enforcing best practices around code quality.
Knowledge of multiple programming languages, commitment to choosing languages based on project-specific requirements, and willingness to learn new programming languages as necessary.
Knowledge of resource management and automation approaches such as workflow runners.
Collaborative mentality and excitement for iterative design working closely with the Data Science team.
Physical Demands and Work Conditions
Blood Borne Pathogens
Category II - Tasks that involve NO exposure to blood, body fluids or tissues, but employment may require performing unplanned Category I tasks
These principles apply to ALL employees:
SHC Commitment to Providing an Exceptional Patient & Family Experience
Stanford Health Care sets a high standard for delivering value and an exceptional experience for our patients and families. Candidates for employment and existing employees must adopt and execute C-I-CARE standards for all patients and families and towards each other. C-I-CARE is the foundation of Stanford's patient experience and represents a framework for patient-centered interactions. Simply put, we do what it takes to enable and empower patients and families to focus on health, healing and recovery.
You will do this by executing against our three experience pillars, from the patient and family's perspective:
Know Me: Anticipate my needs and status to deliver effective care
Show Me the Way: Guide and prompt my actions to arrive at better outcomes and better health
Coordinate for Me: Own the complexity of my care through coordination
Equal Opportunity Employer Stanford Health Care (SHC) strongly values diversity and is committed to equal opportunity and non-discrimination in all of its policies and practices, including the area of employment. Accordingly, SHC does not discriminate against any person on the basis of race, color, sex, sexual orientation or gender identity and/or expression, religion, age, national or ethnic origin, political beliefs, marital status, medical condition, genetic information, veteran status, or disability, or the perception of any of the above. People of all genders, members of all racial and ethnic groups, people with disabilities, and veterans are encouraged to apply. Qualified applicants with criminal convictions will be considered after an individualized assessment of the conviction and the job requirements.
Base Pay Scale: Generally starting at $94.35 - $125.03 per hour
The salary of the finalist selected for this role will be set based on a variety of factors, including but not limited to, internal equity, experience, education, specialty and training. This pay scale is not a promise of a particular wage.
$84k-107k yearly est. 54d ago
Lead Data Engineer
Brentwood Industries, Inc. 4.3
Data engineer job in Reading, PA
The Lead Data Engineer will lead the design, development, and administration of critical data pipelines and platform infrastructure within an Azure Databricks environment. This role is focused on building robust, scalable, and secure data solutions that enable efficient data consumption and digital transformation across the organization. The Lead Data Engineer will be responsible for catalog architecture, data quality, security compliance, and stakeholder engagement, independently managing complex projects from concept to delivery while keeping the organization's unique needs in consideration. The employee may perform other related duties as required to meet the ongoing needs of the organization.
Essential Responsibilities:
Build, improve, manage and administrate a complex set of mission-critical data pipelines and data sets in a Databricks environment, with a heavy emphasis on stability, best practice and long-term continual improvement. Plan out and execute major catalog redesigns and rebuilds with a focus on stability, data integrity and zero data loss.
Troubleshoot and resolve data pipeline, quality and resource issues efficiently with a high level of urgency.
Lead adoption of the Databricks environment and modern methods of data consumption, digital transformation and automation solutions.
Meet with stakeholders at every level of the organization to understand needs, objectives, requirements and set expectations. Train and assist users, teams and departments with furthering their adoption and utilization of Databricks. Assess and determine the success of current platform utilization while also identifying and defining new process improvement opportunities.
Plan the adoption of new features and technologies within Databricks, Azure and the Power Platform.
Independently manage multiple projects (solo and team-based).
Support data security and governance within the Unity Catalog by maintaining data access controls, including PII-management, tokenization, row level security, masking and data usage logging and monitoring. Perform security audits and establish alerts.
Perform data analysis, quality improvements, dashboarding, reporting and visualization of data. Assist data domain owners, report builders and analysts with report development (Power BI), improvement, migration and best practices.
Serve as final approval of releases to production with thorough code and impact reviews, dependencies and change management.
Serve as administrator of Power BI production workspaces, evaluation of best practices and compliance within reports, and certification of Power BI reports.
Administrate all areas of Databricks including user access, resources, cost and security.
Essential Skills:
Expert-level proficiency in developing advanced ETL pipelines using Databricks.
Extensive experience with Unity Catalog, Delta Live Tables, data ingestion, transformation, cleansing, and governance methodologies.
Advanced expertise in Python and SQL within Databricks Notebooks.
Hands-on experience working with large, complex ERP data sets such as SAP, Infor, Dynamics 365, or Oracle.
Strong understanding of Machine Learning (ML), Large Language Models (LLMs), and related concepts.
Experience designing and implementing IoT and real-time streaming data pipelines.
Proficiency with Azure DevOps, including repositories, version control, production deployments, and CI/CD best practices.
Demonstrated experience in Azure and Databricks cost management, including resource administration, cost reporting, budgeting, and alerting.
Bachelor's degree in Computer Science, STEM, or a related field (or equivalent experience).
Minimum 3 years of experience working in a technology-focused role.
At least 1 year of direct, hands-on experience with Databricks in an Azure environment.
Familiarity with Agile/Scrum methodologies and collaborative development environments.
Willingness to travel to Brentwood site locations and attend industry conferences (approximately 10% travel).
Preferred Certifications & Skills:
Databricks Certified Data Engineer Professional
Microsoft Certified: Azure Data Engineer Associate
Experience with R, Scala, or other statistical programming languages
Brentwood offers professional growth potential, pleasant work environment, and an excellent wage and benefits package including 401K w/employer match. Brentwood Industries, Inc., provides equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics.
At Brentwood, we have a passion for both our products and our people. Our goal as an employer is to help you excel as an individual and as part of a team by providing you with a satisfying, motivating and stimulating work experience. The varied nature of the environment at Brentwood allows you to work alongside industry professionals on a wide range of projects, contributing your knowledge and strengths to develop innovative, market-driven solutions.
$89k-119k yearly est. 5d ago
Information/Data Architect
Cleonorporated
Data engineer job in Reading, PA
Qualifications
Information/Data Architect
Reading, PA
6 Months
Information/Data Architects are responsible for assisting in setting up the overall solution environment that solves the business problem from a data or information perspective. The individual should have a basic understanding of various data and solution architectures that include leveraging structured and unstructured data.
The individual must be familiar with the following terms and concepts:
Master data management
Metadata management
Data quality
Business intelligence/data warehousing
Data interoperability
Analytics
Data integration and aggregation
The Information Architect should be able to assist in the following:
Set information architecture standards and methodologies.
Identify the information that the Institute produces and consumes.
Create the strategic requirements, principles, models and designs for information across the ecosystem.
Assist in the identification of user requirements.
Define the information architecture of the solution.
Prepare data models, and design information structures and work and data flows.
The preferred candidate should be familiar with and have experience in developing data models.
Specific to this opportunity, the individual must be familiar with the following technologies:
Pivotal Cloud Foundry
Pivotal Big Data Suite
Data Lakes and other Big Data Concepts with proven experience in using data lakes
Hadoop (Cloudera).
Additional Information
All your information will be kept confidential according to EEO guidelines.
$90k-123k yearly est. 9h ago
Data Architect
Novocure 4.6
Data engineer job in Wayne, PA
We are seeking an experienced and innovative Data Architect to lead the design, development, and optimization of our enterprise data architecture. This individual will play a critical role in aligning data strategy with business objectives, ensuring data integrity, and driving value from data across multiple platforms. The ideal candidate will have deep expertise in data architecture best practices and technologies, particularly across SAP S/4 HANA, Veeva CRM, Veeva Vault, SaaS platforms, Operational Data Stores (ODS), and Master Data Management (MDM) platforms.
This is a full-time position reporting to the Director, Enterprise Architecture.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
* Design, develop, and maintain scalable and secure enterprise data architecture solutions across SAP S/4 HANA, Veeva CRM, and Veeva Vault environments.
* Serve as a subject matter expert for Operational Data Stores and Master Data Management architecture, ensuring clean, consistent, and governed data across the enterprise.
* Collaborate with cross-functional teams to identify data needs, establish data governance frameworks, and define data integration strategies.
* Develop data models, data flows, and system integration patterns that support enterprise analytics and reporting needs.
* Evaluate and recommend new tools, platforms, and methodologies for improving data management capabilities.
* Ensure architectural alignment with data privacy, regulatory, and security standards.
* Provide leadership and mentoring to data engineering and analytics teams on best practices in data modeling, metadata management, and data lifecycle management.
* Contribute to data governance initiatives by enforcing standards, policies, and procedures for enterprise data.
QUALIFICATIONS/KNOWLEDGE:
* Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
* Minimum of 8+ years of experience in data architecture, data integration, or enterprise data management roles.
* Proven experience in designing and implementing data solutions on SAP S/4 HANA, including integration with other enterprise systems.
* Strong hands-on experience with SaaS platforms, including data extraction, modeling, and harmonization.
* Deep understanding of Operational Data Stores and MDM design patterns, implementation, and governance practices.
* Proficiency in data modeling tools (e.g., Erwin, SAP PowerDesigner), ETL tools (e.g., Business Objects Data Services, SAP Data Services), and integration platforms (e.g., MuleSoft).
* Familiarity with cloud data architecture (e.g., AWS, Azure, GCP) and hybrid data environments.
* Excellent communication and stakeholder management skills.
OTHER:
* Experience with pharmaceutical, life sciences, or regulated industry environments.
* Knowledge of data privacy regulations such as GDPR, HIPAA, and data compliance frameworks
* Ability to travel domestically and internationally as needed for high priority projects
Novocure is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sexual orientation, gender identity, age, national origin, disability, protected veteran status or other characteristics protected by federal, state, or local law. We actively seek qualified candidates who are protected veteran and individuals with disabilities as defined under VEVRAA and Section 503 of the Rehabilitation Act.
Novocure is committed to providing an interview process that is inclusive of our applicant's needs. If you are an individual with a disability and would like to request an accommodation, please email **********************************
ABOUT NOVOCURE:
Our vision
Patient-forward: aspiring to make a difference in cancer.
Our patient-forward mission
Together with our patients, we strive to extend survival in some of the most aggressive forms of cancer by developing and commercializing our innovative therapy.
Our patient-forward values
* innovation
* focus
* drive
* courage
* trust
* empathy
Nearest Major Market: Philadelphia
$97k-129k yearly est. 60d+ ago
Data Developer
ZP Group 4.0
Data engineer job in Wayne, PA
Piper Companies is seeking a Data Developer to join a well-established financial services company. This is a hybrid role located in Wayne, PA and focuses on supporting and enhancing their enterprise data warehouse environment. This role combines hands-on development with data management.
Responsibilities of the Data Developer role include:
* Perform data profiling, create source-to-target mappings, and develop ETL workflows using low-code tools and/or Python
* Optimize SQL queries for performance and reliability
* Participate in testing, deployment, and troubleshooting of data integration processes
* Provide technical and business analysis for projects centered on data warehousing and BI solutions
* Contribute to data warehouse design decisions, ensuring optimal structures for processing and reporting
* Produce detailed technical specifications and maintain ETL documentation in partnership with architects and BI developers
Qualifications for the Data Developer role include:
* Bachelor's degree in Computer Science, IT, or related field
* Minimum 3 years of experience in data-centric development roles
* Strong knowledge in Python
* Strong SQL skills and experience with relational databases (Snowflake experience is a plus)
* Familiarity with Microsoft and Linux environments
* Hands-on experience with integration patterns (e.g., flat files, APIs, web services)
* Solid understanding of data modeling principles (ER diagrams, normalization)
* Exposure to low-code ETL tools
* Knowledge of data warehousing concepts (dimensional modeling, star schema)
* Ability to write clear technical documentation
Compensation for the Data Developer role includes:
* $85,000-$110,000
* Comprehensive benefits package
* Hybrid work
Keywords: Insurance, Risk Management, Claims Processing, Customer Service, Policy Administration, Compliance, Financial Services, Commercial Insurance, Data Analysis
$85k-110k yearly 12d ago
Data Scientist
Department of Defense
Data engineer job in Bowers, PA
Summary
DCSA's PEO (National Background Investigation Services (NBIS)) is looking for a non-supervisory Data Scientist (Data Architect/Data Engineer/AI-ML Specialist). The incumbent will provide support, guidance and expertise in various Data Scientist functions and performs a variety of duties that involve knowledge of a wide variety of Data Scientist applications, environments, and methods. Visit ************ - America's Gatekeeper!
Overview
Accepting applications
Open & closing dates
01/05/2026 to 01/20/2026
Salary: $63,815 to $111,087 per year
Pay scale & grade: GG 9 - 11
Locations
1 vacancy in the following locations:
Fort Meade, MD
Hanover, MD
Boyers, PA
Farmers Branch, TX
Quantico, VA
Stafford, VA
Remote job: No
Telework eligible: Yes, as determined by the agency policy.
Travel Required: 25% or less - You may be expected to travel for this position.
Relocation expenses reimbursed: No
Appointment type: Permanent
Work schedule: Full-time
Service: Excepted
Promotion potential: 13
Job family (Series): 1560 Data Science Series
Supervisory status: No
Security clearance: Top Secret
Drug test: Yes
Position sensitivity and risk: Special-Sensitive (SS)/High Risk
Trust determination process: National security
Financial disclosure: No
Bargaining unit status: No
Announcement number: DCSA-26-12857533-MP
Control number: 853560800
This job is open to
Federal employees - Competitive service
Current federal employees whose agencies follow the U.S. Office of Personnel Management's hiring rules and pay scales.
Federal employees - Excepted service
Current federal employees whose agencies have their own hiring rules, pay scales and evaluation criteria.
National Guard and reserves
Current members, those who want to join or transitioning military members.
Special authorities
Individuals eligible under a special authority not listed in another hiring path.
Clarification from the agency
This announcement is open to: Current Federal employees in the competitive and excepted service; Retained Grade Preference (RGP) eligibles, and Military Reserve and National Guard Technician eligibles may also apply.
Duties
This position may be filled at either the GG-9 or GG-11 grade level; please indicate the grade for which you want to be considered. The incumbent may be advanced non-competitively to the established target grade upon meeting developmental benchmarks, performance requirements, and other requirements, as applicable, established for the career program or specifically for the incumbent.
As a Data Scientist you will be responsible for the following duties:
GG-09
* Identifies, collects and analyzes data to effectively support and advance projects or programs.
* Applies analytic methods such as regression, decision tree and statistical probability to improve data reliability and quality.
* Analyzes, interprets and correlates qualitative and quantitative data and develop recommended changes to improve results.
* Manages and plans sequence of operations on a regular basis for assigned elements.
* Implements strategies for system development and analysis, operational logistics, project funding/budgeting, project cost, schedule and performance management.
* Collaborate with team members and stakeholders to ensure data accuracy and consistent reporting.
GG-11
* Identifies, collects and analyzes data to effectively support and advance projects or programs.
* Applies analytic methods such as regression, decision tree and statistical probability to improve data reliability and quality.
* Collaborate with team members and stakeholders to ensure data accuracy and consistent reporting.
* Inform management of anticipated revisions to theories, methodologies or procedures to satisfy non-standard complex program or project support of mission priorities.
* Engages in routine contact with high-ranking individuals or groups such as flag officers and senior executive service members from the DOD and Federal Agencies.
Requirements
Conditions of employment
* Must be a US citizen
* Selective Service Requirement: Males born after 12-31-59 must be registered for Selective Service. For more information ******************
* Resume and supporting documents received by 11:59PM EST will be considered
* This is a Drug Testing designated position
* Position is a Defense Civilian Intelligence Personnel System (DCIPS) position in the Excepted Service under 10 U.S.C. 1601
* Work Schedule: Full Time
* Overtime: Occasionally
* Tour of Duty: Flexible
* PCS (Permanent Change of Station): Not Authorized
* Fair Labor Standards Act (FLSA): Exempt
* Financial Disclosure: Not Required
* Telework Eligibility: Supervisors may approve situational telework on a case-by-case, temporary basis for limited situations.
* If selected, the incumbent must obtain and maintain appropriate security clearance as indicated in job announcement.
* In accordance with DoDD 8140.01/DoDI 8140.02, the position compliance requirement details are outlined in the Defense Cyber Workforce Annex.
* Any Information Technology/Security certifications or education specified in the DCWF Annex, as defined by DoD 8140-M, must be obtained within six months of the appointment date.
* The incumbent must sign a Statement of Understanding regarding the certification requirements and maintaining the appropriate certification is a condition of employment.
* This position requires Defense Acquisition Workforce Improvement Act Foundational certification in Engineering and Technical Management.
* Applicants are expected to meet all related certification requirements per the Department of Defense Instruction 5000.66.
* Applicants that do not meet this requirement are eligible to be hired but must obtain an Engineering and Technical Management Foundational certification within 3 years (36 months) after the first day of starting in the position.
Qualifications
The experience described in your resume will be evaluated and screened against the Office of Personnel Management's (OPM's) basic qualifications requirements. See Information Technology (IT) Management Series 2210 (Alternative A) for the OPM qualification standards, competencies and specialized experience needed to perform the duties of the position as described in the MAJOR DUTIES and QUALIFICATIONS sections of this announcement by 01/20/2026.
Applicants must have directly applicable experience that demonstrates possession of the knowledge, skills, abilities and competencies necessary for immediate success in the position. Qualifying experience may have been acquired in any public or private sector job, but it must clearly demonstrate past experience in the application of the particular competencies, knowledge, skills and abilities necessary to successfully perform the duties of the position.
GG-09
You must have specialized experience sufficient to demonstrate that you have acquired all the competencies necessary to perform at a level equivalent in difficulty, responsibility, and complexity to the next lower grade GS/GG-07 in the Federal service and are prepared to take on greater responsibility.
Generally, this would include one year or more of such specialized experience.
Specialized experience for this position includes:
* Applying statistical/machine learning algorithms to analyze structured and unstructured data.
* Developing predictive data models, quantitative analyses, and targeting data sources.
* Using statistical and general-purpose programming languages for data analysis.
OR
Education
Substitution of Education for Specialized Experience - master's or equivalent graduate degree
or
2 full years of progressively higher level graduate education leading to such a degree
or
LL.B. or J.D., if related
GG-11
You must have specialized experience sufficient to demonstrate that you have acquired all the competencies necessary to perform at a level equivalent in difficulty, responsibility, and complexity to the next lower grade GS/GG-09 in the Federal service and are prepared to take on greater responsibility.
Generally, this would include one year or more of such specialized experience.
Specialized experience for this position includes:
* Applying statistical/machine learning algorithms to analyze structured and unstructured data.
* Leading data analysis projects using predictive data models, quantitative analyses, and targeting data sources.
* Explaining actionable insights from data analysis to senior management and key stakeholders within DCSA and/or the federal government.
OR
Education
Substitution of Education for Specialized Experience - Ph.D. or equivalent doctoral degree
or
3 full years of progressively higher level graduate education leading to such a degree
or
LL.M., if related
Specifically, you will be evaluated on the following competencies for GG-09 and GG-11:
* Data Mining - The process of discovering patterns in large data sets using methods at the intersection of machine learning, statistics, and database systems.
* Data Management - Knowledge of the principles, procedures, and tools of data management, such as modeling techniques, data backup, data recovery, data dictionaries, data warehousing, data mining, data disposal, and data standardization processes.
* Software and Programming - Understand a variety of programming languages, data warehousing platforms, big data software, and version control applications.
* Qualitative / Quantitative Analysis - Developing and applying quantitative and qualitative analytic methods to identify, collect, process, and analyze large data sets.
Education
If substituting education for experience at the GG-09 level, you must possess a master's or equivalent graduate degree, OR 2 full years of progressively higher-level graduate education leading to such a degree, OR an LL.B. or J.D., if related. If substituting education for experience at the GG-11 level, you must possess a Ph.D. or equivalent doctoral degree, OR 3 full years of progressively higher-level graduate education leading to such a degree, OR an LL.M., if related.
If substituting or combining education for experience, transcripts MUST be provided. Failure to provide transcripts will result in you being rated ineligible for this position.
Superior Academic Achievement does not apply to DCIPS positions.
Foreign Education: For further information, click on the following link:
*************************************************************************
Additional information
VETERANS PREFERENCE/CURRENT OR FORMER FEDERAL
In accordance with DoD Instruction 1400.25, Volume 2005, veterans preference is not required to be applied when considering candidates with prior Federal competitive or excepted service who have completed a probationary or trial period and have not been separated for cause. Therefore, veterans preference will not be applied to applicants with current federal service, or former federal civilian service meeting the above criteria.
Re-employed Annuitant: This position does not meet criteria for re-employed annuitant. The DoD criteria for hiring Re-employed Annuitants can be found at: *********************************************************************************
Applicants selected from this announcement may be required to serve a two-year trial period.
If selected, Federal employees currently serving in the competitive service must acknowledge that they will voluntarily leave the competitive service by accepting an offer of employment for a DCIPS excepted service position.
If selected, non-DCIPS candidates must acknowledge in writing that the position they have been selected for is in the excepted service and covered by DCIPS.
All Defense Intelligence positions under the Defense Civilian Intelligence Personnel System (DCIPS) are in the excepted service by specific statute, 10 U.S.C. 1601. This position is in the excepted service and does not confer competitive status.
For more information on the DCIPS occupational structure click here.
For more information about career advancement in DCIPS click here.
ACQUISITION POSITION: This position requires a Defense Acquisition Workforce Improvement Act (DAWIA) Foundational certification in Engineering & Technical Management. Certification prior to hiring is not essential but must be accomplished within 3 years (36 months) after entry into this position. For information regarding the DAWIA Back to Basics please visit: Home (dau.edu) or Back to Basics - Helpful Resources (dau.edu)
* DAWIA (Back to Basics) Certification: NON-CAP: This position requires DoD Acquisition Foundational certification in Engineering & Technical Management within the required timeframes. The selectee must also achieve 80 hours of Continuous Learning Points (CLPs) every 2 years. Click here for more details and resources.
Candidates should be committed to improving the efficiency of the Federal government, passionate about the ideals of our American republic, and committed to upholding the rule of law and the United States Constitution.
Benefits
A career with the U.S. government provides employees with a comprehensive benefits package. As a federal employee, you and your family will have access to a range of benefits that are designed to make your federal career very rewarding. Learn more about federal benefits.
Review our benefits
Eligibility for benefits depends on the type of position you hold and whether your position is full-time, part-time or intermittent. Contact the hiring agency for more information on the specific benefits offered.
How you will be evaluated
You will be evaluated for this job based on how well you meet the qualifications above.
You will be evaluated based on how well you meet the qualifications listed in this vacancy announcement. Your qualifications will be evaluated based on your application materials (e.g., resume, supporting documents), and the responses you provide on the application questionnaire.
EXCEPTED SERVICE PILOT REVIEW: Once the application process is complete, a review of your application will be made to ensure you meet the qualification requirements listed on this announcement. All applicants who meet the basic qualification requirements will be forwarded to the Selecting Official for consideration.
Competencies: Customer Service, Decision Making, Flexibility, and Planning and Evaluating
You should list any relevant performance appraisals and incentive awards in your resume as that information may be taken into consideration during the selection process. If selected, you may be required to provide supporting documentation.
Top ranked candidates will be referred to the selecting official for further review and consideration.
A review of your resume and supporting documentation will be made and compared against your responses to the Assessment Questionnaire to determine if you are qualified for this job. If your resume is incomplete or does not support the responses provided in the Assessment Questionnaire, or if you fail to submit all required documentation, you will be rated 'ineligible' or 'not qualified'.
Please follow all instructions carefully. Errors or omissions may affect your rating or consideration for employment.
All eligibility and qualifications requirements must be met by the closing date of this announcement.
Required Documents
The documents you are required to submit vary based on the authority you are using to apply (i.e., applying as a veteran, applying as a current permanent Federal employee, applying as a reinstatement, etc.). Your complete application includes your COMPLETE resume, your responses to the online questionnaire, and documents which prove your eligibility to apply. If you fail to provide these documents, you will be marked as having an incomplete application package and you will not be considered any further.
The following documents are REQUIRED:
1. Your resume:
* All applicants are required to submit a resume limited to two (2) pages showing all relevant experience. Your resume must address your qualifications for the position and may not exceed two (2) pages. It must be in PDF format, with no smaller than a 0.5" margin and no smaller than a 10-point font. If you submit a resume that exceeds the two-page limit and/or does not meet the margin or font limitations stated herein, you will immediately be deemed ineligible for this position and will be removed from further consideration. Your resume must include your work schedule, hours worked per week, dates of employment, and duties performed.
* If you submit more than one copy of your resume, only the most recent version will be reviewed. The latest timestamp will be used to determine which version of your resume is "most recent."
* If your resume includes a photograph or other inappropriate material or content, it will not be used to make eligibility and qualification determinations and you will not be considered for this vacancy.
* Narrative responses in the assessment questionnaire do not replace content in your resume, or vice versa. Experience must be described in both places for qualification determinations.
* Resumes should NOT include classified information, your SSN, a photo of yourself, personal information (gender, religion, affiliation, etc.), or encrypted/digitally signed documents. Resumes with this prohibited information will be automatically ineligible for consideration.
* For qualification determinations, your resume must contain the dates of employment (i.e., month/year to month/year or to present) and hours worked per week.
* If your resume does not clearly outline details for each position as noted, you may be deemed "not referred" for this position.
* Additional guidance on writing a federal resume can be found at: USAJOBS Help Center - How do I write a resume for a federal job? The resume builder can help you create a resume using these recommendations and uses the information in your USAJOBS profile to help you get started.
2. Transcripts:
* College transcript(s), required if qualifying based on education. We accept unofficial transcripts, as long as they contain your name, the name of the school, the date and degree that was awarded, and the list of classes and credits earned.
3. SF50
* All current and former civilian Federal employees must submit a copy of their MOST RECENT SF50 (Notification of Personnel Action) showing tenure, grade and step, salary, and type of position occupied (i.e., Excepted or Competitive), or similar Notification of Personnel Action documentation (e.g., Transcript of Service, Form 1150). Failure to provide your latest SF50 may result in you being rated ineligible for this position.
4. Veteran's Documents:
* If applying using veteran's preference or under a Veteran's hiring authority you must submit the following documents: DD214 showing character of service, SF-15 Form and VA letter showing final percentage, or certification of expected discharge or release from active duty.
NOTE: Active duty military members are not eligible for appointment unless currently on terminal leave.
PLEASE REVIEW THE BELOW LINK FOR OTHER SUPPORTING DOCUMENTS needed for proof of eligibility:
***********************************************************************************************************
If you are relying on your education to meet qualification requirements:
Education must be accredited by an accrediting institution recognized by the U.S. Department of Education in order for it to be credited towards qualifications. Therefore, provide only the attendance and/or degrees from schools accredited by accrediting institutions recognized by the U.S. Department of Education.
Failure to provide all of the required information as stated in this vacancy announcement may result in an ineligible rating or may affect the overall rating.
How to Apply
To apply for this position, you must complete the online application and submit the documentation specified in the Required Documents section above.
The complete application package must be submitted by 11:59 PM (EST) on the closing date to receive consideration.
* To begin, click Apply Online to access an online application. Follow the prompts to select your USAJOBS resume and/or other supporting documents. You will need to be logged into your USAJOBS account or you may need to create a new account.
* You will be taken to an online application. Complete the online application, verify the required documentation, and submit the application. NOTE: Resumes up to a total of two pages will be accepted. Resumes exceeding two pages will be removed from consideration.
* You will receive an email notification when your application has been received for the announcement.
* To verify the status of your application, log into your USAJOBS account, ************************ select the Application Status link and then select the More Information link for this position. The Application Status page will display the status of your application, the documentation received and processed, and your responses submitted to the online application. Your uploaded documents may take several hours to clear the virus scan process.
To preview the questionnaire, please go to *********************************************************
Please review the General Application Information and Definitions at:
**************************************************************************************************************************
Agency contact information
DCSA Servicing Team
Phone: ************
Email: ******************
Address: Defense Counterintelligence and Security Agency
27130 Telegraph Road
Quantico, VA 22134
US
Next steps
Once you successfully complete the application process, you will receive a notification of receipt. Your application package will be reviewed to ensure you meet the basic eligibility and qualifications requirements, and you will receive a notification. A review will be made of your online questionnaire and the documentation you submitted to support your responses. A list of qualified applicants will be created and sent to the selecting official. All applicants reviewed and/or referred will receive a notification letter.
The selecting official may choose to conduct interviews, and once the selection is made, you will receive a notification of the decision.
* NOTE: If you submit a resume but no questionnaire, you cannot be considered for the position. If you submit a questionnaire but no resume, you cannot be considered for the position. Your application will be appropriately documented and you will be removed from further competition against this announcement.
REGARDING INTERVIEWS: Interviews may be required for this position. Accommodations may be made to conduct telephonic interviews to preclude travel hardships for applicants. Note: Declining to be interviewed or failure to report for a scheduled interview will be considered as a declination for further consideration for employment against this vacancy.
The Defense Counterintelligence and Security Agency provides reasonable accommodations to applicants with disabilities. If you need a reasonable accommodation for any part of the application and hiring process, please notify the point of contact for this job announcement. Your requests for reasonable accommodation will be addressed on a case-by-case basis.
This announcement may be used to fill additional vacancies.
Fair and transparent
The Federal hiring process is set up to be fair and transparent. Please read the following guidance.
* Criminal history inquiries
* Equal Employment Opportunity (EEO) Policy
* Financial suitability
* New employee probationary period
* Privacy Act
* Reasonable accommodation policy
* Selective Service
* Signature and false statements
* Social security number request
$63.8k-111.1k yearly 14d ago
Software Engineer II
Cox Enterprises 4.4
Data engineer job in Exton, PA
Company: Cox Automotive - USA
Job Family Group: Engineering / Product Development
Job Profile: Software Engineer II
Management Level: Individual Contributor
Flexible Work Option: Hybrid - Ability to work remotely part of the week
Travel %: No
Work Shift: Day
Compensation
Compensation includes a base salary of $98,300.00 - $147,500.00. The base salary may vary within the anticipated base pay range based on factors such as the ultimate location of the position and the selected candidate's knowledge, skills, and abilities. Position may be eligible for additional compensation that may include an incentive program.
Job Description
Our Software Engineers are energetic influencers who thrive on designing simple and scalable solutions to complex problems and delivering leading edge software products for our customers. We are looking for exceptionally ambitious and communicative hands-on individuals who are comfortable collaborating within the Agile methodology as part of a cross-functional team, have experience working in fast-paced environments, and have the passion and skills to take our product offerings to the next level.
As a Software Engineer II, you will work with other engineers on our team, delivering solutions for internal and external users in a collaborative team environment that encourages you to perform at your best. While contributing to the engineering efforts of one of our teams, you will be challenged to engineer right-sized solutions for complex business problems. You will apply your knowledge of modern software design, best practices, design patterns and frameworks, while ensuring our applications are performant and maintainable. You will aspire to use new technologies and challenge yourself to develop innovative solutions. You will work alongside developers and technical leads on a team where collaborative programming and mentoring is regularly practiced. Our applications are written in .Net Core and hosted in AWS.
Your Role
* Construct and manage services published to both internal and external consumers.
* Implement platform-level components including event architectures, messaging, and caching solutions.
* Write readable, maintainable, and efficient code.
* Design and implement REST APIs, services, system tasks and cloud solutions.
* Enhance performance and reliability of our current solutions.
* Be a passionate and flexible engineer who collaborates with your team to achieve the goal of building, deploying, monitoring and managing a highly performing and highly available production system.
* Explore open source or industry standard solutions that could be a fit for the organization.
* Collaborate with team members on best practices, code reviews, internal tools and process improvements.
* Engage with your agile delivery team to build a culture of passion, trust, and creativity.
Technologies we use
* .NET Core, C#, and ASP.NET
* Front End - Web Components, Stencil, React, NodeJS, Redux, Jest, Ext JS
* Database - DynamoDB, Postgres, Elasticsearch, Oracle
* AWS Cloud infrastructure
* ECS Fargate, EKS Fargate
* Lambda
* RDS
* Aurora
* S3
* EC2
* SQS/SNS
* VPC
* API Gateway
* CloudWatch
* Terraform
* Cake Builds
* GitHub Actions
Qualifications:
* Bachelor's degree in a related discipline and 2 years' experience in a related field. The right candidate could also have a different combination, such as a master's degree and up to 2 years' experience; or 14 years' experience in a related field.
* Object-oriented design experience
* Strong C#.NET skills, including:
* Extensive knowledge of .NET technology platforms
* Applied use of design patterns and REST
* MVC technologies such as AngularJS or ReactJS
* Experience in realizing applications from conception and design, to implementation and support.
* Experience designing and implementing applications with highly optimized and scalable architectures.
* Ability to work on multiple projects and be flexible to adapt to changing requirements.
* Ability to turn high-level requirements into a working system through iterative development.
* Proven ability to work collaboratively and independently to design, develop and deploy solutions.
* High energy, confident, ambitious, and self-motivated individual.
* Must be an effective communicator.
* Engages and mentors other Software Engineers.
* Embrace and learn new technologies.
Drug Testing
To be employed in this role, you'll need to clear a pre-employment drug test. Cox Automotive does not currently administer a pre-employment drug test for marijuana for this position. However, we are a drug-free workplace, so the possession, use or being under the influence of drugs illegal under federal or state law during work hours, on company property and/or in company vehicles is prohibited.
Benefits
The Company offers eligible employees the flexibility to take as much vacation with pay as they deem consistent with their duties, the company's needs, and its obligations; seven paid holidays throughout the calendar year; and up to 160 hours of paid wellness annually for their own wellness or that of family members. Employees are also eligible for additional paid time off in the form of bereavement leave, time off to vote, jury duty leave, volunteer time off, military leave, and parental leave.
About Us
Through groundbreaking technology and a commitment to stellar experiences for drivers and dealers alike, Cox Automotive employees are transforming the way the world buys, owns, sells - or simply uses - cars. Cox Automotive employees get to work on iconic consumer brands like Autotrader and Kelley Blue Book and industry-leading dealer-facing companies like vAuto and Manheim, all while enjoying the people-centered atmosphere that is central to our life at Cox. Benefits of working at Cox may include health care insurance (medical, dental, vision), retirement planning (401(k)), and paid days off (sick leave, parental leave, flexible vacation/wellness days, and/or PTO). For more details on what benefits you may be offered, visit our benefits page. Cox is an Equal Employment Opportunity employer - All qualified applicants/employees will receive consideration for employment without regard to that individual's age, race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender, gender identity, physical or mental disability, veteran status, genetic information, ethnicity, citizenship, or any other characteristic protected by law. Cox provides reasonable accommodations when requested by a qualified applicant or employee with disability, unless such accommodations would cause an undue hardship.
$98.3k-147.5k yearly Auto-Apply 10d ago
Software Engineer (Must be a US Citizen)
Ierus Technologies, Inc.
Data engineer job in Valley Forge, PA
IERUS Technologies is a growing, employee-owned business focused on providing leading-edge solutions across the electromagnetic spectrum. Our work supports defense and commercial customers with expertise in software development, systems engineering, and hardware design. We seek motivated engineers who thrive in collaborative, fast-paced environments and who want opportunities to contribute to multiple projects over time.
Position Overview:
IERUS is seeking a Software Engineer with strong C/C++ skills to support development, integration, and testing of systems for defense applications. This role will focus on National Instruments (NI) environments for test and control software, with opportunities to expand into other projects as business needs evolve. While LabWindows/CVI and LabVIEW are highly relevant, engineers with solid C/C++ backgrounds can learn these tools on the job.
This position requires periodic travel to Valley Forge, PA (up to 50% during contract execution).
Responsibilities:
Design, develop, integrate, and maintain C/C++ applications for test, control, and data acquisition systems.
Support development in National Instruments environments including LabWindows/CVI and LabVIEW.
Contribute to system verification, validation, and hardware-in-the-loop testing.
Perform software design, coding, debugging, documentation, and unit testing.
Collaborate with hardware, FPGA, and embedded engineers to support cross-disciplinary efforts.
Participate in peer reviews and ensure adherence to coding standards.
Support the transition of software into integration, operational, and sustainment phases.
Required Qualifications:
Bachelor's degree in Computer Science, Software Engineering, Electrical Engineering, or related field.
2+ years of professional experience with C/C++.
U.S. Citizenship
Active Secret clearance.
Familiarity with Git and modern configuration management practices.
Strong problem-solving, communication, and teamwork skills.
Desired Qualifications:
Experience with LabWindows/CVI or LabVIEW (National Instruments products).
Background in FPGA development and hardware description languages (HDLs) such as Verilog or VHDL.
Experience with embedded software development and real-time systems.
Exposure to automated test equipment and hardware-in-the-loop environments.
Familiarity with Agile development practices and task tracking tools such as Jira.
Ability to support multiple projects and adapt quickly to new challenges.
Location: Valley Forge, PA
IERUS Technologies is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
IERUS Technologies participates in E-Verify.
$69k-93k yearly est. 60d+ ago
Software Engineer, Platform - Allentown, USA
Speechify
Data engineer job in Allentown, PA
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its 2025 Design Award winner for Inclusivity.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers, AI research scientists, and others with backgrounds at Amazon, Microsoft, and Google, leading PhD programs like Stanford's, and high-growth startups like Stripe, Vercel, and Bolt, as well as many founders of their own companies.
Overview
The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text to speech, and external APIs.
This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Design, develop, and maintain robust APIs, including the public TTS API and internal APIs such as Payment, Subscription, Auth and Consumption Tracking, ensuring they meet business and scalability requirements
Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability
Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients
Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience
An Ideal Candidate Should Have
Proven experience in backend development: TS/Node (required)
Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers
Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact
Preferred: Experience with Docker and containerized deployments
Preferred: Proficiency in deploying high availability applications on Kubernetes
What We Offer
A dynamic environment where your contributions shape the company and its products
A team that values innovation, intuition, and drive
Autonomy, fostering focus and creativity
The opportunity to have a significant impact in a revolutionary industry
Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture
The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more
An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain
The United States-based salary range for this role is $140,000-$200,000 USD/year + bonus + stock, depending on experience.
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
$69k-93k yearly est. Auto-Apply 60d+ ago
IT Data Engineer
The Hartford 4.5
Data engineer job in Wayne, PA
Application Dev Analyst - 87ID6E
We're determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals - and to help others accomplish theirs, too. Join our team as we help shape the future.
Overview of the role:
Hartford Funds is seeking a motivated Developer to join our IT Team and support our enterprise data warehouse. This position focuses on both data management and development, using low-code ETL tools (such as Talend), SQL, and Python for data ingestion and visualization. The ideal candidate will develop effective solutions, while effectively communicating and collaborating with Business Analysts, Project Managers, and the Business Intelligence Team.
Responsibilities of the role:
Conduct data profiling, source-to-target mappings, (low-code) ETL and/or Python development, SQL tuning and optimization, testing and implementation (a brief illustrative sketch follows this list)
Provide business and technical analysis for initiatives focusing on Data Warehouse and Business Intelligence solutions
Influence and provide input on data warehouse design with experienced-based recommendations for optimal structuring of data to support efficient processing
Develop detailed technical specifications and ETL documentation in collaboration with Data Warehouse Architect and Business Intelligence Developers
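As a point of reference for the profiling and mapping work listed above, the following is a minimal sketch assuming pandas is available; the table, column names, and mapping are hypothetical and are not drawn from Hartford Funds' actual warehouse.

    # Illustrative only: basic data profiling followed by a simple
    # source-to-target mapping. All names and values are invented.
    import pandas as pd

    source = pd.DataFrame({
        "cust_id": [1, 2, 2, None],
        "acct_balance": ["100.50", "250.00", "250.00", "75.25"],
    })

    # Profiling: row count, null counts, and duplicate rows
    profile = {
        "rows": len(source),
        "null_counts": source.isna().sum().to_dict(),
        "duplicate_rows": int(source.duplicated().sum()),
    }
    print(profile)

    # Source-to-target mapping: drop nulls, de-duplicate, rename, and cast
    mapping = {"cust_id": "customer_id", "acct_balance": "account_balance"}
    target = (
        source.dropna(subset=["cust_id"])
              .drop_duplicates()
              .rename(columns=mapping)
              .astype({"customer_id": "int64", "account_balance": "float64"})
    )
    print(target.dtypes)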
Requirements:
Bachelor's degree in computer science, information technology or another computer-based discipline
3 years' experience developing solutions with a heavy focus on data and data movement
Advanced Python skills required
Databases: Strong experience with SQL-based databases (Snowflake is a plus) required
Platforms: Microsoft / Linux required
SQL: Expertise and fluency in SQL language required
Integration Patterns: experience with various integration patterns (e.g., flat files, web services) required (see the brief sketch after these requirements)
Knowledge of fundamental data modeling concepts (e.g., ER Diagrams, normalization, etc.) required
Experience working with low-code tools (such as ETL platforms)
Knowledge of data warehousing techniques (dimensional modeling / star schema, etc.) is required
Experience writing technical documentation is required
.Net / C# preferred
Snowflake / Talend preferred
Cloud platform experience (AWS/Azure) preferred
XML/XSLT experience is a plus
Must be comfortable/excited working in a fast-paced, high impact environment
Excellent troubleshooting and problem-solving skills; able to root-cause and debug complex code in an efficient manner and with appropriate urgency
Ability to obtain a good understanding of the business context of a problem to ensure the solution is optimal in solving the business need
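To make the integration-pattern requirement above concrete, here is a minimal, standard-library-only sketch of a flat-file-to-web-service hand-off; the file layout, field names, and the idea of a downstream loader endpoint are invented purely for illustration.

    # Illustrative only: parse a delimited flat file and build the JSON payload a
    # downstream web service might accept. Field names and values are invented.
    import csv
    import io
    import json

    # Stand-in for a flat file delivered to disk or an SFTP drop
    flat_file = io.StringIO("account_id|balance\nA100|250.75\nA200|13.40\n")

    records = [
        {"account_id": row["account_id"], "balance": float(row["balance"])}
        for row in csv.DictReader(flat_file, delimiter="|")
    ]

    # Web-service pattern: serialize the records for a hypothetical REST loader
    payload = json.dumps({"records": records}, indent=2)
    print(payload)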
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$85,000 - $110,000
Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
About Us | Our Culture | What It's Like to Work Here | Perks & Benefits
$85k-110k yearly Auto-Apply 60d+ ago
Information/Data Architect
Cleonorporated
Data engineer job in Reading, PA
Qualifications
Information/Data Architect
Reading, PA
6 Months.
Information/Data Architects are responsible for assisting in setting up the overall solution environment that solves the business problem from a data or information perspective. The Information Architect should have a basic understanding of various data and solution architectures, including those that leverage structured and unstructured data.
The individual must be familiar with the following terms and concepts:
Master data management
Metadata management
Data quality
Business intelligence/data warehousing
Data interoperability
Analytics
Data integration and aggregation
The Information Architect should be able to assist in the following:
Setting information architecture standards and methodologies
Identifying the information that the Institute produces and consumes
Creating the strategic requirements, principles, models and designs for information across the ecosystem
Assisting in the identification of user requirements
Defining the information architecture of the solution
Preparing data models and designing information structures and work and data flows
The preferred candidate should be familiar with and have experience in developing data models.
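As a brief illustration of what developing data models can look like in practice, the sketch below turns a small logical model into physical tables. It uses sqlite3 from the Python standard library only so the example is self-contained; the entities, attributes, and relationship are invented and are not taken from this engagement.

    # Illustrative only: a tiny logical model (customer 1-to-many account)
    # expressed as DDL and created in an in-memory database.
    import sqlite3

    DDL = """
    CREATE TABLE customer (
        customer_id   INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL
    );
    CREATE TABLE account (
        account_id  INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        opened_on   TEXT NOT NULL
    );
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
    print(tables)  # ['account', 'customer']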
Specific to this opportunity, the individual must be familiar with the following technologies:
Pivotal Cloud Foundry
Pivotal Big Data Suite
Data Lakes and other Big Data Concepts with proven experience in using data lakes
Hadoop (Cloudera).
Additional Information
All your information will be kept confidential according to EEO guidelines.
$90k-123k yearly est. 60d+ ago
Data Developer
ZP Group 4.0
Data engineer job in Wayne, PA
Piper Companies is seeking an IT Developer - Data Management to join a highly reputable investment management firm supporting their enterprise data warehouse and analytics ecosystem. This onsite role (Wayne, PA - 4 days/week) will play a key part in developing scalable data solutions using low-code ETL tools, SQL, and Python to drive strategic business insights.
RESPONSIBILITIES:
* Perform data profiling, source-to-target mappings, ETL (low-code) and/or Python development, SQL tuning, testing, and implementation
* Contribute to data warehouse design decisions and recommend optimal structuring strategies to improve processing efficiency
* Provide both business and technical analysis to support Data Warehouse and Business Intelligence initiatives
* Develop detailed technical specifications and ETL documentation in collaboration with Data Warehouse Architects and BI Developers
QUALIFICATIONS:
* Bachelor's degree in Computer Science, IT, or related field
* 3+ years of experience building data solutions and managing data movement
* Advanced Python and SQL skills, with experience across Microsoft and Linux environments (required)
* Experience with integration patterns (flat files, APIs/web services) and low-code ETL tools (required)
* Solid understanding of data modeling and warehousing concepts is required (ER diagrams, normalization, dimensional modeling/star schema); a brief illustrative sketch follows this list
* Proven ability to write clear technical documentation and troubleshoot complex code
* Ability to translate business needs into effective, value-driven technical solutions
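As referenced in the qualifications above, the sketch below shows a minimal star schema: one fact table joined to two dimension tables. It uses sqlite3 only so the example runs anywhere; the table names, keys, and values are hypothetical rather than the employer's actual warehouse design.

    # Illustrative only: a tiny star schema and the kind of query it supports.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
    INSERT INTO dim_date    VALUES (20251201, '2025-12-01');
    INSERT INTO dim_product VALUES (1, 'Widget');
    INSERT INTO fact_sales  VALUES (20251201, 1, 99.50);
    """)

    # A typical star-schema query joins the fact table to its dimensions
    for row in conn.execute("""
        SELECT d.calendar_date, p.product_name, SUM(f.amount) AS total
        FROM fact_sales f
        JOIN dim_date d    ON d.date_key    = f.date_key
        JOIN dim_product p ON p.product_key = f.product_key
        GROUP BY d.calendar_date, p.product_name
    """):
        print(row)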
COMPENSATION:
* Salary Range: $85K-$110K
* Full time
* Onsite Requirement: 4 days/week in Wayne, PA
* Comprehensive Benefits: Medical, Dental, Vision, 401K, PTO, Sick Leave as required by law, and Holidays
This job opens for applications on 12/10/2025. Applications for this job will be accepted for at least 30 days from the posting date.
KEYWORDS: Data Management, ETL Development, Python, SQL, Snowflake, Talend, Data Warehouse, Business Intelligence, Dimensional Modeling, ER Diagrams, AWS, Azure, C#, XML, Data Integration
#LI-LR1 #LI-ONSITE
How much does a data engineer earn in Reading, PA?
The average data engineer in Reading, PA earns between $70,000 and $127,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Reading, PA
$95,000
What are the biggest employers of Data Engineers in Reading, PA?
The biggest employers of Data Engineers in Reading, PA are: