
Data Engineer jobs at W. R. Berkley

- 802 jobs
  • Data Engineer

    The Phoenix Group (4.8 company rating)

    Fairfield, CT

    Data Engineer - Vice President (Greenwich, CT)

    About the Firm
    We are a global investment firm focused on applying financial theory to practical investment decisions. Our goal is to deliver long-term results by analyzing market data and identifying what truly matters. Technology is central to our approach, enabling insights across both traditional and alternative strategies.

    The Team
    A new Data Engineering team is being established to work with large-scale datasets across the organization. This team partners directly with researchers and business teams to build and maintain infrastructure for ingesting, validating, and provisioning large volumes of structured and unstructured data.

    Your Role
    As a Data Engineer, you will help design and build an enterprise data platform used by research teams to manage and analyze large datasets. You will also create tools to validate data, support back-testing, and extract actionable insights. You will work closely with researchers, portfolio managers, and other stakeholders to implement business requirements for new and ongoing projects. The role involves working with big data technologies and cloud platforms to create scalable, extensible solutions for data-intensive applications.

    What You'll Bring
    - 6+ years of relevant experience in data engineering or software development
    - Bachelor's, Master's, or PhD in Computer Science, Engineering, or related field
    - Strong coding, debugging, and analytical skills
    - Experience working directly with business stakeholders to design and implement solutions
    - Knowledge of distributed data systems and large-scale datasets
    - Familiarity with big data frameworks such as Spark or Hadoop
    - Interest in quantitative research (no prior finance or trading experience required)
    - Exposure to cloud platforms is a plus
    - Experience with Python, NumPy, pandas, or similar data analysis tools is a plus
    - Familiarity with AI/ML frameworks is a plus

    Who You Are
    - Thoughtful, collaborative, and comfortable in a fast-paced environment
    - Hard-working, intellectually curious, and eager to learn
    - Committed to transparency, integrity, and innovation
    - Motivated by leveraging technology to solve complex problems and create impact

    Compensation & Benefits
    - Salary range: $190,000 - $260,000 (subject to experience, skills, and location)
    - Eligible for annual discretionary bonus
    - Comprehensive benefits including paid time off, medical/dental/vision insurance, 401(k), and other applicable benefits

    We are an Equal Opportunity Employer. EEO/VET/DISABILITY. The Phoenix Group Advisors is an equal opportunity employer. We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
    $190k-260k yearly 2d ago
  • ERP Data Migration Consultant

    Oscar (4.6 company rating)

    Lakewood, CO

    Oscar is working with a leading ERP Advisory firm that is looking for an experienced ERP Data Migration Consultant to join their team. As the ERP Data Migration Consultant, you will be responsible for extracting, transforming, and loading legacy data into modern ERP platforms such as NetSuite, Microsoft Dynamics, Acumatica, and others. The ideal candidate is skilled in ETL processes, data mapping, cleansing, and scripting, and is comfortable collaborating directly with clients and cross-functional teams.

    Key Responsibilities:
    - Develop and maintain ETL scripts to extract, transform, and load data between legacy and ERP systems.
    - Access client legacy systems and convert raw data into structured database formats.
    - Map source data fields to target ERP data structures.
    - Cleanse, verify, and validate data using advanced SQL queries to ensure accuracy and quality.
    - Build SQL stored procedures to convert and prepare legacy data for new ERP environments.
    - Document and optimize data transformation steps and processes.
    - Automate data processing tasks using Microsoft SQL Server tools and scripting.
    - Load validated and transformed data into client ERP systems.
    - Coordinate with Accounting, Operations, and IT teams to ensure technical processes align with business objectives.
    - Deliver accurate, high-quality data migration results within project timelines.
    - Collaborate regularly with the EAG Data Migration team and client stakeholders.
    - Maintain clear communication with the consulting team to support seamless project execution.
    (A minimal extract-map-load sketch in Python follows this listing, for illustration.)

    Qualifications:
    - Bachelor's degree in Business Administration, Information Technology, Computer Information Systems, or a related discipline.
    - 2-4+ years of hands-on experience with SQL Server or MySQL.
    - Experience with Microsoft Access and application development tools.
    - Exposure to leading ERP systems such as NetSuite, Microsoft Dynamics, Acumatica, Infor, Epicor, Sage, Oracle, Workday, etc.
    - Knowledge of business processes in Accounting, Manufacturing, Distribution, or Construction.
    - Advanced proficiency in Microsoft Office applications (Excel, Word, PowerPoint).
    - Professional, approachable, and confident communication style.

    Recap:
    - Location: Lakewood, CO (Hybrid)
    - Type: Full-time, permanent
    - Rate: $80k - $150k annual salary, dependent on relevant experience

    If you think you're a good fit for the role, we'd love to hear from you!
    $80k-150k yearly 4d ago
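The listing above walks through a typical legacy-to-ERP migration sequence: extract from the legacy system, map source fields to the target structures, cleanse and validate, then load a staging table. As a rough, hypothetical illustration of that shape only, and not the firm's actual tooling, the Python sketch below uses pandas with an in-memory SQLite database; every table and column name is invented.

```python
# Hypothetical extract-map-cleanse-load step for a legacy-to-ERP migration.
# The in-memory SQLite database stands in for a client's legacy system and
# the ERP-side staging schema; all names are illustrative.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE legacy_customers (cust_no TEXT, cust_nm TEXT, open_dt TEXT);
    INSERT INTO legacy_customers VALUES ('  C001', 'Acme Corp ', '01/15/2019');
    INSERT INTO legacy_customers VALUES ('C002 ', ' Beta LLC', '03/02/2020');
    """
)

# Extract raw rows from the legacy table
src = pd.read_sql_query("SELECT cust_no, cust_nm, open_dt FROM legacy_customers", conn)

# Transform: map legacy column names onto target ERP field names, then cleanse
field_map = {"cust_no": "customer_id", "cust_nm": "customer_name", "open_dt": "opened_on"}
stg = src.rename(columns=field_map)
stg["customer_id"] = stg["customer_id"].str.strip()
stg["customer_name"] = stg["customer_name"].str.strip()
stg["opened_on"] = pd.to_datetime(stg["opened_on"], format="%m/%d/%Y").dt.date

# Validate before load: keys must be present and unique
assert stg["customer_id"].notna().all() and stg["customer_id"].is_unique

# Load into a staging table that ERP import tooling (or a stored procedure)
# would then pick up
stg.to_sql("erp_customer_staging", conn, index=False, if_exists="replace")
print(pd.read_sql_query("SELECT * FROM erp_customer_staging", conn))
```

In a real engagement the source would be the client's production database, and the final load would usually go through the ERP vendor's import mechanism rather than a direct table write.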
  • Data Engineer

    Hays (4.8 company rating)

    Seffner, FL

    Data Engineer (AI/ML Pipelines) - Contract to Hire - Seffner, FL - $48-67/hr.

    The final salary or hourly wage, as applicable, paid to each candidate/applicant for this position is ultimately dependent on a variety of factors, including, but not limited to, the candidate's/applicant's qualifications, skills, and level of experience as well as the geographical location of the position. Applicants must be legally authorized to work in the United States. Sponsorship not available.

    Our client is seeking a Data Engineer (AI/ML Pipelines) in Seffner, FL.

    Responsibilities
    The Data Engineer - AI/ML Pipelines is a pivotal role responsible for architecting, developing, and maintaining scalable data pipelines that support advanced analytics, machine learning, and real-time operational intelligence across enterprise systems. A key requirement for this role is hands-on experience working with Warehouse Management Systems (WMS), including the ability to ingest, normalize, and interpret data from WMS platforms to support business-critical operations. You'll collaborate closely with software engineers, data scientists, and cross-functional teams to drive initiatives in data transformation, governance, and automation. The role offers exposure to cutting-edge cloud-based technologies, direct mentorship, and well-defined pathways for advancement, including growth into senior or lead data engineering roles, specialization in machine learning engineering, or transition into software engineering roles.

    Skills & Requirements
    - 3+ years of experience in data engineering
    - Hands-on experience with Databricks, cloud-based data processing, and AI/ML workflows
    - Demonstrated ability to independently build production-grade data pipelines
    - Deep knowledge of data modeling, ETL/ELT, and data architecture in cloud environments
    - Proficiency in Python, SQL, and data engineering best practices

    Benefits/Other Compensation
    This position is a contract/temporary role where Hays offers you the opportunity to enroll in full medical benefits, dental benefits, vision benefits, 401K and Life Insurance ($20,000 benefit).

    Why Hays?
    You will be working with a professional recruiter who has intimate knowledge of the industry and market trends. Your Hays recruiter will lead you through a thorough screening process in order to understand your skills, experience, needs, and drivers. You will also get support on resume writing, interview tips, and career planning, so when there's a position you really want, you're fully prepared to get it. Nervous about an upcoming interview? Unsure how to write a new resume? Visit the Hays Career Advice section to learn top tips to help you stand out from the crowd when job hunting.

    Hays is committed to building a thriving culture of diversity that embraces people with different backgrounds, perspectives, and experiences. We believe that the more inclusive we are, the better we serve our candidates, clients, and employees. We are an equal employment opportunity employer, and we comply with all applicable laws prohibiting discrimination based on race, color, creed, sex (including pregnancy, sexual orientation, or gender identity), age, national origin or ancestry, physical or mental disability, veteran status, marital status, genetic information, HIV-positive status, as well as any other characteristic protected by federal, state, or local law. One of Hays' guiding principles is 'do the right thing'. We also believe that actions speak louder than words.
In that regard, we train our staff on ensuring inclusivity throughout the entire recruitment process and counsel our clients on these principles. If you have any questions about Hays or any of our processes, please contact us. In accordance with applicable federal, state, and local law protecting qualified individuals with known disabilities, Hays will attempt to reasonably accommodate those individuals unless doing so would create an undue hardship on the company. Any qualified applicant or consultant with a disability who requires an accommodation in order to perform the essential functions of the job should call or text ************. Drug testing may be required; please contact a recruiter for more information.
    $48-67 hourly 4d ago
  • Data Scientist

    Spot Pet Insurance (3.7 company rating)

    Miami, FL

    Who we are:
    Spot Pet Insurance is the fastest growing pet insurance company in North America. Our commitment to an exceptional end-to-end customer experience and our data-driven approach have quickly established us as a leading pet insurance provider. We're dedicated to providing pet parents with peace of mind by offering accessible and comprehensive coverage so their furry companions can lead happier, healthier lives. To demonstrate this, we recently joined forces with MrBeast to find homes for 100 homeless pets and committed to giving each of them pet insurance for life! Along the way, we've created a company culture that allows our employees to thrive, with perks like daily free meals, a pet-friendly office, and ridiculously fun company events every quarter. Our dedication to fostering a positive and rewarding work environment for our team has even earned us a Great Place to Work certification.

    About the Role:
    Love Pets? Love AI? Let's Talk. We're looking for a Data Scientist who treats AI like a trusted teammate and thrives in a collaborative, fast-moving environment. If you're already using large language models, AI coding assistants, and automated analysis tools every day, you'll fit right in here. At Spot, we help pet parents protect the animals they love. Your work will make that protection smarter and more personal.

    Key Responsibilities
    - Team up across the company to find problems worth solving with data.
    - Use AI tools (Claude, ChatGPT, Copilot, and others) to write, debug, and ship code faster.
    - Build predictive models for pricing, claims, fraud detection, and customer behavior.
    - Design experiments and measure what works. You know correlation isn't causation.
    - Run marketing mix modeling to show where our dollars work hardest.
    - Create customer models that help us earn trust and keep it.
    - Build internal tools and data products that help your teammates answer their own questions and make better decisions without waiting on you.
    - Share your findings in ways everyone can understand. Skip the jargon.
    - Keep learning. AI and machine learning move fast. So should you.

    Required Qualifications
    - Degree in Computer Science, Statistics, Mathematics, Engineering, or a related field.
    - Real experience as a Data Scientist in a fast-paced environment.
    - Strong programming skills in Python, R, and SQL, including data and ML libraries (pandas, NumPy, scikit-learn, TensorFlow, PyTorch).
    - Experience with BigQuery and Databricks.
    - Solid grounding in statistics, hypothesis testing, and experimental design.
    - Daily use of AI assistants for coding, analysis, and problem-solving. We'll ask for examples.
    - Experience building dashboards, self-serve tools, or internal data products for non-technical users.
    - You explain complex ideas clearly.

    About AI Proficiency
    This matters. We'll ask how you use AI tools in your current work. We want specifics, not buzzwords. If AI isn't already part of how you get things done, this role won't be a good fit. But if you're the type who's always looking for ways to work smarter, we'd love to hear from you.

    About Location
    We work together in our downtown Miami office five days a week. This is non-negotiable. We believe the best collaboration happens in person, and this role requires it.

    What we offer:
    - The opportunity to work on challenging and impactful projects at the intersection of design and data.
    - A collaborative and supportive work environment, recognized as a Great Place to Work.
    - Cell phone allowance of $100 per month
    - Health, dental, and vision benefits
    - Life insurance
    - ClassPass
    - Unlimited PTO
    - Bring your pet to work
    - Your pet insurance is covered (up to $100)
    - 401k with Company match
    - Annual performance-based bonus
    $64k-95k yearly est. 3d ago
  • Antivirus Engineer

    Hays (4.8 company rating)

    Vienna, VA

    Antivirus Engineer - Contract - Vienna, VA/Remote - $65.00 - $69.20/hr.

    The final salary or hourly wage, as applicable, paid to each candidate/applicant for this position is ultimately dependent on a variety of factors, including, but not limited to, the candidate's/applicant's qualifications, skills, and level of experience as well as the geographical location of the position. Applicants must be legally authorized to work in the United States. Sponsorship not available.

    Our client is seeking an Antivirus Engineer in Vienna, VA/Remote.

    Role Description
    Defender Performance Troubleshooting
    - Diagnose and resolve performance issues related to Microsoft Defender.
    - Review and interpret Client Analyzer logs.
    - Utilize tools such as ProcMon, MpPerformanceRecording, and similar for root cause analysis.
    - Recommend tuning strategies for Defender configurations to minimize resource impact.
    Policy Configuration & Deployment
    - Configure and deploy security policies via Intune, MECM, and Ansible.
    - Develop and execute testing methodologies for deployments.
    - Create documentation and adhere to established enterprise processes.
    Network & Telemetry Troubleshooting
    - Perform network diagnostics, including firewall analysis and Splunk queries for traffic validation.
    - Identify and resolve telemetry gaps or inconsistencies across endpoints.
    Compliance & Governance
    - Review and maintain security exclusions between test and production environments.
    - Ensure compliance with organizational and regulatory standards.
    Microsoft Security Stack Expertise
    - Hands-on experience with Defender for Endpoint, Microsoft Sentinel, and Azure/Defender for Cloud.
    - Ability to use advanced hunting queries (KQL).
    Security Posture & Risk Assessment
    - Conduct assessments of current security posture.
    - Review penetration test findings and recommend remediation strategies.

    Skills & Requirements
    - 8+ years of experience; hybrid preferred, remote optional
    - Microsoft Defender troubleshooting (performance issues, Client Analyzer logs, ProcMon, MpPerformanceRecording)
    - Policy configuration & deployment via Intune, MECM, Ansible; testing and documentation
    - Network & telemetry troubleshooting (firewall analysis, Splunk queries, telemetry gap resolution)
    - Compliance & governance (security exclusions, regulatory standards)
    - Microsoft security stack expertise: Defender for Endpoint, Microsoft Sentinel, Azure/Defender for Cloud, KQL queries
    - Security posture & risk assessment (penetration test review, remediation strategies)
    - Strong analytical and problem-solving capabilities
    - Effective communication and collaboration across technical and non-technical teams

    Benefits/Other Compensation
    This position is a contract/temporary role where Hays offers you the opportunity to enroll in full medical benefits, dental benefits, vision benefits, 401K and Life Insurance ($20,000 benefit).

    Why Hays?
    You will be working with a professional recruiter who has intimate knowledge of the industry and market trends. Your Hays recruiter will lead you through a thorough screening process in order to understand your skills, experience, needs, and drivers. You will also get support on resume writing, interview tips, and career planning, so when there's a position you really want, you're fully prepared to get it. Nervous about an upcoming interview? Unsure how to write a new resume? Visit the Hays Career Advice section to learn top tips to help you stand out from the crowd when job hunting.
    Hays is committed to building a thriving culture of diversity that embraces people with different backgrounds, perspectives, and experiences. We believe that the more inclusive we are, the better we serve our candidates, clients, and employees. We are an equal employment opportunity employer, and we comply with all applicable laws prohibiting discrimination based on race, color, creed, sex (including pregnancy, sexual orientation, or gender identity), age, national origin or ancestry, physical or mental disability, veteran status, marital status, genetic information, HIV-positive status, as well as any other characteristic protected by federal, state, or local law. One of Hays' guiding principles is 'do the right thing'. We also believe that actions speak louder than words. In that regard, we train our staff on ensuring inclusivity throughout the entire recruitment process and counsel our clients on these principles. If you have any questions about Hays or any of our processes, please contact us. In accordance with applicable federal, state, and local law protecting qualified individuals with known disabilities, Hays will attempt to reasonably accommodate those individuals unless doing so would create an undue hardship on the company. Any qualified applicant or consultant with a disability who requires an accommodation in order to perform the essential functions of the job should call or text ************. Drug testing may be required; please contact a recruiter for more information.
    $65-69.2 hourly 22h ago
  • Java Backend Engineer

    Hays (4.8 company rating)

    Phoenix, AZ

    Java Backend Developer (Vert.x & Spark - Good to Have)

    We're looking for a strong Java engineer with experience in backend development and web technologies. Vert.x and Apache Spark experience is a plus.

    Key Skills:
    - Java, web technologies
    - Vert.x & Spark (nice to have)
    - Team player, Agile mindset
    - Hybrid work (3 days onsite)
    $90k-119k yearly est. 3d ago
  • Cloud Engineer

    The Phoenix Group (4.8 company rating)

    New York, NY

    Cloud Infrastructure Engineer

    We are seeking a skilled Cloud Infrastructure Engineer to design, implement, and maintain secure, scalable, and resilient cloud infrastructure solutions. The role involves leveraging SaaS and cloud-based technologies to solve complex business challenges and support global operations.

    Responsibilities
    - Implement and support enterprise-scale cloud solutions and integrations.
    - Build and automate cloud infrastructure using IaC tools such as Terraform, CloudFormation, or ARM templates.
    - Deploy and support Generative AI platforms and cloud-based vendor solutions.
    - Implement and enforce cloud security best practices, including IAM, encryption, network segmentation, and compliance with industry standards.
    - Establish monitoring, logging, and alerting frameworks to ensure high availability and performance.
    - Optimize cost, performance, and reliability of cloud services.
    - Participate in on-call rotations and provide support for cloud infrastructure issues.
    - Maintain documentation, conduct knowledge transfer sessions, and perform design peer reviews.

    Experience Level
    - 5+ years in cloud infrastructure engineering, preferably in regulated industries.
    - Deep expertise in at least one major cloud platform (Azure, AWS, or GCP).
    - Proficient with Azure and related services (AI/ML tools, security, automation, governance).
    - Familiarity with SIEM, CNAPP, EDR, Zero Trust architecture, and MDM solutions.
    - Experience with SaaS integrations and managing third-party cloud services.
    - Understanding of virtualization, containerization, auto-scaling, and fully automated systems.
    - Experience scripting in PowerShell and Python; working knowledge of REST APIs.
    - Networking knowledge (virtual networks, DNS, SSL, firewalls) and IT change management.
    - Strong collaboration, interpersonal, and communication skills.
    - Willingness to participate in on-call rotations and after-hours support.

    The Phoenix Group Advisors is an equal opportunity employer. We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
    $83k-115k yearly est. 1d ago
  • Application Support Engineer

    The Phoenix Group (4.8 company rating)

    Fairfield, CT

    About Us
    We are a global investment firm focused on combining financial theory with practical application. Our goal is to deliver long-term results by cutting through market noise, identifying the most impactful factors, and developing ideas that stand up to rigorous testing. Over the years, we have built a reputation as innovators in portfolio management and alternative investment strategies. Our team values intellectual curiosity, honesty, and a commitment to understanding what drives financial markets. Collaboration, transparency, and openness to new ideas are central to our culture, fostering innovation and continuous improvement.

    Your Role
    We are seeking an Application Support Engineer to operate at the intersection of technical systems and business processes that power our investment operations. This individual contributor role involves supporting a complex technical environment, resolving production issues, and contributing to projects that enhance systems and processes. You will gain hands-on experience with cloud-deployed portfolio management and research systems and work closely with both business and technical teams. This role is ideal for someone passionate about technology and systems reliability, looking to grow into a systems reliability or engineering-focused position.

    Responsibilities
    - Develop and maintain expertise in the organization's applications to support internal users.
    - Manage user expectations and ensure satisfaction with our systems and tools.
    - Advocate for users with project management and development teams.
    - Work closely with QA to report and track issues identified by users.
    - Ensure proper escalation for unresolved issues to maintain user satisfaction.
    - Participate in production support rotations, including off-hours coverage.
    - Identify gaps in support processes and create documentation or workflows in collaboration with development and business teams.
    - Diagnose and resolve system issues, including debugging code, analyzing logs, and investigating performance or resource problems.
    - Collaborate across teams to resolve complex technical problems quickly and efficiently.
    - Maintain documentation of system behavior, root causes, and process improvements.
    - Contribute to strategic initiatives that enhance system reliability and operational efficiency.

    Qualifications
    - Bachelor's degree in Engineering, Computer Science, or equivalent experience.
    - 2+ years of experience supporting complex software systems, collaborating with business users and technical teams.
    - Hands-on technical skills including SQL and programming/debugging (Python preferred).
    - Strong written and verbal communication skills.
    - Ability to work independently and within small teams.
    - Eagerness to learn new technologies and automate manual tasks to improve system reliability.
    - Calm under pressure; demonstrates responsibility, maturity, and trustworthiness.

    Compensation & Benefits
    - Salary range: $115,000-$135,000 (may vary based on experience, location, or organizational needs).
    - Eligible for annual discretionary bonus.
    - Comprehensive benefits package including paid time off, medical/dental/vision coverage, 401(k), and other benefits as applicable.

    The Phoenix Group Advisors is an equal opportunity employer. We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status.
We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
    $115k-135k yearly 3d ago
  • Data Engineer II

    Capital Rx (4.1 company rating)

    Remote

    About Judi Health
    Judi Health is an enterprise health technology company providing a comprehensive suite of solutions for employers and health plans, including: Capital Rx, a public benefit corporation delivering full-service pharmacy benefit management (PBM) solutions to self-insured employers; Judi Health™, which offers full-service health benefit management solutions to employers, TPAs, and health plans; and Judi, the industry's leading proprietary Enterprise Health Platform (EHP), which consolidates all claim administration-related workflows in one scalable, secure platform. Together with our clients, we're rebuilding trust in healthcare in the U.S. and deploying the infrastructure we need for the care we deserve. To learn more, visit ****************

    Location: Remote (For Non-Local) or Hybrid (Local to NYC area or Denver, CO)

    Position Summary:
    We are seeking a highly motivated and talented Data Engineer to join our team and play a critical role in shaping the future of healthcare data management. This individual will be a key contributor in building robust, scalable, and accurate data systems that empower operational and analytics teams to make informed decisions and drive positive outcomes.

    Position Responsibilities:
    - Lead the relationship with operational and analytics teams to translate business needs into effective data solutions
    - Architect and implement ETL workflows leveraging Capital Rx platforms and technologies such as Python, dbt, SQLAlchemy, Terraform, Airflow, Snowflake, and Redshift
    - Conduct rigorous testing to ensure the flawless execution of data pipelines before production deployment
    - Identify, recommend, and implement process improvement initiatives
    - Proactively identify and resolve data-related issues, ensuring system reliability and data integrity
    - Lead moderately complex projects
    - Provide ongoing maintenance and support for critical data infrastructure, including 24x7 on-call availability
    - Responsible for adherence to the Capital Rx Code of Conduct, including reporting of noncompliance
    (A minimal Airflow-plus-dbt orchestration sketch follows this listing, for illustration.)

    Required Qualifications:
    - Bachelor's degree in Computer Science, Information Technology, or a related field
    - 2+ years of experience working with Airflow, dbt, and Snowflake
    - Expertise in data warehousing architecture techniques and familiarity with Kimball methodology
    - Minimum 3+ years of experience with a proven track record as a Data Engineer, displaying the ability to design, implement, and maintain complex data pipelines
    - 1+ year of experience in Python and SQL
    - Capacity to analyze the company's broader data landscape and architect scalable data solutions that support growth
    - Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders
    - A self-motivated and detail-oriented individual with the ability to tackle and solve intricate technical challenges

    Preferred Qualifications:
    - 1-3 years of experience as a Data Engineer, ideally in the healthcare or PBM sector
    - Advanced proficiency with Airflow, dbt, and Snowflake, coupled with 3+ years of SQL development and Python experience

    This range represents the low and high end of the anticipated base salary range for the NY-based position. The actual base salary will depend on several factors such as experience, knowledge, and skills, and if the location of the job changes.
    Salary Range: $120,000-$140,000 USD

    All employees are responsible for adherence to the Capital Rx Code of Conduct including the reporting of non-compliance. This position description is designed to be flexible, allowing management the opportunity to assign or reassign duties and responsibilities as needed to best meet organizational goals. Judi Health values a diverse workplace and celebrates the diversity that each employee brings to the table. We are proud to provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, medical condition, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. By submitting an application, you agree to the retention of your personal data for consideration for a future position at Judi Health. More details about Judi Health's privacy practices can be found at *********************************************
    $120k-140k yearly 12d ago
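The responsibilities above name Airflow for orchestration, dbt for in-warehouse transformation, and Snowflake/Redshift as targets. The sketch below is a minimal, hypothetical Airflow 2.x DAG illustrating that pattern only; it is not Capital Rx's actual pipeline, and the DAG id, task names, and paths are invented. It assumes the dbt CLI and a configured dbt project are available on the worker.

```python
# Hypothetical ELT orchestration sketch: land a raw batch, then run and test
# dbt models against the warehouse. Assumes Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_claims_batch(**_):
    # Placeholder extract step: a real pipeline might pull files from an
    # SFTP drop or an API and stage them in cloud storage.
    print("extracted claims batch")


with DAG(
    dag_id="claims_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x argument; newer releases use `schedule`
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_claims", python_callable=extract_claims_batch)

    # Transform in-warehouse with dbt; dbt reads its Snowflake connection from
    # its own profiles.yml, not from the DAG.
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt/claims")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt/claims")

    extract >> dbt_run >> dbt_test
```

Warehouse credentials would normally live in dbt profiles or Airflow connections rather than in the DAG file itself.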
  • Data Engineer (Remote, Continental United States)

    ICA.Ai (4.7 company rating)

    Arlington, VA

    About ICA, Inc.
    International Consulting Associates, Inc. is a rapidly growing company, located in the D.C./Metro area. We were founded in 2009 to assist government clients with evaluating and achieving their objectives. We have become a trusted advisor helping our clients by offering cutting-edge innovation and solutions to complex projects. Our small company has grown significantly, and we're overjoyed at the opportunity to expand yet again! We are results-focused and have a proven track record supporting federal agencies and large government services primes in three main areas: Research and Data Analysis, Advanced-Data Science, and Strategic Services. We currently support multiple analytics and research programs across HHS. At ICA, we believe our success starts with our people. We foster a collaborative "one team" environment where work-life balance isn't just talked about - it's prioritized. We're building dynamic, highly skilled teams in a welcoming and supportive atmosphere. If you're passionate about using your technical expertise to make a difference, we want to talk to you. We are looking for a Data Engineer to join our growing team!

    ABOUT THE ROLE:
    We are seeking an experienced Data Engineer to build and maintain data infrastructure supporting ICA's federal agency clients, including the FDA. You'll develop pipelines and platforms that transform raw data into actionable insights, working with analysts, data scientists, and developers in an agile environment.

    ABOUT YOU:
    As a data engineer with software development expertise, you bring a blend of analytical rigor and coding craftsmanship to every project. You excel at designing scalable data pipelines, optimizing performance, and ensuring data integrity across complex systems. Your strong programming skills allow you to build robust tools and services that empower data-driven decision-making. You collaborate seamlessly with cross-functional teams, translating business needs into technical solutions with clarity and precision. Above all, you are passionate about continuous learning and innovation, always seeking ways to improve systems and deliver value.

    RESPONSIBILITIES:
    - Design and maintain scalable data pipelines and ETL/ELT processes
    - Build document processing pipelines for text and image extraction
    - Architect AWS-based data solutions using S3, Glue, Redshift, RDS, ECS, etc.
    - Optimize SQL queries and develop Python-based data processing workflows
    - Troubleshoot data pipeline issues and implement solutions
    - Ensure data pipeline performance, scalability, and security

    REQUIRED QUALIFICATIONS:
    - 4+ years of experience working with ETL, Data Modeling, and Data Architecture
    - Expertise in writing and optimizing SQL
    - Experience with Big Data technologies such as Spark
    - Intermediate Linux skills
    - Experience in managing large data warehouses or data lakes
    - Minimum of 1 year of experience with programming in Python
    - Experience with data and cloud engineering
    - Knowledge of cloud computing services
    - Bachelor's degree, or higher
    - Ability to obtain a Public Trust Clearance

    PREFERRED QUALIFICATIONS:
    - Databricks Lakehouse platform experience
    - ML pipeline or graph algorithm implementation
    - Unstructured data processing expertise

    BENEFITS:
    We invest in our team members so you can live your best life professionally and personally, offering a competitive salary and benefits.
    - Health Insurance - 100% employer-paid premiums - ICA covers the full cost of one of three offered medical plans
    - Dental Insurance
    - Vision insurance
    - Health Spending Account
    - Flexible Spending Account
    - Life and Disability insurance
    - 401(k) plan with company match
    - Paid Time Off (Vacation, Sick Leave and Holidays)
    - Education and Professional Development Assistance
    - Remote work from anywhere within the continental United States

    LOCATION & TELEWORK
    This is a remote position. Candidates residing in the DMV area preferred.

    ADDITIONAL INFORMATION:
    ICA is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, national origin, genetics, disability status, protected veteran status, age, or any other characteristic protected by state, federal or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
    $87k-117k yearly est. 60d+ ago
  • Principal Data Scientist

    ServiceLink (4.7 company rating)

    Remote

    At ServiceLink, we believe in pushing the limits of what's possible through innovation. We're looking for a high-achieving AI enthusiast to lead ground-breaking initiatives that redefine our industry. As our Principal Data Scientist, you'll harness cutting-edge technologies, from advanced machine learning and deep learning to generative AI, Large Language Models, and Agentic AI, to create production-ready systems that solve real-world challenges. This is your opportunity to shape strategy, mentor top talent, and turn ambitious ideas into transformative solutions in an environment that champions bold thinking and continuous innovation. Applicants must be currently authorized to work in the United States on a full-time basis and must not require sponsorship for employment visa status now or in the future.

    A DAY IN THE LIFE
    In this role, you will…
    - Transform complex business challenges into innovative AI solutions that leverage deep learning, LLMs, and autonomous Agentic AI frameworks.
    - Lead projects end to end, from ideation and data gathering to model design, fine-tuning, deployment, and continuous improvement using full MLOps practices.
    - Collaborate closely with business stakeholders, Data Engineering, Product, and Infrastructure teams to ensure our AI solutions are powerful, secure, and scalable.
    - Drive both research and production by designing experiments, publishing state-of-the-art work in high-impact journals, and protecting strategic intellectual property.
    - Mentor and inspire our next generation of data scientists, sharing insights on emerging trends and best practices in AI.

    WHO YOU ARE
    You are…
    - A visionary leader with an advanced degree (Master's or Ph.D.) in Computer Science, Engineering, or a related field, backed by 10+ years of progressive experience in AI and data science.
    - A technical powerhouse with a solid track record in statistical analysis, machine learning, deep learning, and building production-grade models using transformer architectures and Agentic AI systems.
    - Proficient in Python, and comfortable with other modern programming environments, armed with real-world experience in cloud platforms (preferably Microsoft Azure) and end-to-end AI development (CRISP-DM and MLOps).
    - An exceptional communicator who can distill complex technical ideas into strategic insights for diverse audiences, from the boardroom to the lab.
    - A proactive problem solver and collaborative team player who thrives in a fast-paced, interdisciplinary setting, ready to balance innovative risk with practical execution.

    Responsibilities
    - Strategize with leadership and stakeholders to align AI innovations with business objectives, identifying risks, seizing opportunities, and driving measurable outcomes.
    - Architect and lead the development of next-generation AI solutions, with a special focus on Agentic AI, deep learning models, and transformer-based LLMs.
    - Build automated MLOps pipelines to ensure continuous integration, deployment, and monitoring of models across diverse data environments.
    - Act as both a thought leader and an active contributor, publishing in high-impact journals, representing ServiceLink at industry events, and safeguarding our IP.
    - Collaborate cross-functionally to ensure our AI systems are secure, scalable, and cost-effective, continuously refining them based on rigorous performance metrics.
    - Mentor and empower your peers, fostering a culture of innovation, resilience, and learning.
    - All other duties as assigned.

    Qualifications
    - Advanced degree (Master's or Ph.D.) in Computer Science, Engineering, or a related quantitative discipline, backed by 10+ years of relevant industry experience.
    - Demonstrated expertise in Python and practical experience deploying advanced ML/AI solutions, including deep learning, LLMs, and Agentic AI, in production environments.
    - Proficiency with modern cloud platforms (preferably Microsoft Azure) and a proven record of operationalizing AI via MLOps best practices.
    - Strong ability to balance innovation with practicality, evaluating technical capabilities versus business and cost considerations.
    - Excellent communicator with a knack for translating intricate technical strategies into clear, actionable plans.
    - A collaborative mindset with a history of mentoring teams and building high-impact technology solutions.
    $80k-113k yearly est. 26d ago
  • ETL Architect

    HealthPlan Services (4.7 company rating)

    Tampa, FL

    HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.

    Job Description
    Position: ETL Architect
    The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.

    Essential Job Functions and Duties:
    - Develop and maintain ETL jobs for data warehouses/marts
    - Design ETL via source-to-target mapping and design documents that consider security, performance tuning and best practices
    - Collaborate with delivery and technical team members on design and development
    - Collaborate with business partners to understand business processes, underlying data and reporting needs
    - Conduct data analysis in support of ETL development and other activities
    - Assist with data architecture and data modeling

    Preferred Qualifications:
    - 12+ years of work experience as a Business Intelligence Developer
    - Work experience with multiple database platforms and BI delivery solutions
    - 10+ years of experience with end-to-end ETL architecture, data modeling, BI and Analytics data marts, and implementing and supporting production environments
    - 10+ years of experience designing, building and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft and Tableau
    - Experience as a Data Architect
    - Experience delivering BI solutions with an Agile BI delivery methodology
    - Ability to communicate, present and interact comfortably with senior leadership
    - Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable actionable insights
    - Strong team player
    - Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
    - Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value
    - Strong relationship-building and interpersonal skills
    - Demonstrated self-confidence, honesty and integrity
    - Conscientious of the Enterprise Data Warehouse release management process; conduct operations readiness and environment compatibility review of any changes prior to deployment, with strong sensitivity around impact and SLAs
    - Experience with data modeling tools a plus
    - Expert in data warehousing methodologies and best practices (required)
    - Ability to initiate and follow through on complex projects of both short and long term duration (required)
    - Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input and keeps supervisor informed (required)
    - Proactive recommendation for improving the performance and operability of the data warehouse and reporting environment
    - Participate on interdepartmental teams to support organizational goals
    - Perform other related duties and tasks as assigned
    - Experience facilitating user sessions and gathering requirements

    Education Requirements:
    Bachelor's or equivalent degree in a business, technical, or related field

    Additional Information
    All your information will be kept confidential according to EEO guidelines.
    $84k-105k yearly est. 60d+ ago
  • Senior Data Engineer

    John Hancock (4.4 company rating)

    Boston, MA

    The Opportunity
    John Hancock/Manulife is looking for a technically strong, hands-on, collaborative Senior Data Engineer with Informatica experience who can bring thought leadership to the table and can build fabulous experiences for our customers.

    Work location: Boston, USA (ideally) or Toronto, Canada
    Work arrangement: Hybrid - 3 days in office, 2 days from home; a remote working arrangement is not available.

    Position Responsibilities:
    - Builds, codes, tests, and maintains high-quality software
    - Participates in Agile sprints and ceremonies; supports rapid iteration and development
    - Design, develop, and implement data integration solutions using Informatica PowerCenter to support business requirements.
    - Collaborate with business analysts and stakeholders to understand data needs and translate them into technical specifications.
    - Develop and maintain data mappings, transformations, and workflows within Informatica PowerCenter.
    - Optimize and tune ETL processes for efficiency and performance.
    - Manage and maintain database systems, ensuring data integrity and security.
    - Develop scripts for automation and improved data processing.
    - Troubleshoot and resolve issues related to data integration and ETL processes.
    - Document technical specifications and maintain comprehensive records of development activities.
    - Participate in code reviews to ensure best practices and quality standards are met.
    - Work with cross-functional teams to support data-related projects and initiatives.
    - Experience in Java is preferred but not required.
    - Performs various investigative "Spikes" in order to mitigate technical uncertainty and risk
    - Updates progress daily through the tracking tool (Jira) or Kanban board
    - Completes and ensures completion of any required documentation, e.g. deployment, maintenance, support and business needs
    - Participate in the weekly Look Ahead meetings to assist the Product Owner to refine the Product Backlog, including providing initial estimates
    - Apply disciplined coding practices to enable agility and delivery of high quality code

    Required Qualifications:
    - Minimum 8 years of IT experience
    - Proven experience of over 5 years as an Informatica Developer, with expertise in Informatica PowerCenter 10.5 and up.
    - Strong understanding of database systems, preferably with experience in DBMS such as DB2.
    - Proficiency in scripting languages such as Python, Shell, or Perl.
    - Experience with data modeling, data warehousing, and data integration concepts.
    - Experience with performance tuning and optimization of ETL processes.

    Preferred Qualifications:
    - Bachelor's Degree or equivalent in Computer Science

    When you join our team:
    We'll empower you to learn and grow the career you want. We'll recognize and support you in a flexible environment where well-being and inclusion are more than just words. As part of our global team, we'll support you in shaping the future you want to see.

    About Manulife and John Hancock
    Manulife Financial Corporation is a leading international financial services provider, helping people make their decisions easier and lives better. To learn more about us, visit *************************************************

    Manulife is an Equal Opportunity Employer
    At Manulife/John Hancock, we embrace our diversity. We strive to attract, develop and retain a workforce that is as diverse as the customers we serve and to foster an inclusive work environment that embraces the strength of cultures and individuals.
    We are committed to fair recruitment, retention, advancement and compensation, and we administer all of our practices and programs without discrimination on the basis of race, ancestry, place of origin, colour, ethnic origin, citizenship, religion or religious beliefs, creed, sex (including pregnancy and pregnancy-related conditions), sexual orientation, genetic characteristics, veteran status, gender identity, gender expression, age, marital status, family status, disability, or any other ground protected by applicable law. It is our priority to remove barriers to provide equal access to employment. A Human Resources representative will work with applicants who request a reasonable accommodation during the application process. All information shared during the accommodation request process will be stored and used in a manner that is consistent with applicable laws and Manulife/John Hancock policies. To request a reasonable accommodation in the application process, contact ************************.

    Referenced Salary Location: Boston, Massachusetts
    Working Arrangement: Hybrid
    Salary range is expected to be between $104,860.00 USD - $194,740.00 USD.

    If you are applying for this role outside of the primary location, please contact ************************ for the salary range for your location. The actual salary will vary depending on local market conditions, geography and relevant job-related factors such as knowledge, skills, qualifications, experience, and education/training. Employees also have the opportunity to participate in incentive programs and earn incentive compensation tied to business and individual performance. Manulife/John Hancock offers eligible employees a wide array of customizable benefits, including health, dental, mental health, vision, short- and long-term disability, life and AD&D insurance coverage, adoption/surrogacy and wellness benefits, and employee/family assistance plans. We also offer eligible employees various retirement savings plans (including pension/401(k) savings plans and a global share ownership plan with employer matching contributions) and financial education and counseling resources. Our generous paid time off program in the U.S. includes up to 11 paid holidays, 3 personal days, 150 hours of vacation, and 40 hours of sick time (or more where required by law) each year, and we offer the full range of statutory leaves of absence.

    Know Your Rights | Family & Medical Leave | Employee Polygraph Protection | Right to Work | E-Verify

    Company: John Hancock Life Insurance Company (U.S.A.)
    $104.9k-194.7k yearly 9d ago
  • Data Engineer

    John Hancock (4.4 company rating)

    Boston, MA

    What distinguishes this opportunity?
    At John Hancock Manulife, we are devoted to nurturing an environment where your skills and ambitions can prosper. As a Data Engineer, you will have a key role in constructing and managing our data systems and architecture. This versatile role, situated in Boston, MA, delivers an optimum mix of hybrid flexibility and collaborative in-person interactions, empowering you to succeed personally and professionally. This role is located in Boston and has a hybrid model of 3 days in the office.

    As a Data Engineer at John Hancock Manulife, you would play a meaningful role in the development and maintenance of the company's data systems and architecture. You would collaborate with other highly skilled data engineers, data architects and the governance team in designing, developing, and deploying data pipelines, ETL processes, and other data integration solutions to extract, transform, and load data from various sources into Manulife's data warehouse.

    Position Responsibilities
    - Build, develop, and maintain scalable ETL/ELT pipelines.
    - Develop and uphold the company's data architecture, including data models, schemas, and data dictionaries.
    - Integrate data from diverse sources, ensuring quality and consistency, and prepare it for analytical and operational uses.
    - Adhere to data governance policies and processes to safeguard data security, privacy, and compliance.
    - Collaborate with data professionals, including data scientists, analysts, and business team members, to comprehend data requirements and devise solutions to meet those needs.
    - Stay informed about the latest tools and technologies, finding opportunities to improve Manulife's data infrastructure.
    - Demonstrate the aptitude to learn and apply established methods in data engineering and cloud computing to ensure scalable, efficient, and effective solutions.

    Required Qualifications
    - 2+ years of background as a Data Engineer.
    - Demonstrable experience in building and operationalizing data pipelines.
    - Hands-on experience with programming languages like Python and SQL.
    - Solid grasp of data warehousing concepts and relational data modeling.
    - Strong working knowledge of SQL queries, Databricks, Synapse, and SQL Server.

    Preferred Qualifications
    - Familiar with DevOps and CI/CD pipelines in the automation of ETL workloads.
    - Familiar with cloud platforms such as Azure and AWS, along with related technologies.
    - Superb communication and interpersonal skills.
    - Strong analytical and problem-solving abilities.
    - Proficiency in data processing, performance analysis, tuning, and resource optimization.
    - Understanding of Agile Scrum methodologies and experience working in an Agile team, including familiarity with collaboration tools such as Teams and JIRA.
    - A dedication to continuous learning from both successes and failures, with an openness to change and improvement.
    - Familiar with production release and operational support models.

    When you join our team:
    We'll empower you to learn and grow the career you want. We'll recognize and support you in a flexible environment where well-being and inclusion are more than just words. As part of our global team, we'll support you in shaping the future you want to see.

    About Manulife and John Hancock
    Manulife Financial Corporation is a leading international financial services provider, helping people make their decisions easier and lives better.
    To learn more about us, visit *************************************************

    Manulife is an Equal Opportunity Employer
    At Manulife/John Hancock, we embrace our diversity. We strive to attract, develop and retain a workforce that is as diverse as the customers we serve and to foster an inclusive work environment that embraces the strength of cultures and individuals. We are committed to fair recruitment, retention, advancement and compensation, and we administer all of our practices and programs without discrimination on the basis of race, ancestry, place of origin, colour, ethnic origin, citizenship, religion or religious beliefs, creed, sex (including pregnancy and pregnancy-related conditions), sexual orientation, genetic characteristics, veteran status, gender identity, gender expression, age, marital status, family status, disability, or any other ground protected by applicable law. It is our priority to remove barriers to provide equal access to employment. A Human Resources representative will work with applicants who request a reasonable accommodation during the application process. All information shared during the accommodation request process will be stored and used in a manner that is consistent with applicable laws and Manulife/John Hancock policies. To request a reasonable accommodation in the application process, contact ************************.

    Referenced Salary Location: Boston, Massachusetts
    Working Arrangement: Hybrid
    Salary range is expected to be between $87,990.00 USD - $163,410.00 USD.

    If you are applying for this role outside of the primary location, please contact ************************ for the salary range for your location. The actual salary will vary depending on local market conditions, geography and relevant job-related factors such as knowledge, skills, qualifications, experience, and education/training. Employees also have the opportunity to participate in incentive programs and earn incentive compensation tied to business and individual performance. Manulife/John Hancock offers eligible employees a wide array of customizable benefits, including health, dental, mental health, vision, short- and long-term disability, life and AD&D insurance coverage, adoption/surrogacy and wellness benefits, and employee/family assistance plans. We also offer eligible employees various retirement savings plans (including pension/401(k) savings plans and a global share ownership plan with employer matching contributions) and financial education and counseling resources. Our generous paid time off program in the U.S. includes up to 11 paid holidays, 3 personal days, 150 hours of vacation, and 40 hours of sick time (or more where required by law) each year, and we offer the full range of statutory leaves of absence.

    Know Your Rights | Family & Medical Leave | Employee Polygraph Protection | Right to Work | E-Verify

    Company: John Hancock Life Insurance Company (U.S.A.)
    $88k-163.4k yearly 25d ago
  • Data Engineer III

    Spring Venture Group 3.9company rating

    Kansas City, MO jobs

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement, we help thousands of seniors across the country navigate the complex world of Medicare every day. Job Description This person has the opportunity to work primarily remotely in Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area. We are unable to sponsor for this role; this includes international students. OVERVIEW The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high-quality datasets that enable our business stakeholders and world-class Analytics department to make data-informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance to ensure availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor. ESSENTIAL DUTIES The essential duties for this role include, but are not limited to: Serve as a primary advisor to the Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks. Build advanced data pipelines utilizing the medallion architecture to create high-quality, single-source-of-truth data sources in Snowflake (see the sketch below). Architect replacements of current Data Management systems with respect to all aspects of data governance. Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores. Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves. Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development. Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores. Take ownership (both individually and as part of a team) of services and applications. Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements. Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests. Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code. Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value. 
Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity. Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business. Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm. Follow and embrace procedures of both the Data Management team and the SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance. Support after-hours and weekend releases from our internal Software Development teams. Actively participate in code review and weekly technicals with another more senior engineer or manager. Assist departments with time-critical SQL execution and debug database performance problems. ROLE COMPETENCIES The competencies for this role include, but are not limited to: Emotional Intelligence Drive for Results Continuous Improvement Communication Strategic Thinking Teamwork and Collaboration Qualifications POSITION REQUIREMENTS The requirements to fulfill this position are as follows: Bachelor's degree in Computer Science, or a related technical field. 4-7 years of practical production work in Data Engineering. Expertise in the Python programming language. Expertise in Snowflake. Expertise in SQL, databases, and query optimization. Must have experience in a large cloud provider such as AWS, Azure, or GCP. Advanced at reading code independently and understanding its intent. Advanced at writing readable, modifiable code that solves business problems. Ability to construct reliable and robust data pipelines to support both scheduled and event-based workflows. Working directly with stakeholders to create solutions. Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact. Additional Information Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: Competitive Compensation Medical, Dental, and Vision benefits after a short waiting period 401(k) matching program Life Insurance, and Short-term and Long-term Disability Insurance Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan Generous paid time off (PTO) program starting off at 15 days your first year 15 paid Holidays (includes holiday break between Christmas and New Year's) 10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave Annual Volunteer Time Off (VTO) and a donation matching program Employee Assistance Program (EAP) - health and well-being on and off the job Rewards and Recognition Diverse, inclusive and welcoming culture Training program and ongoing support throughout your Spring Venture Group career Security Responsibilities: Operating in alignment with policies and standards Reporting Security Incidents Completing assigned training Protecting assigned organizational assets Spring Venture Group is an Equal Opportunity Employer
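The duties above mention medallion-architecture pipelines that land a single source of truth in Snowflake. The sketch below shows what a minimal bronze-silver-gold load might look like using the snowflake-connector-python package; the stage, schema, table, and warehouse names are hypothetical, the bronze table is assumed to already exist with a single VARIANT column, and nothing here reflects SVG's actual setup.

```python
# A minimal medallion (bronze -> silver -> gold) sketch against Snowflake.
# Assumes snowflake-connector-python is installed and that
# bronze.lead_events(payload VARIANT) already exists. All names are hypothetical.
import os
import snowflake.connector

MEDALLION_STEPS = [
    # Bronze: land raw JSON events from a hypothetical external stage.
    """
    COPY INTO bronze.lead_events
    FROM @raw_stage/lead_events/
    FILE_FORMAT = (TYPE = 'JSON')
    """,
    # Silver: deduplicate and apply types to the raw payload.
    """
    CREATE OR REPLACE TABLE silver.lead_events AS
    SELECT DISTINCT
        payload:lead_id::STRING     AS lead_id,
        payload:event_type::STRING  AS event_type,
        payload:event_ts::TIMESTAMP AS event_ts
    FROM bronze.lead_events
    """,
    # Gold: a reporting-friendly, single-source-of-truth aggregate.
    """
    CREATE OR REPLACE TABLE gold.daily_lead_activity AS
    SELECT event_ts::DATE AS activity_date,
           event_type,
           COUNT(*)       AS event_count
    FROM silver.lead_events
    GROUP BY 1, 2
    """,
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical warehouse name
    database="DATA_PLATFORM",   # hypothetical database name
)
try:
    cur = conn.cursor()
    for step in MEDALLION_STEPS:
        cur.execute(step)
finally:
    conn.close()
```

A production version would typically be scheduled or event-driven, with monitoring on each layer, rather than a single linear script.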
    $75k-98k yearly est. 8h ago
  • Principal Data Scientist

    Servicelink 4.7company rating

    Plano, TX jobs

    At ServiceLink, we believe in pushing the limits of what's possible through innovation. We're looking for a high-achieving AI enthusiast to lead ground-breaking initiatives that redefine our industry. As our Principal Data Scientist, you'll harness cutting-edge technologies, from advanced machine learning and deep learning to generative AI, Large Language Models, and Agentic AI, to create production-ready systems that solve real-world challenges. This is your opportunity to shape strategy, mentor top talent, and turn ambitious ideas into transformative solutions in an environment that champions bold thinking and continuous innovation. Applicants must be currently authorized to work in the United States on a full-time basis and must not require sponsorship for employment visa status now or in the future. A DAY IN THE LIFE In this role, you will… Transform complex business challenges into innovative AI solutions that leverage deep learning, LLMs, and autonomous Agentic AI frameworks. Lead projects end to end, from ideation and data gathering to model design, fine-tuning, deployment, and continuous improvement using full MLOps practices. Collaborate closely with business stakeholders, Data Engineering, Product, and Infrastructure teams to ensure our AI solutions are powerful, secure, and scalable. Drive both research and production by designing experiments, publishing state-of-the-art work in high-impact journals, and protecting strategic intellectual property. Mentor and inspire our next generation of data scientists, sharing insights on emerging trends and best practices in AI. WHO YOU ARE You are… A visionary leader with an advanced degree (Master's or Ph.D.) in Computer Science, Engineering, or a related field, backed by 10+ years of progressive experience in AI and data science. A technical powerhouse with a solid track record in statistical analysis, machine learning, deep learning, and building production-grade models using transformer architectures and Agentic AI systems. Proficient in Python, comfortable with other modern programming environments, and armed with real-world experience in cloud platforms (preferably Microsoft Azure) and end-to-end AI development (CRISP-DM and MLOps). An exceptional communicator who can distill complex technical ideas into strategic insights for diverse audiences, from the boardroom to the lab. A proactive problem solver and collaborative team player who thrives in a fast-paced, interdisciplinary setting, ready to balance innovative risk with practical execution. Responsibilities Strategize with leadership and stakeholders to align AI innovations with business objectives: identifying risks, seizing opportunities, and driving measurable outcomes. Architect and lead the development of next-generation AI solutions, with a special focus on Agentic AI, deep learning models, and transformer-based LLMs. Build automated MLOps pipelines to ensure continuous integration, deployment, and monitoring of models across diverse data environments. Act as both a thought leader and an active contributor: publishing in high-impact journals, representing ServiceLink at industry events, and safeguarding our IP. Collaborate cross-functionally to ensure our AI systems are secure, scalable, and cost-effective, continuously refining them based on rigorous performance metrics. Mentor and empower your peers, fostering a culture of innovation, resilience, and learning. All other duties as assigned. Qualifications Advanced degree (Master's or Ph.D.) in Computer Science, Engineering, or a related quantitative discipline, backed by 10+ years of relevant industry experience. Demonstrated expertise in Python and practical experience deploying advanced ML/AI solutions, including deep learning, LLMs, and Agentic AI, in production environments. Proficiency with modern cloud platforms (preferably Microsoft Azure) and a proven record of operationalizing AI via MLOps best practices. Strong ability to balance innovation with practicality, evaluating technical capabilities versus business and cost considerations. Excellent communicator with a knack for translating intricate technical strategies into clear, actionable plans. A collaborative mindset with a history of mentoring teams and building high-impact technology solutions.
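To make the Agentic AI terminology above concrete, here is a minimal, framework-free sketch of an agent loop: at each step the model either requests a registered tool or returns a final answer, and a controller executes that decision. The `call_llm` stub, the sample tool, and the JSON action format are hypothetical placeholders for whichever LLM provider and tools a real system would use.

```python
# A minimal agent-loop sketch. The LLM call is stubbed so the example runs on
# its own; the tool and message format are hypothetical.
import json
from typing import Callable, Dict

def lookup_order_status(order_id: str) -> str:
    """Hypothetical business tool the agent is allowed to call."""
    return f"Order {order_id} is in transit."

TOOLS: Dict[str, Callable[[str], str]] = {"lookup_order_status": lookup_order_status}

def call_llm(messages: list) -> str:
    """Placeholder for a real chat-completion call. Here it returns a canned
    'final' action so the sketch runs without any external service."""
    return json.dumps({"action": "final", "tool": "", "input": "",
                       "answer": "A real model's answer would appear here."})

def run_agent(user_request: str, max_steps: int = 5) -> str:
    messages = [
        {"role": "system",
         "content": ('Reply with JSON: {"action": "tool" or "final", '
                     '"tool": <name>, "input": <text>, "answer": <text>}')},
        {"role": "user", "content": user_request},
    ]
    for _ in range(max_steps):
        decision = json.loads(call_llm(messages))
        if decision["action"] == "final":
            return decision["answer"]
        observation = TOOLS[decision["tool"]](decision["input"])  # run the tool
        messages.append({"role": "assistant", "content": json.dumps(decision)})
        messages.append({"role": "user", "content": f"Observation: {observation}"})
    return "Stopped after reaching the step limit."

print(run_agent("Where is order 12345?"))
```

Production systems wrap a loop like this in the MLOps concerns the posting describes: evaluation, guardrails, monitoring, and automated deployment.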
    $72k-98k yearly est. 60d+ ago
  • Data Engineer (Mid-Junior Level)

    Preferred Risk Insurance 4.1company rating

    Chicago, IL jobs

    Job Description We are seeking the ideal candidate for our Data Engineer opening at our growing insurance company. As a Data Engineer, you will support our development and business teams by designing, building, and maintaining analytical data pipelines and data warehouse solutions. This role is focused on modernizing our enterprise data platform using Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Azure Data Factory (ADF). The Data Engineer will work closely with senior engineers and analysts to ingest, transform, and model data at scale, supporting our transition from legacy, heterogeneous data systems into a centralized, cloud-based data lake and data warehouse optimized for analytics and reporting. DUTIES & RESPONSIBILITIES: Design and develop data ingestion and transformation pipelines using Azure Data Factory Load, transform, and manage data in Azure Data Lake Storage (ADLS) and Azure Synapse Analytics Contribute to the design and implementation of a centralized cloud data warehouse Apply data warehousing and distributed data processing concepts to support large analytical datasets Assist with data modeling (fact and dimension tables) to support reporting and analytics use cases Support the migration of legacy SQL Server, SSIS, and SSRS data into Azure-based platforms Perform data quality checks and validation to ensure accuracy and consistency (see the example below) Collaborate with analytics, reporting, and application teams to deliver trusted datasets Participate in documentation, testing, and continuous improvement of data pipelines and platform architecture QUALIFICATIONS REQUIRED: 3-5 years of experience in data engineering, data warehousing, or analytics-focused roles Strong SQL skills for data transformation, validation, and analytical querying Solid understanding of data warehousing principles and analytical data models Experience or strong exposure to Azure data services, including: Azure Data Lake Storage (ADLS) Azure Synapse Analytics Azure Data Factory (ADF) Familiarity with ETL / ELT concepts and production data workflows Understanding of distributed data processing concepts and working with large datasets Experience working with structured and semi-structured data Comfortable using AI-assisted development tools (GitHub Copilot, ChatGPT, etc.) Preferred: Experience with Apache Spark or Hadoop Exposure to Databricks, Snowflake, or similar modern data platforms Familiarity with dbt or analytics engineering workflows Knowledge of modern data formats such as Parquet or Delta Experience supporting on-prem to Azure data platform migrations Exposure to Power BI or other analytics and visualization tools Understanding of data governance, metadata, or data quality frameworks Experience working in agile environments or collaborating across technical teams Preferred Risk Insurance Services provides a competitive benefits package to all full-time employees. Following are some of the perks Preferred Risk employees receive: Competitive Salaries Commitment to your Training & Development Medical, Dental, and Vision Reimbursement Short Term Disability/Long Term Disability Life Insurance Flexible Spending Account Telemedicine Benefit 401k with a generous company match Paid Time Off and Paid Holidays Tuition Reimbursement Wellness Program Fun company-sponsored events And so much more! Estimated Compensation Range: $80,000/year-$100,000/year* *Published ranges are estimates. Offered compensation will be based on experience, skills, education, certifications, and geographic location. 
In addition, starting salary may vary by position depending on whether the position is in-office, hybrid or remote.
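Since the posting emphasizes data-quality checks on data landed in the lake, here is a minimal pandas sketch of that idea: load an extract, run a few rule-based checks, and report any failures. The file path, column names, and thresholds are hypothetical.

```python
# A minimal data-quality validation sketch using pandas.
# The path, columns, and tolerances below are hypothetical.
import pandas as pd

REQUIRED_COLUMNS = ["policy_id", "effective_date", "written_premium"]

def validate_policies(df: pd.DataFrame) -> list:
    """Return a list of human-readable data-quality failures (empty means clean)."""
    problems = []
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        return [f"missing columns: {missing}"]
    if df["policy_id"].duplicated().any():
        problems.append("duplicate policy_id values found")
    if df["written_premium"].lt(0).any():
        problems.append("negative written_premium values found")
    null_rate = df["effective_date"].isna().mean()
    if null_rate > 0.01:  # tolerate up to 1% missing effective dates
        problems.append(f"effective_date null rate too high: {null_rate:.2%}")
    return problems

if __name__ == "__main__":
    policies = pd.read_parquet("raw/policies/2024-06.parquet")  # hypothetical extract
    for issue in validate_policies(policies):
        print("DQ FAILURE:", issue)
```

In an Azure pipeline, checks like these would typically run as a validation step after an ADF copy activity, failing the run or raising an alert when a rule is violated.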
    $80k-100k yearly 7d ago
  • Data Engineer (Mid-Junior Level)

    Preferred Risk Insurance 4.1company rating

    Bedford Park, IL jobs

    We are seeking the ideal candidate for our Data Engineer opening at our growing insurance company. As a Data Engineer, you will support our development and business teams by designing, building, and maintaining analytical data pipelines and data warehouse solutions. This role is focused on modernizing our enterprise data platform using Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Azure Data Factory (ADF). The Data Engineer will work closely with senior engineers and analysts to ingest, transform, and model data at scale, supporting our transition from legacy, heterogeneous data systems into a centralized, cloud-based data lake and data warehouse optimized for analytics and reporting. DUTIES & RESPONSIBILITIES: Design and develop data ingestion and transformation pipelines using Azure Data Factory Load, transform, and manage data in Azure Data Lake Storage (ADLS) and Azure Synapse Analytics Contribute to the design and implementation of a centralized cloud data warehouse Apply data warehousing and distributed data processing concepts to support large analytical datasets Assist with data modeling (fact and dimension tables) to support reporting and analytics use cases (see the sketch below) Support the migration of legacy SQL Server, SSIS, and SSRS data into Azure-based platforms Perform data quality checks and validation to ensure accuracy and consistency Collaborate with analytics, reporting, and application teams to deliver trusted datasets Participate in documentation, testing, and continuous improvement of data pipelines and platform architecture QUALIFICATIONS REQUIRED: 3-5 years of experience in data engineering, data warehousing, or analytics-focused roles Strong SQL skills for data transformation, validation, and analytical querying Solid understanding of data warehousing principles and analytical data models Experience or strong exposure to Azure data services, including: Azure Data Lake Storage (ADLS) Azure Synapse Analytics Azure Data Factory (ADF) Familiarity with ETL / ELT concepts and production data workflows Understanding of distributed data processing concepts and working with large datasets Experience working with structured and semi-structured data Comfortable using AI-assisted development tools (GitHub Copilot, ChatGPT, etc.) Preferred: Experience with Apache Spark or Hadoop Exposure to Databricks, Snowflake, or similar modern data platforms Familiarity with dbt or analytics engineering workflows Knowledge of modern data formats such as Parquet or Delta Experience supporting on-prem to Azure data platform migrations Exposure to Power BI or other analytics and visualization tools Understanding of data governance, metadata, or data quality frameworks Experience working in agile environments or collaborating across technical teams Preferred Risk Insurance Services provides a competitive benefits package to all full-time employees. Following are some of the perks Preferred Risk employees receive: Competitive Salaries Commitment to your Training & Development Medical, Dental, and Vision Reimbursement Short Term Disability/Long Term Disability Life Insurance Flexible Spending Account Telemedicine Benefit 401k with a generous company match Paid Time Off and Paid Holidays Tuition Reimbursement Wellness Program Fun company-sponsored events And so much more! Estimated Compensation Range: $80,000/year-$100,000/year* *Published ranges are estimates. Offered compensation will be based on experience, skills, education, certifications, and geographic location. 
In addition, starting salary may vary by position depending on whether the position is in-office, hybrid or remote.
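As an illustration of the fact and dimension modeling this role assists with, the sketch below holds SQL Server-flavored DDL for a small star schema, a policy dimension and a written-premium fact, kept as strings in Python so they can be reviewed or handed to deployment tooling. All table and column names are hypothetical.

```python
# Hypothetical star-schema DDL (one dimension, one fact) in SQL Server-style T-SQL.
DDL_STATEMENTS = [
    """
    CREATE TABLE dim_policy (
        policy_key    INT IDENTITY(1,1) PRIMARY KEY,
        policy_number VARCHAR(20) NOT NULL,
        product_line  VARCHAR(50) NOT NULL,
        state_code    CHAR(2)     NOT NULL
    )
    """,
    """
    CREATE TABLE fact_written_premium (
        policy_key       INT NOT NULL REFERENCES dim_policy (policy_key),
        transaction_date DATE NOT NULL,
        written_premium  DECIMAL(18, 2) NOT NULL
    )
    """,
]

if __name__ == "__main__":
    # Print the statements; a real project would apply them through a migration
    # or deployment step rather than ad hoc.
    for statement in DDL_STATEMENTS:
        print(statement.strip() + ";\n")
```

Note that dedicated Synapse SQL pools restrict enforced primary and foreign keys, so the constraints above would need adjusting if the target is Synapse rather than SQL Server.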
    $80k-100k yearly 35d ago

Learn more about W. R. Berkley jobs