Data Science Engineer - Remote (Req. #748)
Data engineer job at Mindex
Founded in 1994 and celebrating 30 years in business, Mindex is a software development company with a rich history of demonstrated software and product development success. We specialize in agile software development, cloud professional services, and creating our own innovative products. We are proud to be recognized as the #1 Software Developer in the 2023 RBJ's Book of Lists and ranked 27th in Rochester Chamber's Top 100 Companies. Additionally, we have maintained our certification as a Great Place to Work for consecutive years. Our list of satisfied clients and #ROCstar employees are both growing rapidly. Are you next to join our team?
Mindex's Software Development division is the go-to software developer for enterprise organizations looking to engage teams of skilled technical resources to help them plan, navigate, and execute through the full software development lifecycle.
We are seeking a highly skilled and motivated Data Science Engineer to join our AI Platform team.
Essential Functions
This role will be pivotal in building and scaling our data-driven products and services. You will transform raw data into actionable intelligence, develop and deploy robust machine learning models, and help establish foundational MLOps workflows on modern cloud infrastructure.
Key Responsibilities:
Design and implement scalable data pipelines to ingest, process, and transform large datasets (structured & unstructured).
Develop, validate, and optimize supervised and unsupervised machine learning models leveraging Python, SQL, and modern libraries.
Conduct feature engineering, model selection, and statistical modeling to deliver high-impact solutions.
Build and expose model APIs or containerized workflows for seamless integration and deployment in production environments (a minimal sketch follows this list).
Apply MLOps best practices to model versioning, testing, monitoring, and deployment.
Work with Big Data technologies such as Databricks and Snowflake to unlock analytics at scale.
Orchestrate complex workflows using tools like Airflow or Dagster for automation and reliability.
Collaborate with AI teams to refine prompt engineering and leverage AI tooling for model fine-tuning and augmentation.
Maintain familiarity with leading cloud platforms (AWS, Azure, GCP) for model training, deployment, and infrastructure management.
Partner with product, engineering, and business teams to translate requirements into technical solutions.
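To make the model-API bullet above concrete, here is a minimal sketch of serving a trained model over REST with FastAPI, one of the frameworks named in the requirements below. The model file, feature names, and endpoint shape are illustrative assumptions, not Mindex's actual stack.

```python
# Minimal model-serving sketch (illustrative; model path and feature names are hypothetical).
import joblib
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed: a fitted scikit-learn estimator saved earlier

class Features(BaseModel):
    # Hypothetical feature schema; replace with the model's real inputs.
    tenure_months: float
    monthly_spend: float

@app.post("/predict")
def predict(features: Features) -> dict:
    # Build a single-row frame so column names line up with the training data.
    row = pd.DataFrame([features.model_dump()])  # Pydantic v2
    return {"prediction": float(model.predict(row)[0])}
```

Run locally with `uvicorn main:app`; packaging the same app in a small Docker image would cover the containerized-workflow half of the bullet.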
Requirements
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Statistics, or a related field.
3+ years of experience in data science engineering or related roles.
Proficiency in Python and SQL for data extraction, analysis, and modeling.
Strong background in statistical modeling and machine learning algorithms (supervised and unsupervised).
Experience with feature engineering and end-to-end model development.
Hands-on experience with MLOps foundations (CI/CD, model monitoring, automated retraining).
Familiarity with Big Data tools (Databricks, Snowflake, Spark).
Experience with workflow orchestration platforms such as Airflow or Dagster.
Understanding of cloud architecture and deployment (AWS, Azure, GCP).
Experience deploying models as APIs or containers (Docker, FastAPI, Flask).
Familiarity with prompt engineering techniques and AI tooling for cutting-edge model development.
Excellent problem-solving and communication skills.
Preferred
Experience with advanced AI tools (e.g., LLMs, vector databases).
Exposure to data visualization tools and dashboarding.
Knowledge of security, privacy, and compliance in ML workflows.
Physical Conditions/Requirements
Prolonged periods sitting at a desk and working on a computer
No heavy lifting is expected; exertion of up to 10 lbs. of force may occasionally be required.
Benefits
Health insurance
Paid holidays
Flexible time off
401k retirement savings plan and company match with pre-tax and Roth options
Dental insurance
Vision insurance
Employer paid disability insurance
Life insurance and AD&D insurance
Employee assistance program
Flexible spending accounts
Health savings account with employer contributions
Accident, critical illness, hospital indemnity, and legal assistance
Adoption assistance
Domestic partner coverage
Mindex Perks
Tickets to local sporting events
Teambuilding events
Holiday and celebration parties
Professional Development
Leadership training
Licensed access to Udemy online training courses
Growth opportunities
The band range for this role takes into account the wide range of factors that are considered in making compensation decisions including, but not limited to, skill sets, education, experience, training, certifications, internal equity, and other business and organizational needs. It is not typical for an individual to be hired at, or near, the top of the range for their role, and compensation decisions are dependent on the facts and circumstances of each case. The range for this role is $90,000-$120,000.
Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.
Data Engineer
New York, NY jobs
DL Software produces Godel, a financial information and trading terminal.
Role Description
This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.
Qualifications
Strong proficiency in Data Engineering and Data Modeling
Mandatory: strong experience in global financial instruments including equities, fixed income, options and exotic asset classes
Strong Python background
Expertise in Extract, Transform, Load (ETL) processes and tools
Experience in designing, managing, and optimizing Data Warehousing solutions
Cloud Data Engineer
New York, NY jobs
Title: Enterprise Data Management - Data Cloud, Senior Developer I
Duration: FTE/Permanent
Salary: $130-165k
The Data Engineering team oversees the organization's central data infrastructure, which powers enterprise-wide data products and advanced analytics capabilities in the investment management sector. We are seeking a senior cloud data engineer to spearhead the architecture, development, and rollout of scalable, reusable data pipelines and products, emphasizing the creation of semantic data layers to support business users and AI-enhanced analytics. The ideal candidate will work hand-in-hand with business and technical groups to convert intricate data needs into efficient, cloud-native solutions using cutting-edge data engineering techniques and automation tools.
Responsibilities:
Collaborate with business and technical stakeholders to collect requirements, pinpoint data challenges, and develop reliable data pipeline and product architectures.
Design, build, and manage scalable data pipelines and semantic layers using platforms like Snowflake, dbt, and similar cloud tools, prioritizing modularity for broad analytics and AI applications.
Create semantic layers that facilitate self-service analytics, sophisticated reporting, and integration with AI-based data analysis tools.
Build and refine ETL/ELT processes with contemporary data technologies (e.g., dbt, Python, Snowflake) to achieve top-tier reliability, scalability, and efficiency (a minimal sketch follows this list).
Incorporate and automate AI analytics features atop semantic layers and data products to enable novel insights and process automation.
Refine data models (including relational, dimensional, and semantic types) to bolster complex analytics and AI applications.
Advance the data platform's architecture, incorporating data mesh concepts and automated centralized data access.
Champion data engineering standards, best practices, and governance across the enterprise.
Establish CI/CD workflows and protocols for data assets to enable seamless deployment, monitoring, and versioning.
Partner across Data Governance, Platform Engineering, and AI groups to produce transformative data solutions.
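As a rough illustration of the ELT bullet above, here is a minimal Python sketch that stages raw data into Snowflake and builds a simple semantic-layer view using the snowflake-connector-python library. The account, stage, table, and view names are hypothetical placeholders, not this employer's actual objects.

```python
# Minimal ELT sketch with the Snowflake Python connector
# (illustrative; account, schema, stage, and table names are hypothetical).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Load staged files into a landing table, then expose a semantic view on top.
    cur.execute(
        "COPY INTO raw_trades FROM @landing_stage/trades/ "
        "FILE_FORMAT=(TYPE=CSV SKIP_HEADER=1)"
    )
    cur.execute("""
        CREATE OR REPLACE VIEW semantic.daily_trade_volume AS
        SELECT trade_date, symbol, SUM(quantity) AS total_quantity
        FROM raw_trades
        GROUP BY trade_date, symbol
    """)
finally:
    conn.close()
```

In practice the transformation step would typically live in dbt models rather than inline SQL; the sketch only shows the shape of the flow.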
Qualifications:
Bachelor's or Master's in Computer Science, Information Systems, Engineering, or equivalent.
10+ years in data engineering, cloud platform development, or analytics engineering.
Extensive hands-on work designing and tuning data pipelines, semantic layers, and cloud-native data solutions, ideally with tools like Snowflake, dbt, or comparable technologies.
Expert-level SQL and Python skills, plus deep familiarity with data tools such as Spark, Airflow, and cloud services (e.g., Snowflake, major hyperscalers).
Preferred: Experience containerizing data workloads with Docker and Kubernetes.
Track record architecting semantic layers, ETL/ELT flows, and cloud integrations for AI/analytics scenarios.
Knowledge of semantic modeling, data structures (relational/dimensional/semantic), and enabling AI via data products.
Bonus: Background in data mesh designs and automated data access systems.
Skilled in dev tools like Azure DevOps equivalents, Git-based version control, and orchestration platforms like Airflow.
Strong organizational skills, precision, and adaptability in fast-paced settings with tight deadlines.
Proven self-starter who thrives independently and collaboratively, with a commitment to ongoing tech upskilling.
Bonus: Exposure to BI tools (e.g., Tableau, Power BI), though not central to the role.
Familiarity with investment operations systems (e.g., order management or portfolio accounting platforms).
Data Engineer
New York, NY jobs
Our client is seeking a Data Engineer with hands-on experience in Web Scraping technologies to help build and scale a new scraping capability within their Data Engineering team. This role will work directly with Technology, Operations, and Compliance to source, structure, and deliver alternative data from websites, APIs, files, and internal systems. This is a unique opportunity to shape a new service offering and grow into a senior engineering role as the platform evolves.
Responsibilities
Develop scalable Web Scraping solutions using AI-assisted tools, Python frameworks, and modern scraping libraries (a minimal sketch follows this list).
Manage the full lifecycle of scraping requests, including intake, feasibility assessment, site access evaluation, extraction approach, data storage, validation, entitlement, and ongoing monitoring.
Coordinate with Compliance to review Terms of Use, secure approvals, and ensure all scrapes adhere to regulatory and internal policy guidelines.
Build and support AWS-based data pipelines using tools such as Cron, Glue, EventBridge, Lambda, Python ETL, and Redshift.
Normalize and standardize raw, vendor, and internal datasets for consistent consumption across the firm.
Implement data quality checks and monitoring to ensure the reliability, historical continuity, and operational stability of scraped datasets.
Provide operational support, troubleshoot issues, respond to inquiries about scrape behavior or data anomalies, and maintain strong communication with users.
Promote data engineering best practices, including automation, documentation, repeatable workflows, and scalable design patterns.
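For flavor, here is a minimal sketch of the kind of scraper the first bullet describes: a polite requests + BeautifulSoup extractor with a basic structural validation check. The URL, headers, and CSS selectors are placeholders, and any real scrape would run only after the Compliance/Terms-of-Use review described above.

```python
# Minimal polite-scraper sketch (illustrative; URL and CSS selectors are placeholders).
import time
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "research-bot/0.1 (contact: data-team@example.com)"}  # hypothetical

def scrape_table(url: str) -> list[dict]:
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    for tr in soup.select("table#prices tr")[1:]:        # placeholder selector; skip header row
        cells = [td.get_text(strip=True) for td in tr.select("td")]
        if len(cells) == 2:                              # basic structural validation
            rows.append({"symbol": cells[0], "price": cells[1]})
    time.sleep(1.0)                                      # crude rate limiting between calls
    return rows
```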
Required Qualifications
Bachelor's degree in Computer Science, Engineering, Mathematics, or related field.
2-5 years of experience in a similar Data Engineering or Web Scraping role.
Capital markets knowledge with familiarity across asset classes and experience supporting trading systems.
Strong hands-on experience with AWS services (S3, Lambda, EventBridge, Cron, Glue, Redshift).
Proficiency with modern Web Scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright).
Strong Python programming skills and experience with SQL and NoSQL databases.
Familiarity with market data and time series datasets (Bloomberg, Refinitiv) is a plus.
Experience with DevOps/IaC tooling such as Terraform or CloudFormation is desirable.
Data Engineer (Web Scraping technologies)
New York, NY jobs
Title: Data Engineer (Web Scraping technologies)
Duration: FTE/Perm
Salary: $125-190k plus bonus
Responsibilities:
Utilize AI models, code, libraries, or applications to enable a scalable web scraping capability
Manage web scraping requests, including intake, assessment, accessing sites to scrape, utilizing tools to scrape, storage of scraped data, validation, and entitlement to users
Field questions from users about the scrapes and websites
Coordinate with Compliance on approvals and TOU reviews
Build data pipelines on the AWS platform utilizing existing tools like Cron, Glue, EventBridge, Python-based ETL, and AWS Redshift
Normalize/standardize vendor data and firm data for firm-wide consumption
Implement data quality checks to ensure reliability and accuracy of scraped data
Coordinate with internal teams on delivery, access, requests, and support
Promote data engineering best practices
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
2-5 years of experience in a similar role
Prior buy-side experience is strongly preferred (multi-strat/hedge funds)
Capital markets experience is necessary, with good working knowledge of reference data across asset classes and experience with trading systems
AWS cloud experience with common services (S3, Lambda, Cron, EventBridge, etc.)
Experience with web-scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright, etc.)
Strong hands-on skills with NoSQL and SQL databases, programming in Python, data pipeline orchestration tools, and analytics tools
Familiarity with time series data and common market data sources (Bloomberg, Refinitiv, etc.)
Familiarity with modern DevOps practices and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
Strong communication skills to work with stakeholders across technology, investment, and operations teams.
Sr. Azure Data Engineer
New York, NY jobs
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge
We are looking for a candidate who will be responsible for designing, implementing, and managing data solutions on the Azure platform in the financial/banking domain.
Additional Information*
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within New York City, NY is $130k - $140k/year & benefits (see below).
The Role
Responsibilities:
Lead the development and optimization of batch and real-time data pipelines, ensuring scalability, reliability, and performance.
Architect, design, and deploy data integration, streaming, and analytics solutions leveraging Spark, Kafka, and Snowflake.
Proactively and voluntarily support team members and peers in delivering their tasks to ensure end-to-end delivery.
Evaluate technical performance challenges and recommend tuning solutions.
Serve as a hands-on data service engineer to design, develop, and maintain our Reference Data System utilizing modern data technologies including Kafka, Snowflake, and Python.
Requirements:
Proven experience in building and maintaining data pipelines, especially using Kafka, Snowflake, and Python.
Strong expertise in distributed data processing and streaming architectures.
Experience with Snowflake data warehouse platform: data loading, performance tuning, and management.
Proficiency in Python scripting and programming for data manipulation and automation.
Familiarity with Kafka ecosystem (Confluent, Kafka Connect, Kafka Streams).
Knowledge of SQL, data modeling, and ETL/ELT processes.
Understanding of cloud platforms (AWS, Azure, GCP) is a plus.
Domain knowledge in any of the below areas:
Trade Processing, Settlement, Reconciliation, and related back/middle-office functions within financial markets (Equities, Fixed Income, Derivatives, FX, etc.).
Strong understanding of trade lifecycle events, order types, allocation rules, and settlement processes.
Funding Support, Planning & Analysis, Regulatory reporting & Compliance.
Knowledge of regulatory standards (such as Dodd-Frank, EMIR, MiFID II) related to trade reporting and lifecycle management.
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Lead Data Engineer with Banking
New York, NY jobs
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge
We are seeking an experienced Lead Data Engineer to spearhead our data infrastructure initiatives. The ideal candidate will have a strong background in building scalable data pipelines, with hands-on expertise in Kafka, Snowflake, and Python. As a key technical leader, you will design and maintain robust streaming and batch data architectures, optimize data loads in Snowflake, and drive automation and best practices across our data platform.
Additional Information*
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within New York, NY is $135k - $140k/year & benefits (see below).
The Role
Responsibilities:
Design, develop, and maintain reliable, scalable data pipelines leveraging Kafka, Snowflake, and Python (a minimal sketch follows this list).
Lead the implementation of distributed data processing and real-time streaming solutions.
Manage Snowflake data warehouse environments, including data loading, tuning, and optimization for performance and cost-efficiency.
Develop and automate data workflows and transformations using Python scripting.
Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions.
Monitor, troubleshoot, and optimize data pipelines and platform performance.
Ensure data quality, governance, and security standards are upheld.
Guide and mentor junior team members and foster best practices in data engineering.
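To ground the pipeline bullets above, here is a minimal consumer-side sketch of the Kafka-to-warehouse pattern using the confluent-kafka client. The broker address, topic, group id, and batch size are assumptions, and the Snowflake flush is stubbed out for brevity.

```python
# Minimal Kafka-consumer sketch for a streaming pipeline
# (illustrative; broker, topic, and group id are hypothetical).
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # hypothetical
    "group.id": "trades-loader",          # hypothetical
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trades"])            # hypothetical topic

batch = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        batch.append(json.loads(msg.value()))
        if len(batch) >= 500:
            # In a real pipeline this batch would be flushed to Snowflake
            # (e.g., via write_pandas or Snowpipe); printed here for brevity.
            print(f"flushing {len(batch)} records")
            batch.clear()
            consumer.commit()
finally:
    consumer.close()
```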
Requirements:
Proven experience in building and maintaining data pipelines, especially using Kafka, Snowflake, and Python.
Strong expertise in distributed data processing frameworks and streaming architectures.
Hands-on experience with Snowflake data warehouse platform, including data ingestion, performance tuning, and management.
Proficiency in Python for data manipulation, automation, and scripting.
Familiarity with Kafka ecosystem tools such as Confluent, Kafka Connect, and Kafka Streams.
Solid understanding of SQL, data modeling, and ETL/ELT processes.
Knowledge of cloud platforms (AWS, Azure, GCP) is advantageous.
Strong troubleshooting skills and ability to optimize data workflows.
Excellent communication and collaboration skills.
Preferred, but not required:
Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
Experience with containerization (Docker, Kubernetes) is a plus.
Knowledge of data security best practices and GDPR compliance.
Certifications related to cloud platforms or data engineering preferred.
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Data Architect
Albany, NY jobs
For more details, please connect with Afra Aleem at ************ Ext 112 or email her at *******************
Job Title: Data Architect
Duration: 30 Months
Contract
Interview Type: Webcam
Required Skills:
72 months of Data Modeling experience designing underwriting and claims DataMarts and data warehouses in the insurance industry (Workers' Compensation and disability).
72 months of experience developing and maintaining predictive systems written in R or Python using TIBCO Spotfire/TERR.
72 months of experience with TIBCO Spotfire Analyst, Web Player, and Business Author, using IronPython, R, and JavaScript.
72 months of experience developing, documenting, and maintaining end-to-end ETL data pipelines (Information Builders "IBI" Data Migrator, aka TIBCO Data Migrator).
72 months of experience in Oracle PL/SQL development, including triggers, stored procedures, packages, advanced query optimization, and database design and implementation.
72 months of experience in Agile/Scrum project management methodology.
72 months of experience in automated testing, build, and deployment tools (TFS, PowerShell, and Visual Build Pro).
72 months of experience in development tools such as VS Code, Visual Studio, Git, Azure, and TFS.
72 months of experience developing systems for workers' compensation insurance (WC, DBL, PFL, PBM) accounting and financial reporting using the western and eastern methods.
V Group Inc. is a New Jersey-based IT Services and Products company, strategically organized into multiple business units: Public Sector, Enterprise Solutions, Ecommerce, and Digital. Within our Public Sector unit, we specialize in delivering IT Professional Services to Federal, State, and Local governments. We hold multiple contracts in 30+ states across the US, including NY, CA, FL, GA, MD, MI, NC, OH, OR, CO, CT, TN, PA, TX, VA, MN, NM, VT, and WA. If you're considering a career opportunity with V Group or exploring a partnership, I welcome you to reach out to me with any questions about our services and the unique advantages we offer to consultants. And please feel free to share my contact information with others who may benefit from connecting with us.
Website: **************************************
LinkedIn: *****************************************
Facebook: *********************************
Twitter: *********************************
Machine Learning Engineer / Data Scientist / GenAI
New York, NY jobs
NYC NY / Hybrid
12+ Months
Project - Leveraging Llama to extract cybersecurity insights out of unstructured data from their ticketing system.
Must have strong experience with:
Llama
Python
Hadoop
MCP
Machine Learning (ML)
They need a strong developer using Llama and Hadoop (this is where the data sits), with experience with MCP. They have various ways to pull the data out of their tickets but want someone who can come in, make recommendations on the best way to do it, and then get it done. They have tight timelines.
Thanks and Regards!
Lavkesh Dwivedi
************************
Amtex System Inc.
28 Liberty Street, 6th Floor | New York, NY - 10005
************
********************
Software Engineer (Remote)
Chicago, IL jobs
Remote (proximity to Chicago, Nashville or Manhattan would be a big plus)
Regular travel is not required, but travel to the corporate office is needed twice a year.
Our client is looking to add a Software Developer who will be responsible for designing, developing, and maintaining high-quality software solutions that support the Firm's digital platforms. This role ensures the stability, scalability, and performance of all applications and services, while collaborating with cross-functional teams to drive continuous improvement in development practices and operational efficiency.
Responsibilities
Design and implement stable, scalable, and extensible software solutions.
Ensure adherence to secure software development lifecycle (SDLC) best practices and standards.
Drive the design and development of services and applications to meet defined service level agreements (SLAs).
Work closely with end users and stakeholders to gather requirements and iterate on solutions that deliver business value.
Proactively identify and resolve any obstacles affecting operational efficiency and service continuity.
Provide ongoing support for developed applications and services, ensuring timely issue resolution.
Participate in the Firm's change and incident management processes, adhering to established protocols.
Software Development & Architecture
Develop and maintain features for web-enabled applications using C# .NET Core.
Write clean, scalable code with a focus on maintainability and performance.
Implement robust, efficient SQL-based solutions, preferably using MS SQL.
Develop and maintain user interfaces using modern frameworks, preferably Angular or Blazor.
Ensure solutions are designed with an emphasis on security, efficiency, and optimization.
Contribute to continuous integration and continuous delivery (CI/CD) pipelines, automating processes where possible.
Collaboration & Optimization
Collaborate closely with business analysts, quality assurance, and other developers to ensure solutions meet both functional and non-functional requirements.
Foster a culture of positive, open communication across diverse teams, with a focus on collaboration and shared goals.
Engage in regular reviews and feedback sessions to drive continuous improvement in development processes and practices.
Provide mentorship and guidance to junior developers where appropriate, supporting their professional growth.
Professional Conduct
Demonstrates commitment to the firm's core values, including Accountability, Integrity, Excellence, Grit, and Love.
Ensures all activities align with business objectives and project timelines.
Communicates effectively, openly exchanging ideas and listening with consideration.
Maintains a proactive, solution-oriented mindset when addressing challenges.
Takes ownership of responsibilities and holds others accountable for their contributions.
Continuously seeks opportunities to optimize processes, improve performance, and drive innovation.
Qualifications
1-3+ years of expertise in C# .NET Core development
Competence in SQL, preferably MS SQL
Competence in UI work, preferably Angular and/or Blazor
Strong structured problem-solving skills, with a history of using systematic and fact-based processes to improve mission-critical services.
A focus on optimization and efficiency in processes.
Experience working in a financial services firm would be a big plus
Demonstrated expertise in fostering a culture of positive collaboration among cross-functional teams with diverse personalities, skill sets, and levels of experience.
Highly developed communication skills
A sense of urgency and a bias for action.
For all non-bonus, non-commission direct hire positions: The anticipated salary range for this position is ($95,000 - $120,000). Actual salary will be based on a variety of factors including relevant experience, knowledge, skills and other factors permitted by law. A range of medical, dental, vision, retirement, paid time off, and/or other benefits are available.
AI/ML Engineer
New York, NY jobs
We are seeking an experienced Machine Learning Engineer with 7+ years of hands-on expertise in developing and deploying ML models. The ideal candidate will have strong proficiency in Python, deep knowledge of ML algorithms, and experience in building scalable data pipelines. You will work on cutting-edge projects involving NLP, Deep Learning, and Large Language Models (LLMs), while collaborating with cross-functional teams to deliver impactful AI solutions.
Key Responsibilities:
Design, develop, and deploy Machine Learning models using Python.
Build and optimize data pipelines for high-volume, high-dimensional datasets.
Implement REST API integration for ML models to enable seamless consumption.
Perform data preprocessing using tools like Pandas, NumPy, etc. (a minimal sketch follows this list).
Analyze complex datasets to extract insights and improve model performance.
Collaborate with stakeholders to understand business requirements and translate them into ML solutions.
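As a small illustration of the preprocessing bullet above, here is a minimal Pandas/scikit-learn sketch combining imputation, scaling, and one-hot encoding. The column names and values are synthetic placeholders.

```python
# Minimal preprocessing sketch (illustrative; columns and data are synthetic placeholders).
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age": [34, np.nan, 29],
    "income": [72000.0, 58000.0, np.nan],
    "segment": ["a", "b", "a"],
})

# Numeric columns: median imputation followed by standardization.
numeric = Pipeline([("impute", SimpleImputer(strategy="median")),
                    ("scale", StandardScaler())])

preprocess = ColumnTransformer([
    ("num", numeric, ["age", "income"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["segment"]),
])

X = preprocess.fit_transform(df)  # feature matrix ready for model training
```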
Required Skills:
7+ years of hands-on experience in ML model development and deployment.
Strong proficiency in Python and ML frameworks such as TensorFlow or PyTorch.
Solid understanding of ML algorithms and statistical modeling.
Experience in data preprocessing and feature engineering.
Knowledge of REST API integration for ML models.
Preferred Skills:
Experience with NLP and Deep Learning techniques.
Exposure to Transformers and Large Language Models (LLMs).
Familiarity with Cloud ML platforms (Azure AI, AWS SageMaker).
Experience with SQL/NoSQL databases.
Life at Capgemini:
Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
Flexible work
Healthcare including dental, vision, mental health, and well-being programs
Financial well-being programs such as 401(k) and Employee Share Ownership Plan
Paid time off and paid holidays
Paid parental leave
Family building benefits like adoption assistance, surrogacy, and cryopreservation
Social well-being benefits like subsidized back-up child/elder care and tutoring
Mentoring, coaching and learning programs
Employee Resource Groups
Disaster Relief
Disclaimer:
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please get in touch with your recruiting contact.
Click the following link for more information on your rights as an Applicant **************************************************************************
AI ML Engineer
New York, NY jobs
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge
We are seeking a highly experienced and innovative Senior AI/ML Engineer with industry expertise to lead the development of scalable machine learning systems. In this pivotal role, you will architect and implement advanced AI solutions, guide strategic AI initiatives, and collaborate with cross-functional teams to drive innovation. Your technical leadership will be instrumental in shaping our AI roadmap and delivering state-of-the-art models and systems.
Additional Information*
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within New York, NY is $135k - $145k/year & benefits (see below).
The Role
Responsibilities:
Design, develop, and optimize scalable machine learning systems capable of handling large-scale data and complex models.
Develop advanced statistical models to solve complex business problems.
Lead the deployment of ML models into production environments with robustness and efficiency.
Integrate models seamlessly with REST APIs for application integration.
Provide technical guidance and strategic direction for AI initiatives across teams.
Stay abreast of the latest AI/ML research, especially in NLP, deep learning, and large language models.
Mentor junior team members and promote best practices in AI engineering.
Collaborate with data engineers, software developers, and product teams to align AI solutions with business goals.
Requirements:
Proven experience designing scalable ML systems and architectures.
Strong expertise in advanced statistical modeling techniques.
Deep experience in model development and deployment pipelines.
Proficiency in integrating ML models with REST APIs.
Hands-on experience with cloud ML platforms, such as AWS SageMaker or Azure AI.
Preferred, but not required:
In-depth experience with Natural Language Processing (NLP), deep learning, and transformer-based models.
Demonstrated leadership in formulating and executing AI strategies within organizations.
Knowledge of the latest AI frameworks, including Transformers and Large Language Models (LLMs).
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
REMOTE - Senior Boomi Developer
Milwaukee, WI jobs
GlobalSource IT is a provider of both contract and direct-hire recruiting services primarily in the ERP area, including D365.
Our client is a well-known retail organization that has been around for nearly 150 years. They recently implemented Dynamics 365, and a need related to that project is prompting an additional full-time headcount for a Senior Boomi Developer. In addition to fantastic company stability, the organization also has an outstanding benefit and bonus structure. Please see below for additional details and contact Dave at ************************ with any questions. Thanks for reviewing.
JOB SUMMARY
The Senior Integration Engineer will work with IT and Business team members to design, develop, implement, and support a variety of solutions that facilitate the movement of business data or connection of applications both within the environment and with third party systems. This person should be able to lead a project with a team of IT and business users. The person in this role will work closely with other team members to deliver on solution designs and production support. The person in this role may also be assigned one or more specific business applications to perform maintenance and to provide internal technical support to our business users and/or partners.
JOB EXPECTATIONS
• 10-15 years of experience with enterprise integration platforms
• Design and Architect Solutions: Bringing deep knowledge to design stable, reliable, and scalable integration solutions using the Dell Boomi AtomSphere platform and its components (Integration, API Management, MDM, etc.)
• Hands-on Development: Designing, developing, and implementing complex integration processes, workflows, and APIs (REST/SOAP) to connect various applications (on-premises and cloud-based), ERP systems (like Microsoft Dynamics, Oracle EBS, SAP), and other data sources.
• Data Transformation: Proficiently handling various data formats such as XML, JSON, CSV, and database formats, and using Boomi's capabilities and scripting languages (like Groovy or JavaScript) for complex data mapping and transformations.
• Dell Boomi Platform Knowledge: Proficiency in Dell Boomi is crucial. Familiarize yourself with Boomi components such as connectors, processes, maps, and APIs. Understand how to design, build, and deploy integrations using Boomi.
• API Development: Strong knowledge of RESTful and SOAP APIs. You'll create, consume, and manage APIs within Boomi.
• Troubleshooting Skills: Be adept at diagnosing and resolving integration issues. Familiarity with Boomi's debugging tools is valuable.
• Security Awareness: Knowledge of authentication methods, encryption, and secure data transmission.
• Experience and a proven track record of implementing integration projects.
• Extensible Stylesheet Language Transformations (XSLT) experience is a plus.
• Project management experience is a plus.
• Experience with ERP systems within a fast-moving wholesale, retail, and e-commerce environment is highly desirable.
• Experience with Boomi implementation against the Microsoft Dynamics ERP system is a plus.
Software Engineer
New York, NY jobs
Front-leaning full-stack Software Engineer role (React, TypeScript, Node.js, AWS, data at scale)
100% Remote
Compensation: $170K-$200K + 10% bonus
Full-time W-2 Employment with medical benefits
Client: Late stage (10 years old) Adtech startup - 300+ employees, 65 Engineers
Core Qualifications
Minimum of 10 years experience as a Software Engineer
Must have exposure to object-oriented design, analysis, and programming in several of the following languages: JavaScript, TypeScript, Python, Node.js, AngularJS, React/React Native, and Vue, as well as knowledge of API, ORM, cloud (AWS), SOA, SaaS, messaging, stream processing, and SQL data store technologies.
Must be able to evaluate and modify complex database stored procedures, database structures, and have familiarity with containerization and scaling of SaaS platform services.
Must be able to deep-dive into various applications and data stores to produce meaningful insights, profiling and tracing, operational intelligence, customer experience visualizations, and proactive trend analyses.
Can quickly consume and understand business strategy and operating models; can apply gap analysis techniques to create long-term technical product strategy.
Can ensure technical product and social capabilities match business needs and goals.
Can effectively communicate goals, metrics, and value propositions across the Engineering Organization.
Can facilitate design, development, and support of existing and new products between cross-functional business stakeholders.
Assist team members with problem-solving complex use cases and systems; while leading technical change and transformation in parallel.
Must have knowledge around application system services, communication protocols, and standard industry technologies.
Must be passionate about creating solutions, and solving problems - in the right way, at the right time, and for the right reasons.
Must be teachable, give and receive feedback, and demonstrate success in their discipline on a consistent and transparent basis.
Education
Minimum of 10 years of experience in a product, engineering, development, or technical delivery position.
Bachelor of Science Degree in Computer Science or similar
Senior Dotnet Developer
New York, NY jobs
Application Developer
Qualifications and Requirements:
14+ years of professional software development experience.
Expert proficiency in C# and the .NET / .NET Core framework.
3+ years of experience working specifically with HL7 messaging standards (v2), including detailed knowledge of segments like PID, PV1, OBR, ORC, and message types like ORM (Orders) and ORU (Results) (a minimal parsing sketch follows this list).
Demonstrable experience developing and deploying services using ASP.NET Core (Web API, Microservices).
Strong understanding of modern architectural patterns (e.g., Microservices, Event-Driven Architecture).
Proficiency in SQL and experience with SQL Server, including stored procedures and complex query optimization.
Experience with SSIS packages.
Experience with reporting tools such as SSRS, Power BI, or similar platforms.
Familiarity with cloud platforms (preferably Azure, including App Services, Functions, and Service Bus/Event Hub).
Bachelor's degree in computer science or a related field.
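Because HL7 v2 is a pipe-and-caret delimited text format, the segment knowledge listed above can be illustrated in a few lines. The sketch below (in Python for brevity; the role itself is C#/.NET) parses a synthetic ORU-style message and pulls the patient name from PID-5. The sample message is invented purely for illustration.

```python
# Minimal HL7 v2 parsing sketch (illustrative; synthetic ORU-style message, not a real feed).
SAMPLE = (
    "MSH|^~\\&|LAB|HOSP|EMR|HOSP|202401011200||ORU^R01|MSG0001|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F\r"
    "OBR|1||ORD789|CBC^COMPLETE BLOOD COUNT\r"
)

def parse_segments(message: str) -> dict[str, list[str]]:
    # HL7 v2 separates segments with carriage returns and fields with pipes.
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

segs = parse_segments(SAMPLE)
# PID-5 is the patient name field; its components are ^-separated.
family, given = segs["PID"][5].split("^")[:2]
print(family, given)  # DOE JANE
```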
EEOE
Staff Data Scientist, AI Products
Remote
Role Description
How many times do you get the opportunity to be on the ground floor of a big and important mission? What if you could be one of the top contributors defining the mission, guiding our teams, and influencing the direction of Dropbox's AI-first journey? As a Staff Data Scientist for this new division, you will get to do exactly that. You will join a team of top-tier data scientists and become an integral part of the product organization, helping to scale this new business.
Joining on the ground floor of this startup team, you'll partner directly with the Head of Data Science and Product, Engineering and Design leadership to shape the product roadmap, foster a top-tier, data-informed culture, and drive real AI/ML impact and execution along the way!
Responsibilities
Partner with Product Engineers and Data Engineers to build the reliable, efficient, and scalable data foundations, tools, and processes to drive our AI/ML capabilities' long-term growth
Leverage data-driven insights to proactively identify the most impactful opportunities, and directly influence product roadmaps and strategies
Perform exploratory and deep-dive analysis to understand user workflows and engagement patterns on AI features, propose hypotheses, and design & execute experiments with great rigor and efficient data techniques
Translate complex data insights into implications and recommendations for the business via excellent communication skills, both verbal and written
Identify what matters most and prioritize ruthlessly for the area you will own
Contribute to a culture of strong technical ownership, coach junior data scientists on the team, and partner with the Head of Data Science to keep evolving the DS working model and elevating DS impact
Work with cross-functional teams (including Product, Engineering, Design, User Research, and senior executives) to rapidly execute and iterate
Requirements
Bachelor's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or related field
9+ years of experience leveraging data-driven analysis to influence product roadmaps and business decisions, preferably at a tech company
Proven track record of working independently, driving measurable business impact, and proactively engaging with business stakeholders with minimal direction
Proficiency in SQL, Python or other programming/scripting languages
Deep understanding of statistical analysis, experimentation design, and common analytical techniques like regression, decision trees
Ability to provide data insights and recommendations for 0→1 products even when sample sizes are small
Strong verbal and written communication skills
Preferred Qualifications
Experience in startups or building 0→1 products
Expertise in using data to inform AI/ML product development
Background in SaaS product and growth analytics
Compensation
US Zone 1: $219,300-$296,700 USD
US Zone 2: $197,400-$267,000 USD
US Zone 3: $175,400-$237,400 USD
Staff Data Scientist
Remote
Role Description
We're looking for a Staff Data Scientist to partner with product, engineering, and design teams to answer key questions and drive impact in the Core Experience and Artificial Intelligence (AI) areas. This area focuses on improving key parts of the core product by re-envisioning the home experience, the cross-platform experience, and user onboarding, and by building new functionality and launching high-impact initiatives. We solve challenging problems and boost business growth through a deep understanding of user behaviors with applied analytics techniques and business insights. An ideal candidate should have robust knowledge of the consumer lifecycle, behavior analysis, and customer segmentation. We're looking for someone who can bring opinions and strong narrative framing to proactively influence the business.
Responsibilities
Perform analytical deep-dives to analyze problems and opportunities, identify the hypothesis and design & execute experiments
Inform future experimentation design and roadmaps by performing exploratory analysis to understand user engagement behavior and derive insights
Create personalized segmentation strategies leveraging propensity models to enable targeting of offers and experiences based on user attributes
Identify key trends and build automated reporting & executive-facing dashboards to track the progress of acquisition, monetization, and engagement trends.
Identify opportunities, advocate for new solutions, and build momentum cross-functionally to move forward ideas that are grounded in data.
Monitor and analyze a high volume of experiments designed to optimize the product for user experience and revenue & promote best practices for multivariate experiments
Translate complex concepts into implications for the business via excellent communication skills, both verbal and written
Understand what matters most and prioritize ruthlessly
Work with cross-functional teams (including Data Science, Marketing, Product, Engineering, Design, User Research, and senior executives) to rapidly execute and iterate
Requirements
Bachelor's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or related field
8+ years of experience using analytics to drive key business decisions; examples include business/product/marketing analytics, business intelligence, and strategy consulting
Proven track record of being able to work independently and proactively engage with business stakeholders with minimal direction
Significant experience with SQL
Deep understanding of statistical analysis, experimentation design, and common analytical techniques like regression, decision trees
Solid background in running multivariate experiments to optimize a product or revenue flow
Strong verbal and written communication skills
Strong leadership and influence skills
Proficiency in programming/scripting and knowledge of statistical packages like R or Python is a plus
Preferred Qualifications
Product analytics experience at a SaaS company
Master's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or related field
Compensation
US Zone 1: This role is not available in Zone 1
US Zone 2: $197,400-$267,000 USD
US Zone 3: $175,400-$237,400 USD
JD:
• Experience with big data processing and distributed computing systems like Spark.
• Implement ETL pipelines and data transformation processes (a minimal PySpark sketch follows this list).
• Ensure data quality and integrity in all data processing workflows.
• Troubleshoot and resolve issues related to PySpark applications and workflows.
• Understand sources, dependencies, and data flow from converted PySpark code.
• Strong programming skills in Python and SQL.
• Experience with big data technologies like Hadoop, Hive, and Kafka.
• Understanding of data warehousing concepts and relational SQL databases.
• Demonstrate and document code lineage.
• Integrate PySpark code with frameworks such as Ingestion Framework, DataLens, etc.
• Ensure compliance with data security, privacy regulations, and organizational standards.
• Knowledge of CI/CD pipelines and DevOps practices.
• Strong problem-solving and analytical skills.
• Excellent communication and leadership abilities.
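The minimal PySpark sketch referenced in the list above: a small ETL job in which deduplication and null checks stand in for the data-quality requirements. Paths and column names are hypothetical.

```python
# Minimal PySpark ETL sketch (illustrative; paths and column names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/orders.csv")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["order_id"])                     # data-quality: dedupe
       .filter(F.col("amount").isNotNull())              # data-quality: null check
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
)

# Aggregate to a curated daily table and persist as Parquet.
daily = cleaned.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.mode("overwrite").parquet("/data/curated/daily_orders")  # hypothetical sink
```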
Job Description:
1. 3-5 years in data platform engineering.
2. Experience with CI/CD, IaC (Terraform), and containerization with Docker/Kubernetes.
3. Hands-on experience building backend applications such as APIs and services.
4. Proven track record of building scalable data engineering pipelines using Python, SQL, and dbt Core/Cloud.
5. Experience working with MWAA (Airflow) or a similar cloud-based data engineering orchestration tool (a minimal DAG sketch follows this list).
6. Experience working with cloud ecosystems like AWS, Azure, or GCP and modern data tools like Snowflake, Databricks, etc.
7. Strong problem-solving skills, as well as the ability to move in a fast-paced environment, are a plus.
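The minimal DAG sketch referenced in item 5 above, using the Airflow 2.x API; the DAG id, schedule, and task bodies are placeholders.

```python
# Minimal Airflow DAG sketch (illustrative; DAG id, schedule, and callables are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")          # placeholder task body

def transform():
    print("run dbt / SQL transforms")  # placeholder task body

with DAG(
    dag_id="daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # extract runs before transform
```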
Analytics, Data Science and IoT Engineer
Remote
Role
Analytics, Data Science and IoT Engineer
Responsibilities
Understanding the requirement and the ability to relate it to statistical algorithms
Knowing the acceptance criteria and ways to achieve the same
Complete understanding of business processes and data
Performing EDA (exploratory data analysis), cleansing, data preprocessing, data munging, and creating training data sets
Using the right statistical models and other statistical methods
Deploying the statistical model using the technology of the customer's preference
Building data pipelines and machine learning pipelines, with monitoring activities set up for continuous integration, continuous development, and continuous testing
Investigating the statistical model and providing resolution when there are data drift or performance issues
The Role offers
Opportunity to join a global team to do meaningful work that contributes to global strategy and individual development
To re-imagine, redesign, and apply technology to add value to the business and operations