Data Engineer (12+ Years)
Data Engineer Job 31 miles from Lakewood
Responsibilities:
We are looking for an experienced Data Engineer with a strong foundation in data engineering best practices and experience with modern data warehousing and ETL platforms.
The ideal candidate will have at least 5 years of hands-on experience, preferably with Snowflake, SQL Server, AWS, and Matillion ETL, to build, optimize, and maintain data pipelines and data integration solutions.
This role involves working closely with cross-functional teams to support data-driven initiatives across the organization.
Key Responsibilities:
Data Pipeline Development and Maintenance:
Design, develop, and maintain scalable and high-performance ETL pipelines using Matillion ETL to ingest, transform, and load data into Snowflake.
Integrate and manage data from various sources, ensuring data quality, reliability, and performance.
Optimize SQL queries, scripts, and stored procedures in Snowflake and SQL Server to improve efficiency and reduce processing time (see the illustrative sketch after this list).
Data Warehousing and Modeling:
Develop and implement data models within Snowflake to support business intelligence and analytics requirements.
Collaborate with data architects and analysts to design and enhance data warehouse schemas and ensure alignment with reporting and analytics needs.
Collaboration and Requirements Gathering:
Work with business stakeholders to understand data requirements and translate them into technical specifications.
Support data architects and analysts by providing guidance on data ingestion, storage, and retrieval methods.
Platform Monitoring and Optimization:
Monitor data pipeline and ETL performance, identifying bottlenecks and making necessary adjustments to ensure optimal system performance.
Perform troubleshooting and root cause analysis to resolve data issues and prevent future occurrences.
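For illustration, here is a minimal sketch of the kind of incremental-load SQL step a Matillion job might push down to Snowflake, run here via the Snowflake Python connector. It is not taken from the employer's codebase; the account, credentials, warehouse, and table names are all hypothetical placeholders.

```python
# Minimal sketch, assuming hypothetical schema/table names: an incremental
# load that only moves rows newer than the target's high-water mark.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholder
    user="etl_user",          # placeholder
    password="...",           # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute("""
        INSERT INTO ANALYTICS.CORE.ORDERS_FACT
        SELECT o.order_id, o.customer_id, o.order_ts, o.amount
        FROM STAGING.ORDERS_RAW o
        WHERE o.order_ts > (SELECT COALESCE(MAX(order_ts), '1970-01-01')
                            FROM ANALYTICS.CORE.ORDERS_FACT)
    """)
    print(f"Rows loaded: {cur.rowcount}")
finally:
    conn.close()
```

Filtering on the target's high-water mark, rather than reloading the full table, is one common way to keep ETL processing time down as volumes grow.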
Required Qualifications:
Experience: At least 5 years of experience in data engineering, with proficiency in Snowflake, SQL Server, AWS, and ETL tools (preferably Matillion).
Technical Skills: Strong SQL and Python skills with experience in query optimization and performance tuning.
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Soft Skills: Strong analytical, problem-solving, and communication skills.
If you believe you are qualified for this position and are currently in the job market or interested in making a change, please email your resume and contact details to *******************
Big Data Engineer
Data Engineer Job 20 miles from Lakewood
Please consider joining our technology team at Throtle as a Big Data Developer. In this role, you will be part of the team responsible for transforming and maintaining the several billion records that make Throtle's data onboarding solution work. You'll need to be able to evaluate incoming data, perform specialized hygiene, and standardize it for consumption by multiple processes. You'll be working in a fast-paced, high-volume processing environment where quality and attention to detail are paramount.
Duties/Responsibilities
Design and implement data pipelines and transformations using big data technologies such as Spark, Hadoop, and related ecosystems (see the sketch after this list)
Develop and maintain complex SQL queries for data extraction and reporting, and optimize query performance and scalability
Design and develop software components that integrate with our identity graphs, client data, and keying processes
Participate in capacity monitoring and planning
Develop and maintain technical documentation
Ensure data integrity and quality in the development process, including conducting data research and analysis to identify trends and patterns
Work with cross-functional teams to ensure software systems meet business requirements and are scalable for future growth, including developing predictive models and enhancing data quality
Position will require that the individual understands all regulations and laws applicable to their assigned roles and responsibilities. Additionally, the individual will be responsible for the development, implementation, and regular maintenance of policies and procedures that govern the work of assigned roles and responsibilities, including compliance with the security requirements of ePHI.
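As a rough illustration of the hygiene-and-standardization work described above, here is a small PySpark sketch that cleans and deduplicates incoming records. It is not Throtle's actual pipeline; the S3 paths and column names are hypothetical.

```python
# Illustrative sketch only: standardize incoming records, then keep one row
# per (hypothetical) identity key. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("record-hygiene").getOrCreate()

raw = spark.read.parquet("s3://bucket/incoming/records/")  # placeholder path

clean = (
    raw
    # Specialized hygiene: trim, lowercase, and null-out empty emails.
    .withColumn("email", F.lower(F.trim("email")))
    .withColumn("email", F.when(F.col("email") == "", None).otherwise(F.col("email")))
    # Standardize ZIPs: strip non-digits, left-pad to five characters.
    .withColumn("zip", F.lpad(F.regexp_replace("zip", r"[^0-9]", ""), 5, "0"))
    # Deduplicate for downstream consumers.
    .dropDuplicates(["email", "zip"])
)

clean.write.mode("overwrite").parquet("s3://bucket/standardized/records/")
```

In a real onboarding system the matching step would use fuzzy logic rather than exact keys, but the read-standardize-dedupe-write shape is the same.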
Required Skill and Abilities
Experience with large-scale data processing and analytics platforms, including Hadoop, Spark, or related ecosystems
Knowledge of distributed computing concepts and experience with cloud-based infrastructure
Experience with fuzzy logic matching and tools
Experience with AWS infrastructure
Education and Experience
4+ years of experience working with Big Data technologies, including Spark and SQL databases
Proficiency in programming languages such as Scala, Java, Python, Shell or a combination thereof
Expertise in data modeling, database design, and development
Experience in building and maintaining data transformations (ETL) using SQL, Python, Scala, or Java
Ability to analyze, troubleshoot and performance tune queries
Ability to identify problems, and effectively communicate solutions to peers and management
Strong analytical and problem-solving skills to address complex software and data challenges
Ability to work effectively with cross-functional teams and communicate technical designs and solutions to non-technical stakeholders
About Throtle:
Throtle, located in Red Bank, NJ, is a leading identity company trusted by the world's top brands and agencies. At Throtle, we empower brands at scale with true individual-based marketing using a data-centric identity and onboarding approach.
Throtle is a company that truly values its employees and their work-life balance. We offer a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being:
Competitive compensation.
Comprehensive benefits include Medical, Dental, and Vision.
Life insurance.
Long-Term Disability.
A generous PTO program.
A 401k plan supported by a company match.
Half Day Summer Fridays (close at 1 p.m. Memorial Day to Labor Day).
Early Fridays (office closes at 3 p.m.).
Hybrid Schedule (Mondays and Fridays WFH).
The office is closed between Christmas and New Year.
Company-sponsored lunch at least 1x a month.
Professional Development Policy!
And much MORE!
Throtle is an equal-opportunity employer that is committed to diversity and inclusion in the workplace. We prohibit discrimination and harassment of any kind based on race, color, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic as outlined by federal, state, or local laws.
Data Engineer
Data Engineer Job 36 miles from Lakewood
Job Title: Data Engineer
Schedule: Monday-Friday, 8am - 5pm
Reports To: Manager, IT Data Analytics
______________________________________________________________________
JOB SUMMARY
As a Data Engineer at Radwell, you will be a key member of the data engineering team, responsible for designing, building, and maintaining our data environment. You will collaborate closely with fellow data engineers, reporting analysts, and software engineers to ensure our data pipelines are efficient, scalable, and reliable. The Radwell Data Engineer is a forward-thinking innovator who understands how to implement data solutions for next-generation challenges by working directly with the business to understand their data needs. Your work will help optimize activities across multiple functions such as pricing, warehousing, and procurement, driving data-driven decisions and operational efficiency throughout the organization.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Build and maintain advanced data systems that bring together data from disparate sources in order to enable decision makers
Design, develop, and maintain scalable data pipelines and ETL processes using Databricks, Azure Data Factory, SQL, and Python (see the sketch after this list)
Build pipelines and prepare data for use by data scientists, data analysts, and other data systems
Take a solution-oriented problem-solving approach to develop creative solutions for business needs by partnering with business leaders and subject matter experts
Leverage cloud infrastructure and metadata driven frameworks to deliver value and scalability
Duties may be modified from time to time; other duties, tasks, and work may be assigned
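To make the "metadata-driven framework" idea above concrete, here is a minimal hypothetical sketch: pipeline steps are described as rows of metadata, and one generic function executes them. Nothing here is Radwell's code; the `IngestTask` type, paths, and table names are invented for illustration.

```python
# Minimal sketch of a metadata-driven ingestion loop. In practice the task
# list would come from a control table in SQL, not be hard-coded, and run()
# would trigger a real load (e.g., a Databricks job or ADF copy activity).
from dataclasses import dataclass

@dataclass
class IngestTask:
    source_path: str   # where the raw file lands (placeholder)
    target_table: str  # where cleansed data should go (placeholder)
    file_format: str   # e.g. "csv" or "parquet"

TASKS = [
    IngestTask("raw/pricing/2024.csv", "core.pricing", "csv"),
    IngestTask("raw/warehouse/stock.parquet", "core.stock", "parquet"),
]

def run(task: IngestTask) -> None:
    # Placeholder for the real load step.
    print(f"Loading {task.source_path} ({task.file_format}) -> {task.target_table}")

for task in TASKS:
    run(task)
```

The point of the pattern is that adding a new source means adding a metadata row, not writing a new pipeline.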
METRICS
Consistently meet monthly deadlines and objectives as agreed and typically described in monthly reviews or through other project planning efforts
Handle ongoing projects and day-to-day demands that are not identified in formal monthly objectives in a timely and accurate manner
Adherence to related budget and expense targets
Adherence to/achievement of business objectives described in project benefit analyses
QUALIFICATIONS
Bachelor's degree in Computer Science, Engineering, or a related field discipline preferred
Minimum 3 years' experience delivering data engineering solutions on a cloud platform
Minimum 3 years' experience implementing modern designs using at least one cloud-based solution
Minimum 3 years' experience with SQL or NoSQL databases
Minimum 3 years' experience with at least one programming language, with a strong preference towards Python
KNOWLEDGE & SKILLS REQUIRED
Advanced-level proficiency with at least one ETL/data orchestration technology such as Azure Data Factory, SSIS, or Informatica
Experience in cloud-based data warehousing and data lake solutions such as Databricks, Snowflake, or Redshift
Expertise with SQL, database design and data structures (star/snowflake schemas, de/normalized design)
Familiarity with DevOps tools such as git, TFS, CI/CD, Jira
Fundamental understanding of big data, open source, and data streaming concepts
Fundamental understanding of implementation of MLOps best practices
Familiarity with packaged software data extraction from systems such as NetSuite and Prophet 21 ERP, as well as Salesforce CRM.
Experience subsetting data into cube structures for business use in data reporting and analytics.
Ability to think strategically and provide recommendations utilizing traditional and modern architectural components based on business needs
Excellent written and verbal communication skills, along with a strong desire to work in cross-functional teams. Ability to present extremely complex technical information in a business-friendly manner
A passion for staying up to date with the latest trends and advancements in the field
Team player who can coach and be coached as needed. Openness to adapting your approach for the betterment of the team
Strong technical ability and eagerness to learn new technologies and skills
Attitude to thrive in an entrepreneurial, fast-paced environment
EDUCATION & EXPERIENCE
High School Diploma or equivalent preferred
BS in Computer Science, Statistics, Operations Research, or a related technical discipline preferred
Minimum 3 years' experience in an analytics or data engineering role.
PHYSICAL DEMANDS
Continuous sitting and typing for extended periods
Occasional reaching/working overhead, climbing or balancing, stooping and occasional lifting of up to 25 pounds may be required.
WORK SCHEDULE
This is an exempt position which requires a work schedule that will achieve the results and objectives identified by the company. Generally, the schedule for this position will be 8 am-5 pm, Monday through Friday, with one hour for lunch. Nights and weekends may be required based on current project and implementation needs, deadlines, and workload. The employee is expected to come to work on time and adhere to accepted time-off policies.
WORK ENVIRONMENT
The environment is an office setting. Dress attire is casual but professional in an office setting. All employees are required to wear “Radwear” (apparel with company logo) at all times once the initial supply (at company expense) has been received.
Postgres Database Data Modeler - GC/USC Only
Data Engineer Job 35 miles from Lakewood
Our client is looking for a PostgreSQL Database Data Modeler with strong skills in PostgreSQL DDL and DML and stored procedures, along with Python/Unix.
Please submit suitable profiles as per below.
Must Have:
Strong hands-on experience in PostgreSQL; able to design and develop data models in PostgreSQL.
Strong knowledge of PostgreSQL syntax (DDL, DML), experience creating PostgreSQL procedures, and the ability to design and optimize data models.
Knowledge of Python and basic Unix scripting.
Strong communication skills and analytical ability as this position requires working directly with high level management.
Good to Have:
Python API development and experience connecting to PostgreSQL from Python (see the sketch after this list)
Unix shell scripting
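Since the role combines PostgreSQL DDL/DML, procedures, and Python connectivity, here is a minimal sketch tying those together with psycopg2. The connection details, table, and procedure name are placeholders, not the client's objects.

```python
# Hedged sketch: create a small model, upsert a row, and call a stored
# procedure from Python via psycopg2. All names are hypothetical.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="modeler", password="...")  # placeholders
with conn, conn.cursor() as cur:
    # DDL: a minimal dimensional-style table.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS dim_customer (
            customer_id BIGINT PRIMARY KEY,
            full_name   TEXT NOT NULL,
            updated_at  TIMESTAMPTZ DEFAULT now()
        )
    """)
    # DML: upsert one row.
    cur.execute("""
        INSERT INTO dim_customer (customer_id, full_name)
        VALUES (%s, %s)
        ON CONFLICT (customer_id) DO UPDATE SET full_name = EXCLUDED.full_name
    """, (42, "Ada Lovelace"))
    # Call a hypothetical stored procedure, e.g. one that refreshes aggregates.
    cur.execute("CALL refresh_customer_aggregates()")
conn.close()
```

Using the connection as a context manager commits the transaction on success and rolls back on error, which is the usual pattern for short scripts like this.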
Data Architect
Data Engineer Job 38 miles from Lakewood
15+ years of Data Experience
Expertise in Data Strategy & Analytics Leadership
Strong command of Azure Databricks, Synapse, SQL Server, Power BI, Informatica PowerCenter, and IICS
Strong Data Architecture Experience
Data Governance
Data Security
Data Analytics
Business Intelligence Solutions
Data Audit and Compliance
Cross-functional Data Experience
Smart communicator
Power BI Full Stack Data Engineer
Data Engineer Job 31 miles from Lakewood
Technical Skills:
Business Intelligence: Power BI, Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (SSAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Power Apps, SSRS, SSIS, Azure Integration Runtime, DAX, XMLA, Power Query (M Language), Azure Event Hubs, Azure Stream Analytics
Database Technologies: Azure SQL Server, Azure SQL Database, MongoDB
Experience Required:
Power BI Reporting: Design and develop business intelligence dashboards, analytical reports, and data visualizations using Power BI, and provide access to different user groups
Expert knowledge of architecting and developing Power BI solutions: configuring data security, assigning licenses, and sharing reports and dashboards with different user groups in the organization. Expert in writing parameterized queries for generating tabular reports and sub-reports using global variables, expressions, and functions; sorting data; and building dashboards, scorecards, and drill-down and drill-through reports, including defining data sources and subtotals
Design and develop business intelligence dashboards, analytical reports, and data visualizations using Power BI, creating multiple measures with DAX expressions for user groups
Use of Power BI Power Pivot to develop data analysis prototypes, and use of Power View and Power Map to visualize reports.
Experience developing interactive reports and dashboards in Power BI to provide actionable insights and facilitate informed decision-making, including use of Dynamics 365 data.
Develop MDX scripts to create datasets for reporting, including interactive drill-down reports, report models, and dashboard reports.
Experience with the Databricks Lakehouse and dbt for processing data and building models.
Data Engineering: Experience implementing Microsoft BI/Azure BI solutions such as Azure Data Factory, Power BI, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, and SQL Server Reporting Services. Good understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory; created a POC moving data from flat files and SQL Server using U-SQL jobs.
Good knowledge of implementing business rules for data Extraction, Transformation, and Loading (ETL) between homogeneous and heterogeneous systems using Azure Data Factory (ADF).
Experience creating notebooks to move data from raw to stage and then to curated zones using Databricks (see the sketch after this list).
Expert in data warehouse development from inception through implementation and ongoing support, with a strong understanding of BI application design and development principles using normalization and de-normalization techniques
Experience developing complex Azure Analysis Services tabular databases, deploying them in Microsoft Azure, and scheduling cube processing through Azure Automation runbooks.
Extensive experience in developing tabular and multidimensional SSAS cubes, aggregations, KPIs, measures, cube partitioning, and data mining models, and in deploying and processing SSAS objects.
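As a concrete illustration of the raw-to-stage-to-curated notebook pattern mentioned above, here is a short PySpark sketch in the Databricks style. The mount paths, columns, and quality rule are hypothetical placeholders.

```python
# Illustrative only: raw -> stage -> curated zones with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Raw/landing zone: ingest as-is.
raw = spark.read.json("/mnt/datalake/raw/sales/")            # placeholder path

# Stage zone: apply a basic quality gate and stamp the ingest date.
stage = (raw.dropna(subset=["order_id"])
            .withColumn("ingest_date", F.current_date()))
stage.write.mode("append").parquet("/mnt/datalake/stage/sales/")

# Curated zone: business-ready aggregate for reporting (e.g., Power BI).
curated = (stage.groupBy("ingest_date", "region")
                .agg(F.sum("amount").alias("daily_amount")))
curated.write.mode("overwrite").parquet("/mnt/datalake/curated/sales_daily/")
```

Each zone adds guarantees: raw preserves the source, stage enforces quality, and curated exposes modeled data to reporting tools.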
Lead Data Analyst
Data Engineer Job 33 miles from Lakewood
LOCAL CANDIDATES ONLY - THIS ROLE IS ONSITE 4 DAYS/WEEK IN WOODBRIDGE, NJ. NOT OPEN FOR REMOTE.
Our client is seeking a highly technical and communicative Senior Data Analyst with strong testing/QA skills to join a growing analytics and data quality team. This role is not a development position but is deeply technical, focused on back-end data validation, modeling, and testing. You will play a key part in interpreting requirements, directing a small subset of the team, and serving as a bridge between business users and the IT/data warehousing team. The ideal candidate is a strong analytical thinker, an effective communicator, and a hands-on data expert.
Key Skills:
Strong SQL skills - must be able to write complex queries, understand joins and schema structures
Solid experience in technical QA and backend data validation (see the sketch after this list)
Proficiency with data modeling concepts and ability to read technical documentation
Experience working with or querying in Snowflake (preferred)
Past working experience with AWS services, especially S3 buckets
Knowledge of environments including AIX, DB2, Oracle, SQL, Informatica, and DBT
Excellent communication skills
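For a sense of the backend validation work described above, here is a minimal sketch that runs a couple of reconciliation checks between a source and a target table. The engine URL (a Snowflake-style SQLAlchemy URL requiring the snowflake-sqlalchemy package) and the table names are assumptions for illustration.

```python
# Hedged sketch: compare row counts and null keys between src and tgt tables.
import sqlalchemy as sa

engine = sa.create_engine("snowflake://user:pass@account/db/schema")  # placeholder

CHECKS = {
    "row_count_matches":
        "SELECT (SELECT COUNT(*) FROM src.orders) - (SELECT COUNT(*) FROM tgt.orders)",
    "no_null_keys":
        "SELECT COUNT(*) FROM tgt.orders WHERE order_id IS NULL",
}

with engine.connect() as conn:
    for name, sql in CHECKS.items():
        value = conn.execute(sa.text(sql)).scalar()  # each check must return 0
        status = "PASS" if value == 0 else f"FAIL ({value})"
        print(f"{name}: {status}")
```

Keeping checks as named SQL snippets makes it easy to grow the suite and to report pass/fail results back to the warehousing team.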
Lead Data Scientist
Data Engineer Job 32 miles from Lakewood
Technical Lead, Data Science and AI
We seek a dynamic, well-rounded Technical Lead to support our AI/ML initiatives and work directly with our pharmaceutical customers. You'll serve as the primary technical point of contact for clients while coordinating deliverables with our offshore teams. This role requires strong data engineering expertise, advanced SQL skills, and the ability to work independently to deliver results without relying on other teams for some deliverables. You'll implement clinical NLP models using Hugging Face frameworks (BioBERT, ClinicalBERT, spaCy, BERT) while also handling data cleansing, ETL processes, and data quality management. Additionally, expertise in agentic AI, generative AI applications, and LLMOps is critical for this position. The candidate should be a skilled integrator comfortable with no-code AI tools like Replit and Lovable.dev. The ideal candidate can lead the offshore team while also rolling up their sleeves to deliver technical solutions directly when needed.
Key Responsibilities
Technical Implementation
Implement NLP/NER models (BioBERT, ClinicalBERT, BERT, spaCy) for clinical entity extraction (see the sketch after this list)
Design and develop ETL processes and data pipelines for clinical trial documents
Build and deploy agentic AI systems that can perform autonomous tasks with clinical data
Create and optimize RAG (Retrieval Augmented Generation) systems for clinical applications
Implement LLMOps practices for managing, deploying, and monitoring language models
Develop generative AI interfaces for clinical data exploration and analysis
Write complex SQL queries for data extraction, transformation, and analysis
Perform data cleansing, normalization, and quality improvement activities
Build and maintain data processing workflows with minimal dependency on other teams
Implement data validation and quality assurance processes
Independently deliver technical solutions when timelines require direct intervention
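To illustrate the clinical NER work above, here is a minimal sketch using the Hugging Face pipeline API. The model id is a placeholder; in practice you would point at a fine-tuned BioBERT/ClinicalBERT NER checkpoint.

```python
# Hedged sketch: token-classification pipeline for clinical entity extraction.
# "your-org/clinical-ner-model" is a placeholder, not a real checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-org/clinical-ner-model",   # placeholder fine-tuned checkpoint
    aggregation_strategy="simple",         # merge sub-tokens into entity spans
)

text = "Patient was started on 5 mg of apixaban for atrial fibrillation."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

The `aggregation_strategy` setting is what turns WordPiece fragments back into readable entity spans such as drug names and conditions.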
Team Coordination
Coordinate technical tasks with offshore development teams
Document technical specifications and requirements
Provide technical guidance to junior team members
Work with the Product team to translate business needs into technical tasks
Facilitate communication between technical teams and clients
Client Engagement & Delivery
Serve as primary technical point of contact for pharmaceutical clients
Lead technical discussions and presentations with clients
Translate client requirements into clear technical specifications
Coordinate deliverables and ensure timely completion of client projects
Provide regular status updates to clients on technical implementation
Demonstrate technical solutions and new features to clients
Troubleshoot and resolve client-reported issues
Manage client expectations for technical deliverables
Qualifications
Required Skills
Bachelor's degree in Computer Science, Data Science, or a related field; Master's preferred
5+ years' experience in NLP/ML with a focus on clinical or biomedical applications
Strong SQL skills for complex data manipulation and analysis
Experience with Hugging Face transformers (BERT, BioBERT, ClinicalBERT)
Expert Python programming skills with focus on data processing
Experience with clinical data or biomedical terminology
Excellent client communication and presentation skills
Proven ability to coordinate technical deliverables and manage timelines
Experience working as technical liaison with clients
Strong project coordination experience with offshore teams
Technical Knowledge
Agentic AI frameworks and autonomous system design
LLMOps practices including model serving, monitoring, and versioning
Generative AI application development and RAG system implementation
Vector databases and semantic search technologies
Advanced database design, SQL optimization, and query performance tuning
ETL architecture and implementation best practices
Data cleansing techniques and methodologies
Data pipeline development and maintenance
NLP/NER model implementation and fine-tuning
OCR and document processing techniques
Clinical data structures and formats
Domain Knowledge
Specialized knowledge of oncology, immunology and neuroscience clinical trials, treatments, biomarkers, and therapeutic approaches
Familiarity with oncology drug classifications, mechanisms of action, and treatment protocols
Familiarity with pharmaceutical development pipelines
Senior Data Engineer
Data Engineer Job 34 miles from Lakewood
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge
We are looking for an experienced Senior Software Engineer with over 10 years of hands-on development experience specifically in Azure cloud technologies. The ideal candidate should possess expertise in Azure Data Factory (ADF), Databricks, Spark, and SQL, while having a strong foundation in Delta Lake architecture and Big Data solutions. Familiarity with Equities trading data, AI/ML techniques, and Data Mesh principles will be advantageous.
Additional Information
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role, if filled within Iselin, NJ, is $120k - $130k/year plus benefits (see below).
The Role
Responsibilities:
Design, develop, and maintain scalable and efficient data solutions leveraging Azure cloud technologies, ADF, Databricks, Spark, and SQL.
Implement and optimize Delta Lake architecture and other Big Data solutions to enhance performance and scalability (see the sketch after this list).
Work collaboratively with cross-functional teams to comprehend and address business needs related to Equities trading data.
Apply AI/ML methodologies to improve data processing and analytical capabilities.
Participate in the development and implementation of Data Mesh architecture to enhance data accessibility and usability.
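As a small illustration of the Delta Lake work above, here is a sketch of an upsert (merge) into a Delta table with Spark. It is not the client's code; the paths and the `trade_id` key are placeholders, and it assumes the delta-spark package is configured.

```python
# Sketch only: merge newly landed trade records into an existing Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/landing/trades/")       # new data (placeholder)
target = DeltaTable.forPath(spark, "/mnt/delta/trades")    # existing Delta table

(target.alias("t")
       .merge(updates.alias("u"), "t.trade_id = u.trade_id")
       .whenMatchedUpdateAll()     # update rows that already exist
       .whenNotMatchedInsertAll()  # insert genuinely new rows
       .execute())
```

MERGE semantics with ACID guarantees are a core reason Delta Lake is preferred over plain Parquet for mutable datasets like trade records.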
Requirements:
Minimum of 10 years of hands-on development experience in Azure cloud, ADF, Databricks, Spark, and SQL.
Strong understanding of Delta Lake architecture and Big Data solutions.
Familiarity with Equities trading data is essential.
Knowledge of AI/ML practices and Data Mesh concepts is a plus.
Excellent problem-solving skills with the ability to work independently and collaboratively in a team environment.
Strong verbal and written communication skills to interact effectively with stakeholders.
A strong banking/financial domain background is required.
It would be great if you also had:
Familiarity with modern software development practices, tools, and methodologies.
Ability to mentor and guide junior team members, fostering knowledge sharing and skill development.
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Lead Architect - Salesforce/AWS/Data Warehouse
Data Engineer Job 31 miles from Lakewood
We are looking for a highly skilled and experienced Senior Lead Technical Architect with over 10 years of experience to join our team. The successful candidate will be responsible for leading the architecture design and implementation of complex technology solutions, working closely with cross-functional teams to ensure alignment with business objectives and compliance with industry and technical standards. The role requires significant experience managing large-scale technical projects, particularly using tools such as Salesforce, AWS, or Snowflake (Data Warehouse). If the candidate lacks direct experience with one of these tools but has experience with the other two, experience with comparable tools will be considered. This role is critical to our ongoing efforts to modernize and optimize our technical infrastructure.
Responsibilities:
Lead the design, development, and implementation of scalable and reliable technical architectures that meet business requirements.
Oversee multiple projects, ensuring timely delivery and adherence to project scope and budget. Provide technical guidance to project teams.
Utilize significant experience with at least two of the following tools: Salesforce, AWS, Snowflake, or other comparable technologies, to drive project success.
Work closely with stakeholders, including business analysts, developers, and project managers, to gather requirements and translate them into technical specifications.
Develop and maintain a technical roadmap that aligns with the organization's goals and objectives. Propose improvements and innovative solutions.
Implement strategies for optimizing system performance, scalability, and security. Conduct regular reviews and assessments.
Ensure thorough documentation of architecture, design decisions, and system configurations. Maintain adherence to industry standards and best practices.
Provide technical leadership and mentorship to junior architects and development teams. Foster a culture of continuous learning and improvement.
Oversee the integration of various systems and technologies to ensure seamless operation and data flow. Conduct integration testing and validation.
Identify potential risks and develop mitigation strategies to address them proactively. Conduct regular risk assessments.
Qualifications:
* Technical Architecture - Required, 10 Years
* Salesforce - Required, 8 Years
* AWS - Required, 8 Years
* Snowflake (Data Warehouse) - Highly Desired, 5 Years
* Snowflake Comparable Tools (e.g., Google BigQuery, AWS RedShift) - Nice to Have, 8 Years
* Project Management - Required, 8 Years
* System Integration - Required, 8 Years
* Performance Optimization - Required, 8 Years
* Leadership and Mentorship - Required, 8 Years
* Communication and Collaboration - Required, 8 Years
* Risk Management - Required, 8 Years
* Documentation and Compliance - Required, 8 Years
* Cloud Computing - Required, 8 Years
* Problem-Solving - Required, 8 Years
* Agile Methodologies - Required, 8 Years
* DevOps Practices - Required, 8 Years
* Data Security and Compliance - Required, 7 Years
* Education/Public Sector Experience - Desired, 5 Years
Cloud DevOps Engineer
Data Engineer Job 31 miles from Lakewood
The Cloud DevOps Engineer's primary responsibilities will be defining and provisioning infrastructure resources using code, enabling automated and repeatable deployments. This eliminates manual configuration, reduces errors, and ensures consistency across environments. Cloud DevOps Engineers must be proficient in tools like Terraform, CloudFormation, and Azure Resource Manager, across GCP and Azure, to define infrastructure components such as virtual machines, networks, storage, and databases.
Furthermore, they are responsible for automating various aspects of the software development lifecycle, including build processes, testing, and deployments. Automation streamlines workflows, accelerates delivery times, and improves overall efficiency. This often involves scripting using languages like Python, Bash, or PowerShell, and integrating with automation platforms such as Ansible, Chef, or Puppet. A Cloud DevOps Engineer should have a deep understanding of automation principles and the ability to design and implement automated solutions that meet specific business requirements. By leveraging automation, Cloud DevOps Engineers enable organizations to achieve greater agility, scalability, and resilience in their cloud environments.
Candidates need hands-on experience across multi-cloud environments (Azure, GCP) and extensive expertise in DevOps automation, CI/CD pipeline management, infrastructure-as-code (IaC), containerization, and cloud security.
The ideal candidate will be responsible for building, managing, and optimizing DevOps pipelines, driving innovation, and introducing new solutions to streamline our cloud and DevOps processes. This role requires a deep understanding of cloud architecture, automation, microservices, and security best practices to support scalable and resilient cloud-based applications.
Essential Functions
Automated Provisioning - CI/CD automates testing, reducing manual checks. This ensures quicker feedback. Develop and maintain a comprehensive security architecture covering on-premises, cloud, and hybrid environments.
Consistency - ensuring consistent environments across development stages. This reduces errors during deployment. Design security solutions that align with business objectives while mitigating risk.
Real-time Monitoring - Cloud monitoring provides real-time insights into system performance. Version Control - Infrastructure configurations are version-controlled. This enables tracking and reverting changes easily. Ensure Zero Trust principles, network segmentation, and security best practices are enforced across the enterprise.
Proactive Issue Detection - Identify and resolve potential issues. This can be done before they impact users. Architect cloud security strategies, leveraging best practices for Azure and GCP.
Centralized Logging - Centralized logging helps aggregate and analyze logs. Logs are collected from different sources. Lead security investigations, conduct root cause analysis, and document incident response actions.
Configuration Management - Scripting tools manage configurations. They ensure consistency. Provide threat intelligence and recommend proactive security measures to mitigate risk.
Infrastructure Orchestration - Orchestration tools automate infrastructure setup. This streamlines processes. Implement and maintain cloud security controls in Azure and GCP.
Security Policies - Implementing security policies to protect data and systems. Optimize cloud security solutions for web and network protection.
Access Control - Managing access control to limit unauthorized access. This is essential for security. Ensure Active Directory (AD) and IAM policies align with best practices.
Cross-functional Teams - Working with development, operations, and security teams.
Documentation - Creating and maintaining clear documentation for processes. Oversee the continuous best practice is leveraged for data classification policies and enforce data protection controls.
Additional Functions
Cloud Infrastructure & Automation:
Design, build, and manage secure, scalable, and high-availability cloud environments across AWS, Azure, and GCP.
Develop and maintain Infrastructure-as-Code (IaC) solutions using Terraform, CloudFormation, Pulumi, and Ansible (see the sketch after this list).
Implement multi-cloud strategies, hybrid cloud deployments, and cloud networking solutions.
Optimize cloud costs through monitoring, auto-scaling, and resource provisioning techniques.
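For illustration, here is a minimal IaC sketch using Pulumi's Python SDK, one of the tools named above. The resource names are hypothetical; running `pulumi up` in a configured project would provision them in Azure.

```python
# Minimal Pulumi sketch (assumed project setup): a resource group plus a
# storage account, defined as code for repeatable deployments.
import pulumi
from pulumi_azure_native import resources, storage

group = resources.ResourceGroup("devops-rg")  # placeholder name

account = storage.StorageAccount(
    "devopssa",                                # placeholder name
    resource_group_name=group.name,
    sku=storage.SkuArgs(name="Standard_LRS"),  # locally redundant storage
    kind="StorageV2",
)

# Export the account name so other stacks/pipelines can consume it.
pulumi.export("storage_account", account.name)
```

Because the desired state lives in version control, the same definition can be reviewed, diffed, and re-applied across environments instead of being clicked together by hand.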
DevOps & CI/CD Pipeline Management:
Architect, implement, and maintain CI/CD pipelines using tools like Jenkins, GitHub Actions, GitLab CI/CD, Azure DevOps, and AWS CodePipeline.
Automate build, test, deployment, and rollback processes for applications and infrastructure.
Ensure secure DevOps practices, including secrets management, policy-as-code, and automated compliance.
Integrate observability, logging, and monitoring solutions within the pipeline (e.g., ELK, Prometheus, Grafana, Datadog, New Relic).
Containerization & Kubernetes:
Deploy and manage containerized applications using Docker and Kubernetes (EKS, AKS, GKE).
Implement Kubernetes security best practices, monitoring, and autoscaling.
Automate Kubernetes deployments using Helm, Kustomize, and GitOps tools like ArgoCD and Flux.
Security & Compliance in Cloud and DevOps:
Ensure cloud security best practices by implementing IAM, RBAC, security groups, and encryption.
Deploy security tools such as AWS Security Hub, Azure Security Center, Prisma Cloud, and GuardDuty.
Manage identity and access controls for DevOps tools and cloud services.
Conduct regular security assessments and vulnerability management across cloud workloads.
Observability, Performance, & Reliability Engineering:
Implement SRE (Site Reliability Engineering) principles to improve system reliability and incident response.
Develop and integrate logging and monitoring solutions using Prometheus, Grafana, Datadog, Splunk, or ELK.
Build automated alerting and response mechanisms for cloud and DevOps environments.
Implement chaos engineering to improve system resilience and fault tolerance.
Innovation & Continuous Improvement:
Continuously evaluate new DevOps and cloud technologies to improve efficiency and scalability.
Automate repetitive tasks and enhance self-service capabilities for development teams.
Participate in architecture discussions, design reviews, and proof-of-concept (PoC) implementations.
Lead DevOps culture adoption across teams by driving best practices and training initiatives. Conduct threat modeling, risk assessments, and security reviews for applications, infrastructure, and networks.
Qualifications
7-10 years of hands-on experience in Cloud & DevOps Engineering roles.
Expertise in multi-cloud environments (Azure and GCP).
Deep understanding of DevOps methodologies, CI/CD pipeline design, and automation.
Strong experience with Terraform, Ansible, CloudFormation, and Kubernetes.
Proficiency in Jenkins, GitHub Actions, GitLab CI/CD, and Azure DevOps.
Hands-on experience with Docker, Kubernetes (EKS, AKS, GKE), Helm, and GitOps (ArgoCD, Flux).
Knowledge of cloud security, IAM, RBAC, and compliance frameworks (SOC2, NIST, ISO 27001).
Proficiency in scripting and automation using Python, Bash, PowerShell, or Go.
Experience with observability tools like Prometheus, Grafana, ELK, and Datadog.
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
User Interface Engineer
Data Engineer Job 32 miles from Lakewood
Front End UI Engineer
Onsite/Hybrid: Hybrid onsite days Tue, Wed, Thurs
12 Months Contract
Pay Rate: $95.49/hr on W2 (without benefits)
Work Schedule: Normal business hours
Top 3 Skills/Must Haves:
Proficiency in HTML, CSS, and JavaScript.
Experience with front-end frameworks and libraries (e.g., React, Angular, Vue.js).
Understanding of web design principles and best practices.
Knowledge of version control systems (e.g., Git).
Purpose and Scope of the Position:
The Front-End UI Engineer will be responsible for developing the front-end application. The front-end application will fetch all necessary data from our REST API server. The front-end UI should be based on a modern software stack that is maintainable, testable, and well documented. All code written will be maintained in a GitHub repo and deployed using a GitHub CI/CD deployment process.
User Interface (UI) Development:
Translating design mockups and wireframes into functional, interactive user interfaces using HTML, CSS, and JavaScript.
Building reusable components and libraries for future use.
Ensuring cross-browser compatibility and responsiveness across different devices.
Optimizing web pages for speed, performance, and scalability.
Debugging and troubleshooting front-end issues.
Integrating APIs and back-end systems.
Collaboration:
Working closely with UX/UI designers, back-end developers, and other stakeholders to ensure a cohesive and functional user experience.
Participating in design and code reviews.
Technical Skills:
Proficiency in HTML, CSS, and JavaScript.
Experience with front-end frameworks and libraries (e.g., React, Angular, Vue.js).
Understanding of web design principles and best practices.
Knowledge of version control systems (e.g., Git).
Ability to write clean, maintainable, and well-documented code.
Other Skills:
Strong problem-solving and analytical skills.
Excellent communication and interpersonal skills.
Ability to stay up-to-date with the latest front-end technologies and trends.
Education and Experience:
Bachelor's degree in Computer Science, Information Technology, or related field or equivalent professional experience.
No exposure to hazards or disagreeable conditions
Travel required: No
Python Developer/ Application Support Engineer - W2 only
Data Engineer Job 31 miles from Lakewood
Job Title: Python Developer / Application Support Engineer
Job ID # 81649
Rate type: W2 only
Top Skills:
• Proficiency in programming languages such as Python, SQL, R, and JavaScript.
• Experience developing web applications in frameworks like Streamlit, Shiny, Vue, React, etc.
• Proven experience with core AWS services, including but not limited to EC2, S3, and RDS.
• BioPharma experience preferred
Note: The interview process for this role will require a real-time coding assignment.
Position Summary:
We are seeking a dedicated and skilled LLM Application Support Engineer to join our technical team responsible for providing L1, L2, and L3 specialized application support for Large Language Model (LLM)-based applications. The successful candidate will also be responsible for implementing automated testing frameworks for both applications and LLMs specifically. This role requires a solid understanding of LLMs, application support, and automated testing methodologies.
Responsibilities include, but are not limited to, the following:
Application Support:
• Provide L1, L2, and L3 support for LLM-based applications, ensuring timely resolution of issues.
• Ensure the scalability and performance of LLM-based applications to handle large datasets and complex queries.
• Identify opportunities for process improvements and implement innovative solutions to enhance the efficiency and effectiveness of LLM deployments.
• Monitor application performance and proactively identify potential issues.
• Troubleshoot and resolve application issues, escalating to higher levels of support when necessary.
• Collaborate with development teams to address and resolve complex technical issues.
• Maintain detailed documentation of support activities, including issue resolution steps and best practices.
• Ensure compliance with BMS's data security and regulatory requirements during support activities.
• Conduct root cause analysis for recurring issues and implement preventive measures.
Automated Testing Frameworks:
• Develop and implement automated testing frameworks for LLM-based applications.
• Design and execute automated tests to validate the functionality, performance, and reliability of applications and LLMs.
• Create and maintain test scripts for unit tests, integration tests, and end-to-end tests (see the sketch after this list).
• Collaborate with development teams to integrate automated testing into the CI/CD pipeline.
• Monitor and analyze test results, identifying and reporting defects.
• Continuously improve automated testing processes and frameworks to enhance test coverage and efficiency.
• Ensure compliance with BMS's data security and regulatory requirements during testing activities.
• Stay updated with the latest trends and best practices in automated testing and LLM technologies.
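As a small illustration of automated testing for an LLM-backed application, here is a pytest-style sketch. The `summarize_ticket` function is a hypothetical stand-in for an application function; the LLM call is injected and faked so the tests are deterministic and run offline.

```python
# Hedged sketch: deterministic tests for a hypothetical LLM-backed function.
# Runs with `pytest`; no network or model access is required.

def summarize_ticket(text: str, llm=None) -> str:
    """Toy stand-in for an LLM-backed app function under test."""
    llm = llm or (lambda prompt: "SUMMARY: " + prompt[:40])  # default fake
    return llm(f"Summarize: {text}")

def test_summary_is_prefixed():
    # Inject a fake LLM so the assertion never depends on model output.
    fake_llm = lambda prompt: "SUMMARY: user cannot log in"
    result = summarize_ticket("User cannot log in.", llm=fake_llm)
    assert result.startswith("SUMMARY:")

def test_empty_input_still_returns_string():
    assert isinstance(summarize_ticket(""), str)
```

Separating the prompt/response contract (testable deterministically) from live model evaluation is a common way to get LLM apps into a CI/CD pipeline.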
Basic Qualifications:
• Bachelor's Degree with 2-4 years of academic/industry experience in application support or development.
• Preference for candidates with experience in machine learning or LLM-based applications.
Preferred Qualifications:
Technical:
• Proven experience with core AWS services, including but not limited to EC2, S3, and RDS.
• Demonstrated proficiency with current software engineering methodologies, such as Agile SDLC approaches, distributed source code control, project management, issue tracking, and CI/CD tools and processes.
• Proficiency in programming languages such as Python, SQL, R, and JavaScript.
• Strong understanding of machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn).
• Solid understanding of container strategies such as Docker and ECS.
• Excellent skills and deep knowledge of databases such as Postgres, Elasticsearch, and Redshift, including distributed database design, SQL vs. NoSQL, and database optimizations.
• Experience developing web applications in frameworks like Streamlit, Shiny, Vue, React, etc.
• Proficiency with predictive modeling approaches and/or experience working with large language models.
• Experience with testing frameworks and automated testing tools (e.g., pytest, Selenium, JUnit).
Senior Fixed Income Data and Analytics Developer
Data Engineer Job 31 miles from Lakewood
Our client is a $151 billion global asset management firm with expertise across fixed income markets, including municipal bonds, high yield bonds, investment grade bonds, structured credit, and emerging markets debt. A boutique-styled company with multiple offices in the USA plus London and Dublin. Ninety years of dedication to clients and a commitment to employees.
We are seeking a motivated and detail-oriented Senior Developer with a strong background in fixed income analytics, such as portfolio characteristics and FI benchmarks, and recent asset management experience. Proficiency in SQL and database design is a must, along with a basic ability to develop scripts in Python. You will work directly with senior stakeholders, including portfolio managers and traders, and join a new Enterprise Data team, building from scratch and migrating legacy data.
Salary: $200K - $220K plus bonus.
Responsibilities:
Develop and maintain a new Snowflake data warehouse focused on buy side investment data domains including security master, holdings, portfolios, transactions, benchmarks.
Transition and continue to develop a Fixed Income investments datastore focused on portfolio characteristics and breakdowns, aggregate performance and attribution data, and portfolio vs. benchmark comparisons used by the investments and client service departments (see the sketch after this list).
Develop and maintain data scripts focused on data capture, transformation and extraction. Develop APIs to expose data.
Understand, analyze and develop data structures to fully support the security and portfolio level data captured or calculated by our systems and ensure it is made available to Investments Technology or direct end users.
Help ensure data accuracy, consistency, and availability across enterprise data platforms.
The client runs a lean environment with high visibility and ownership where your expertise is valued and rewarded.
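For illustration, here is a small sketch of the portfolio-vs-benchmark comparison described above, done with pandas. The sector weights are made up; real inputs would come from the Snowflake warehouse.

```python
# Illustrative sketch: active weights per sector (portfolio minus benchmark).
import pandas as pd

portfolio = pd.DataFrame({"sector": ["Muni", "HY", "IG"],
                          "weight": [0.50, 0.20, 0.30]})  # made-up data
benchmark = pd.DataFrame({"sector": ["Muni", "HY", "IG"],
                          "weight": [0.40, 0.25, 0.35]})  # made-up data

compare = portfolio.merge(benchmark, on="sector", suffixes=("_port", "_bench"))
compare["active_weight"] = compare["weight_port"] - compare["weight_bench"]
print(compare.sort_values("active_weight", ascending=False))
```

The same join-and-difference shape generalizes to duration, spread, and other fixed income characteristics once they are modeled in the warehouse.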
Required Qualifications:
This hybrid position requires 3 days a week on-site in the Princeton office.
Candidates for this position must be authorized to work for any employer in the US without visa transfer or sponsorship.
If you meet the required qualifications and are interested in this role, please apply today.
The Solomon Page Distinction
Our teams, comprised of subject matter experts, develop an interest in your preferences and goals and we act as an advisor for your career advancement. Solomon Page has an extensive network of established clients which allows us to present opportunities that are well-suited to your respective goals and needs - this specialized approach sets us apart in the industries we serve.
About Solomon Page
Founded in 1990, Solomon Page is a specialty niche provider of staffing and executive search solutions across a wide array of functions and industries. The success of Solomon Page reflects an organic growth strategy supported by a highly entrepreneurial culture. Acting as a strategic partner to our clients and candidates, we focus on providing customized solutions and building long-term relationships based on trust, respect, and the consistent delivery of excellent results. For more information and additional opportunities, visit: solomonpage.com and connect with us on Facebook, and LinkedIn.
Opportunity Awaits.
Lead Architect - Salesforce/AWS/Data Warehouse
Data Engineer Job 31 miles from Lakewood
Role : Lead Architect - Salesforce/AWS/Data Warehouse
We are looking for a highly skilled and experienced Senior Lead Technical Architect with 12+ years of experience to join our team. The successful candidate will be responsible for leading the architecture design and implementation of complex technology solutions, working closely with cross-functional teams to ensure alignment with business objectives and compliance with industry and technical standards. The role requires significant experience managing large-scale technical projects, particularly using tools such as Salesforce, AWS, or Snowflake (Data Warehouse). If the candidate lacks direct experience with one of these tools but has experience with the other two, experience with comparable tools will be considered. This role is critical to our ongoing efforts to modernize and optimize our technical infrastructure.
Responsibilities:
Lead the design, development, and implementation of scalable and reliable technical architectures that meet business requirements.
Oversee multiple projects, ensuring timely delivery and adherence to project scope and budget. Provide technical guidance to project teams.
Utilize significant experience with at least two of the following tools: Salesforce, AWS, Snowflake, or other comparable technologies, to drive project success.
Work closely with stakeholders, including business analysts, developers, and project managers, to gather requirements and translate them into technical specifications.
Develop and maintain a technical roadmap that aligns with the organization's goals and objectives. Propose improvements and innovative solutions.
Implement strategies for optimizing system performance, scalability, and security. Conduct regular reviews and assessments.
Ensure thorough documentation of architecture, design decisions, and system configurations. Maintain adherence to industry standards and best practices.
Provide technical leadership and mentorship to junior architects and development teams. Foster a culture of continuous learning and improvement.
Oversee the integration of various systems and technologies to ensure seamless operation and data flow. Conduct integration testing and validation.
Identify potential risks and develop mitigation strategies to address them proactively. Conduct regular risk assessments.
Qualifications:
- Bachelor's or master's degree in computer science, information technology, or a related field.
- 12+ years of overall technical architect experience
- 8+ years of proven experience as an integration architect with a focus on Salesforce, Snowflake (or a similar cloud data warehouse), AWS, ETL tools, and Business Rules Engines.
- Strong knowledge of integration patterns, API design, and data integration strategies.
- Certification in relevant technologies (e.g., Salesforce, Snowflake, AWS) is a plus.
- Proficiency in programming languages (e.g., Python), database systems, and middleware technologies.
- Experience with cloud-based integration platforms and services.
- Excellent problem-solving and analytical skills.
- Exceptional communication and interpersonal skills.
- Strong project management and leadership abilities.
- A deep understanding of data governance, security, and compliance.
If I missed your call, please drop me an email.
Thank you,
Harish
Accounts Manager/Talent Acquisition
Astir IT Solutions, Inc - An E-Verified Company
Email:*******************
Direct : ***********788
50 Cragwood Rd. Suite # 219, South Plainfield, NJ 07080
***************
Java Software Engineer
Data Engineer Job 20 miles from Lakewood
BeaconFire is based in Central NJ, specializing in Software Development, Web Development, and Business Intelligence; we are looking for candidates with a strong background in Software Development.
Responsibilities:
● Develop software and web applications using Java 8/J2EE/Java EE (and higher), React.js, Angular 2+, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools;
● Write scalable, secure, maintainable code that powers our clients' platforms;
● Create, deploy, and maintain automated system tests;
● Work with Testers to understand defects opened and resolve them in a timely manner;
● Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review;
● Collaborate effectively with other team members to accomplish shared user story and sprint goals;
Basic Qualifications:
● Experience in a programming language such as JavaScript or similar (e.g., Java, Python, C, C++, C#, etc.) and an understanding of the software development life cycle;
● Basic programming skills using object-oriented programming (OOP) languages, with in-depth knowledge of common APIs and data structures like Collections, Maps, Lists, Sets, etc.;
● Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query language skills
Preferred Qualifications:
● Master's Degree in Computer Science (CS)
● 0-1 year of practical experience in Java coding
● Experience using Spring, Maven and Angular frameworks, HTML, CSS
● Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat, etc.); knowledge of JSP, J2EE, and JDBC
Compensation: $65,000.00 to $80,000.00/year
BeaconFire is an e-verified company. Work visa sponsorship is available.
Lead Software Engineer
Data Engineer Job 20 miles from Lakewood
Job Title: Lead Engineer (Ruby, React, GraphQL, PostgreSQL)
FTE Role
Job details
About the Role
We are looking for a Lead Engineer who will play a hybrid role between an individual contributor and a team manager. You will be responsible for leading a team of 4-5 engineers, while also contributing hands-on to the development of our platform. Your expertise in Ruby on Rails, React, GraphQL, and PostgreSQL will be crucial in driving our technical vision and ensuring scalable and high-performing applications.
Responsibilities
Lead, mentor, and manage a team of 4-5 engineers, providing technical guidance, code reviews, and career development support.
Design, develop, and maintain scalable backend services using Ruby on Rails and GraphQL APIs.
Build and enhance front-end applications with React for a seamless user experience.
Optimize database performance and scalability using PostgreSQL.
Collaborate with cross-functional teams including product managers, designers, and other engineers to deliver high-quality features.
Ensure best practices in software development, security, and system architecture.
Participate in hiring processes to grow and strengthen the engineering team.
Requirements
5+ years of experience in software development, with a strong background in Ruby on Rails and React.
Hands-on experience with GraphQL APIs and PostgreSQL.
2+ years of experience leading or mentoring a team of engineers.
Strong understanding of software architecture, cloud platforms, and modern development methodologies.
Experience with CI/CD, testing frameworks, and version control (Git).
Ability to balance technical leadership with hands-on development.
Excellent problem-solving and communication skills.
Nice to Have
Experience with DevOps, AWS, or containerization (Docker/Kubernetes).
Knowledge of frontend state management libraries like Redux.
Previous experience in a fast-paced startup environment.
Why Join Us?
Work on a cutting-edge tech stack with a talented and passionate team.
Competitive salary, equity options, and excellent benefits.
Flexible work environment with remote-friendly options.
Opportunities for leadership growth and career advancement.
Release Engineer
Data Engineer Job 31 miles from Lakewood
We are looking for a Release Engineer.
Responsible for the entire lifecycle of release quality and triage, including deployment, problem-solving, issue mitigation, and developing tools to optimize software release operations
Experience with log analysis and monitoring tools
Strong in C++ or Embedded Platforms
Networking domain Background
Software Engineering Lead
Data Engineer Job 33 miles from Lakewood
Our client, a leading wealth management firm dedicated to delivering cutting-edge digital solutions is seeking a Software Engineering Lead to drive the development of innovative, scalable, and secure applications. Your expertise in full-stack development, cloud infrastructure, and automation will guide a high-performing team in building transformative digital products.
Key Responsibilities:
Technical Leadership: Lead the design, development, and deployment of high-quality software using React, AWS, TypeScript, JavaScript, and Python. Drive architecture decisions for scalability and performance.
Cloud Development: Leverage AWS services (Lambda, DynamoDB, Redshift) to optimize application performance and cost-efficiency while ensuring security best practices (see the sketch after this list).
API Development & Integration: Oversee API design and management using WSO2, REST, and GraphQL, ensuring performance and scalability.
CI/CD & Automation: Implement CI/CD pipelines with tools like BitBucket, Docker, and Terraform to streamline development and optimize releases.
Team Leadership & Mentorship: Lead and mentor engineers, fostering a culture of collaboration and technical excellence.
Collaboration & Communication: Work cross-functionally with Business Solution Engineers, QA, and stakeholders to translate business needs into technical solutions.
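As a small illustration of the serverless stack above, here is a sketch of an AWS Lambda handler reading a record from DynamoDB via boto3. It is assumed, not the client's code: the table name, key schema, and API Gateway-style event shape are placeholders.

```python
# Hedged sketch: Lambda handler fetching one item from a hypothetical
# DynamoDB table keyed by "portfolio_id".
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("client-portfolios")  # placeholder table name

def handler(event, context):
    # Assumes an API Gateway proxy event with a path parameter named "id".
    portfolio_id = event["pathParameters"]["id"]
    item = table.get_item(Key={"portfolio_id": portfolio_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```

Keeping handlers thin like this, with data access behind boto3 and responses as plain JSON, makes the function easy to unit test and cheap to run.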
Qualifications:
Bachelor's degree in Computer Science or related field.
10+ years of software engineering experience.
Expertise in React for building responsive applications.
Strong AWS experience, including serverless computing and database management.
Advanced proficiency in TypeScript, JavaScript, and Python.
Experience designing and managing APIs (WSO2, REST, GraphQL).
Familiarity with Agile/Scrum and modern DevOps practices.
Proven leadership experience mentoring engineers.
Cloud certifications preferred.
Knowledge of wealth management disciplines (portfolio management, trading, client advisory) is a plus.
This position requires 3 days a week in the office.
Software Engineer(W2)
Data Engineer Job 31 miles from Lakewood
We Are Hiring - Multiple Positions for OPT Candidates Only!
Are you an OPT candidate with 4+ years of professional experience in India and 6+ months of U.S. internship experience? We are actively seeking motivated and skilled professionals to join our team in a variety of exciting roles.
Minimum 5+ years of total IT experience, with a proven track record of delivering DevOps solutions in large-scale, complex environments
Prior experience with top-tier multinational companies in India, and hands-on exposure to U.S.-based projects
Strong expertise in designing, implementing, and managing DevOps practices across hybrid and cloud-native infrastructures
Education: Completed Bachelor's degree prior to May 2018, preferably in Computer Science, Information Technology, or a related field
Technical Proficiency:
Programming:
Java, JavaScript, React, Angular, .NET Development, Android
☁️ Cloud Platforms:
AWS: EC2, S3, Lambda, IAM, VPC, CloudFormation, CloudWatch, Route 53, RDS, EKS, ECS
🔁 CI/CD Tools:
Jenkins, GitLab CI, AWS CodePipeline, GitHub Actions
📦 Containerization & Orchestration:
Docker, Kubernetes, Helm
📜 Infrastructure as Code (IaC):
Terraform, AWS CloudFormation
💻 Scripting Languages:
Shell, Bash, Python, Groovy
🔐 Security & Access Management:
IAM roles and policies, AWS KMS, Secrets Manager, vulnerability management tools
🧪 Monitoring & Logging:
Prometheus, Grafana, ELK Stack, AWS CloudWatch
🛠 Build Tools:
Maven, Gradle, NPM
🔄 Automation Tools:
Ansible, Chef, or Puppet
🌐 Networking Knowledge:
VPC, VPN, Subnetting, Load Balancers, Route Tables
🖥️ OS & Version Control:
Strong background in Linux/Unix systems, Git, Bitbucket
Preferred Qualifications:
AWS Certifications such as AWS Certified DevOps Engineer - Professional or AWS Solutions Architect - Associate
Bachelor's degree in a technical field (completed before 2018)
Hands-on experience in production support and large-scale deployments
Thanks & Regards
Vasu
Baanyan Software Services Inc
100 Metroplex Drive, Suite 100, 1st Floor, Edison, NJ. 08817
Phone: ************ Extn: 207 | Direct: ************
Email: **************** | ***************
An E-Verified Company