Senior Solutions Architect, Gen AI, Startups, AGS North America Startups
Do you like startups? Are you excited about helping Gen AI startups? Are you interested in the intersection of cloud computing, generative AI, and disruptive innovation? Yes? We have a role you might find interesting. Startups are the large enterprises of the future. These young companies are founded by ambitious people who have a desire to build something meaningful and challenge the status quo. They seek to address underserved customers, or to challenge incumbents. They usually operate in an environment of scarcity: whether that's capital, engineering resources, or experience. This is where you come in.
We are looking for technical builders who love the idea of working with early stage startups to help them as they grow. In this role, you'll work directly with a variety of interesting customers (including diagnostics startups) and help them make the best (and sometimes the most pragmatic) technical decisions along the way. You'll have a chance to build enduring relationships with these companies and establish yourself as a trusted advisor.
As a member of the Generative AI Startups team, you will work directly with customers to help them successfully leverage AWS technology to develop, train, tune, and deploy the next generation of generative AI foundation models at scale.
As well as spending time working directly with customers, you'll also get plenty of time to learn new technologies and keep your skills fresh. We have 200+ services across a range of different categories and it's important that we can help startups take advantage of the right ones. You'll also play an important role as an advocate with our product teams to make sure we are building the right products and features for the startups you work with. And for the customers you don't get to work with on a 1:1 basis, you'll share knowledge more broadly by working on technical content and presenting at events.
Key job responsibilities
-Help a diverse range of generative AI-focused startups to adopt the right architecture at each part of their lifecycle
-Support startups in architecting scalable, reliable and secure solutions
-Support adoption of a broad range of AWS services to deliver business value and accelerate growth
-Support the evolution and roadmap of the AWS platform and services, connecting our engineering teams with our customers for feedback
-Establish and build technical relationships within the startup ecosystem, including accelerators, incubators and VCs
-Develop startup-specific technical content, such as blog posts, sample code and solutions, to help customers solve technical problems and reduce time-to-market
A day in the life
Startups are the large enterprises of the future. These young companies are founded by ambitious people who have a desire to build something meaningful and to challenge the status quo, whether by addressing underserved customers or challenging incumbents. They usually operate in an environment of scarcity: whether that's capital, engineering resources, or experience. This is where you come in.
The AWS Startup Solutions Architecture team is dedicated to working with startup companies as they build their businesses. We're here to help them deploy the best, most scalable, most secure, cost-effective, and easy-to-operate architectures.
About the team
Diverse Experiences
Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
Why AWS
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why flexible work hours and arrangements are part of our culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (diversity) conferences, inspire us to never stop embracing our uniqueness.
Mentorship and Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
BASIC QUALIFICATIONS
- 8+ years of experience in specific technology domain areas (e.g., software development, cloud computing, systems engineering, infrastructure, security, networking, data & analytics)
- 3+ years of experience in design, implementation, or consulting for applications and infrastructure
PREFERRED QUALIFICATIONS
- 5+ years of infrastructure architecture, database architecture, and networking experience
- Experience working with end user or developer communities
Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit ********************************************************* for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Our compensation reflects the cost of labor across several US geographic markets. The base pay for this position ranges from $138,200/year in our lowest geographic market up to $239,000/year in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Amazon is a total compensation company. Dependent on the position offered, equity, sign-on payments, and other forms of compensation may be provided as part of a total compensation package, in addition to a full range of medical, financial, and/or other benefits. For more information, please visit ******************************************************** This position will remain posted until filled. Applicants should apply via our internal or external career site.
Principal Architect - Gen AI & Agentic Systems (Hybrid)
Minneapolis, MN
Principal Architect - Gen AI & Agentic Systems
Employment Type: Full-Time
About the role
As a Gen AI and Agentic AI Architect, you will lead the design and deployment of scalable AI ecosystems for Cognizant's strategic clients. You'll drive AI strategy, build modular platforms, and deliver industry-specific solutions that transform enterprise operations.
In this role, you will:
Architect cloud-native AI platforms using LLMs, SLMs, and multi-agent orchestration.
Advise Fortune 500 clients on AI strategy and transformation.
Deliver verticalized AI use cases across industries.
Lead model development, fine-tuning, and optimization.
Establish MLOps/LLMOps pipelines and governance frameworks.
Build and mentor AI teams and practices.
Co-innovate with hyperscalers, startups, and ISVs.
Contribute to thought leadership through publications and forums.
Work model
This is a hybrid position requiring 2 to 3 days/week in a Cognizant or client office in Phoenix, AZ or Minneapolis, MN. We support flexible work arrangements and a healthy work-life balance through our wellbeing programs.
What you need to have to be considered
15+ years in IT and architecture, including hands-on engineering experience.
5+ years in AI/ML, with 1+ year in Generative & Agentic AI.
Expertise in model training (SFT, RLHF, LoRA), RAG, and evaluation; a brief LoRA sketch follows this list.
Certifications in at least two cloud platforms (AWS, Azure, GCP).
Strong background in MLOps/LLMOps and AI governance.
Experience advising CxOs and leading strategic AI engagements.
Proven leadership in building cross-functional AI teams.
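As a rough illustration of the parameter-efficient fine-tuning called out above (LoRA), the sketch below attaches a LoRA adapter to a small open model using the Hugging Face peft library. The base model, rank, and target modules are assumptions for the example, not a configuration prescribed by this role.

```python
# Hedged sketch: attaching a LoRA adapter with the Hugging Face peft library.
# The base model ("gpt2") and hyperparameters are illustrative assumptions only.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model_name = "gpt2"  # stand-in small model; a real engagement would pick a client-approved LLM/SLM
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

# LoRA trains small low-rank update matrices instead of all base weights,
# which keeps per-client fine-tuning comparatively cheap.
lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling applied to the update
    target_modules=["c_attn"],  # attention projection in GPT-2; module names vary by architecture
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports how few parameters are actually trainable
```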
These will help you stand out
Publications or patents in Agentic AI or LLMOps.
Thought leadership in industry events or media.
Deep domain expertise in one or more verticals.
Experience with AgentOps, model evaluation, and AI observability tools.
Salary and Other Compensation:
Applicants will be accepted until 1/08/2026
Cognizant will only consider applicants for this position who are legally authorized to work in the United States without company sponsorship.
*Please note, this role is not able to offer visa transfer or sponsorship now or in the future*
The annual salary for this position will be in the range of $120K-$165K depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
Medical/Dental/Vision/Life Insurance
Paid holidays plus Paid Time Off
401(k) plan and contributions
Long-term/Short-term Disability
Paid Parental Leave
Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
Our strength is built on our ability to work together. Our diverse backgrounds offer different perspectives and new ways of thinking. It encourages lively discussions, creativity, productivity, and helps us build better solutions for our clients. We want someone who thrives in this setting and is inspired to craft meaningful solutions through true collaboration.
If you are content with ambiguity, excited by change, and excel through autonomy, we'd love to hear from you!
Apply Now!
Data Analyst - IV (USC/GC Only)
San Francisco, CA
VARITE is looking for a qualified Data Analyst - IV for one of its clients.
WHAT THE CLIENT DOES
A U.S. based regional bank and financial services company that provides a wide range of banking products and services, including personal banking, business banking, mortgages, wealth management, and investment services.
WHAT WE DO
Established in the Year 2000, VARITE is an award-winning minority business enterprise providing global consulting & staffing services to Fortune 1000 companies and government agencies. With 850+ global consultants, VARITE is committed to delivering excellence to its customers by leveraging its global experience and expertise in providing comprehensive scientific, engineering, technical, and non-technical staff augmentation and talent acquisition services.
**Note: We only work on W2; we do not work on C2C/1099 and do not provide visa sponsorship. Kindly refrain from applying if you do not meet these criteria. Thank you for understanding!**
HERE'S WHAT YOU'LL DO:
Job Title: Data Analyst - IV (USC/GC Only)
Duration: 12 Months
Pay Rate: $90-106/hr. on W2
Work Mode: Remote role
Job Description
Essential Responsibilities:
• This is a hands-on role. Applicants should not be managers or architects.
• Provide primary technical support to the end-user community for problems related to software, data issues, data communication, and processing errors.
• Monitor application, production and/or implementation support, both technical and user, for new releases.
• Consult on application testing strategies, document observations and coordinate responses.
• Maintain project and application information including schedule, system scope, requirements, and other pertinent documentation. Make recommendations to management regarding process improvements. Coordinate the information flow between business owners and development staff. Coordinate tasks, functional reviews and application development sessions between the development site and technical work groups, ensuring steps are executed in a timely manner.
• Document end-user interactions into the automated call logging tools for tracking and productivity purposes.
• Validate proposed technical approaches to system or application problem logs and make recommendations for remediation to management.
• Ability to perform involved, independent research and develop highly creative work products and proposals.
• Demonstrate a high degree of creativity in addressing problems/issues and opportunities.
• Demonstrated customer service orientation.
• Ability to work effectively in a highly matrix or virtual organization.
• Strong verbal and written communication skills with an ability to present/facilitate effective training sessions.
• Strong experience in Agile delivery, requirements engineering, backlog development and refinement.
• Strong experience in stakeholder engagement.
• Experience working with distributed teams, and providers of software solutions.
• Broad knowledge of application and data modelling techniques with the ability to identify root cause issues and appropriate solutions.
• Strong experience working in Agile environment/SAFe framework and Agile management tools like Jira.
• Excellent analytical and design skills.
• CBAP, Agile Analysis Certification or similar certifications preferred.
• Must be willing to be on call periodically from 8am-2pm ET or 2pm-8pm ET Monday through Friday.
• Hands-on experience performing technical analysis and troubleshooting on highly complex integrated systems, including diagnostic approaches, determining root causes, and identifying solutions on a data lakehouse hosted in a cloud environment, utilizing technologies such as R, sparklyr, Databricks, Starburst, Collibra, Tableau, Python and/or Alteryx.
Qualifications:
• This is a Sr. Technical Analyst role on a one-year assignment to implement and manage data products, ensuring that our data pipelines are scalable, secure, and efficient.
Requirements:
Specific skill mixes we need from the tech stack:
o Databricks AND (R or sparklyr) OR (Pyspark or Python)
o Databricks AND (Starburst OR Tableau)
o Databricks AND (Collibra OR Alteryx)
HANDS-ON Experience for the following:
• Databricks - PySpark, data quality frameworks, building custom dashboards (see the data-quality sketch after this list)
• R/sparklyr - as a data analyst
• PySpark/Python - with scripting and automation, performance tuning, and debugging capabilities
• Starburst - as a data analyst or Starburst developer or support engineer
• Collibra - as a Collibra developer or support engineer
• Alteryx - as a developer or support engineer
• Tableau - as a developer with performance tuning / debug capabilities
• Immuta - as a security analyst, with strong role-based and attribute-based access control knowledge.
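As a loose illustration of the Databricks/PySpark data-quality work listed above, the sketch below runs a few rule-based checks against a lakehouse table. The table and column names are invented for the example and are not from the client's environment.

```python
# Hedged sketch of simple rule-based data-quality checks in PySpark.
# "raw.customer_accounts" and its columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.table("raw.customer_accounts")

checks = {
    "null_account_id": df.filter(F.col("account_id").isNull()).count(),
    "negative_balance": df.filter(F.col("balance") < 0).count(),
    "duplicate_account_id": df.count() - df.dropDuplicates(["account_id"]).count(),
}
failed = {name: n for name, n in checks.items() if n > 0}
print("data quality violations:", failed or "none")
```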
Manufacturing Data Analyst - Local - w2
Camden, AR
We are seeking an experienced Lead Manufacturing Engineer (Level 5) to join our Statistical Process Control (SPC) & Data Analytics team. The ideal candidate will be highly skilled in data analytics, Power BI dashboard development, and Power Apps creation to support manufacturing optimization and automation.
This role requires strong expertise in data mining, linking datasets in Power BI, and designing analytical tools that enhance decision-making and operational efficiency.
Key Responsibilities
Lead the creation, development, and deployment of advanced Power BI dashboards to deliver actionable insights.
Design, develop, and implement Power Applications to streamline and automate manufacturing workflows.
Execute data mining and establish automated data relationships in Power BI to support reporting and process optimization.
Utilize Statistical Process Control (SPC) methods for monitoring and improving manufacturing processes (a control-limit sketch follows this list).
Partner with cross-functional teams to identify data analytics needs and deliver effective solutions.
Maintain detailed documentation of analytics tools, processes, and best practices.
Mentor junior engineers in data analytics, Power BI, and Power Apps development.
Present technical findings to both technical and non-technical stakeholders clearly and effectively.
Lead complex projects with minimal supervision.
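As a minimal illustration of the SPC monitoring responsibility in the list above, the snippet below computes individuals-chart (I-chart) control limits from a moving range in Python. The file and column names are hypothetical; in practice the limits would typically feed the Power BI dashboards described here.

```python
# Hedged sketch: individuals (I-chart) control limits from a moving range.
# "line3_measurements.csv" and "diameter_mm" are hypothetical names.
import pandas as pd

df = pd.read_csv("line3_measurements.csv")
x = df["diameter_mm"]

moving_range = x.diff().abs().dropna()   # short-term variation between consecutive points
mr_bar = moving_range.mean()

center = x.mean()
sigma_hat = mr_bar / 1.128               # d2 constant for subgroups of size 2
ucl = center + 3 * sigma_hat             # upper control limit
lcl = center - 3 * sigma_hat             # lower control limit

violations = df[(x > ucl) | (x < lcl)]
print(f"CL={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  points out of control={len(violations)}")
```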
Required Qualifications
Bachelor's degree in Mechanical Engineering, Electrical Engineering, Software Engineering, Computer Science, or a related field.
Minimum 9 years of experience in Manufacturing Engineering or a related technical area.
Proven experience in data analytics and Power BI dashboard development.
Expertise in Power Applications design and deployment.
Strong skills in data mining and building automated data relationships in Power BI.
Extensive experience with SPC methodologies.
Excellent analytical, problem-solving, and communication skills.
Proficiency with MS Office tools (Excel, PowerPoint, Word).
Ability to work independently and lead large-scale, complex projects.
Demonstrated experience in mentoring and guiding junior engineers.
Desired Qualifications
Master's degree in Engineering or Computer Science.
Knowledge of Snowflake or other database management systems.
Familiarity with Lean Manufacturing and Six Sigma methodologies.
Experience with additional analytics tools such as Minitab, JMP, or Inductive Automation platforms.
Data Architect
Piscataway, NJ
Data Architecture & Modeling
Design and maintain enterprise-level logical, conceptual, and physical data models.
Define data standards, naming conventions, metadata structures, and modeling best practices.
Ensure scalability, performance, and alignment of data models with business requirements.
Data Governance & Quality
Implement and enforce data governance principles and policies.
Define data ownership, stewardship, data lineage, and lifecycle management.
Lead initiatives to improve data quality, consistency, and compliance.
Enterprise Data Management
Develop enterprise data strategies, including data integration, master data management (MDM), and reference data frameworks.
Define and oversee the enterprise data architecture blueprint.
Ensure alignment between business vision and data technology roadmaps.
Data Management Analyst
Charlotte, NC
We need strong Data Management resources with hands-on data provisioning experience and the ability to distribute data.
Moderate to Advanced SQL skills (writing complex queries is a plus)
Commercial Lending (iHub, WICS, WICDR systems)/Commercial Banking Background
Metadata/Data Governance
Regulatory Reporting
Data Management Framework
SQL
Data Quality
Data Architect
Ridgefield, NJ
Immediate need for a talented Data Architect. This is a 12 month contract opportunity with long-term potential and is located in Basking Ridge, NJ (Hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-93859
Pay Range: $110 - $120/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Requirements and Technology Experience:
Key Skills: ETL, LTMC, SaaS.
5 years as a Data Architect
5 years in ETL
3 years in LTMC
Our client is a leading Telecom Industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, colour, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Warehouse Architect
Albany, NY
For more details, please connect with Gautmi Jain at ************ or email at *********************
This project is to modernize aging Data Warehouse and Data Analytics systems utilizing new on-premises, cloud-based, and container-based technologies.
Required Skills:
84 months experience with a Government Taxation organization, which includes system design, development, maintenance and end user support of Tax & Revenue Accounting Systems and Tax Identification Systems.
84 months experience in design and administration for DB2 LUW on AIX including the use of PureXML and Database Partitioning Feature, traditional DBA tasks of design, tuning, storage management, memory/buffering, backup/recovery and object maintenance. The experience must be on both high-volume transactional systems and analytical systems and involve data purging and roll off, data movement and the shredding natively stored XML data to flattened structures.
84 months of database design and programming experience using Computer Associates IDMS database, programming for an IDMS database using its DMLO language as well as logical design, database navigation, analysis, writing application specifications, testing, documentation.
84 Months experience in administration and support of IBM's InfoSphere Data Replication software to replicate data between databases on DB2 for z/OS and DB2 LUW on AIX (ver 9.5 or higher) including the setup, configuration and maintenance of IBM InfoSphere Data Replication Change Data Capture (CDC), in usage for 24/7 high volume transactional processing databases.
84 months experience with Application performance analysis and tuning
84 months experience with troubleshooting applications on an IBM Mainframe running DB2 z/OS and CICS/MQ
Data Architect
Phoenix, AZ
The Senior Data Engineer & Test in Phoenix, AZ 85029 will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.
Key Responsibilities
1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage. The role includes all of the above skills, plus the following:
· Minimum of 10+ years overall IT experience
· Experienced in waterfall, iterative, and agile methodologies
Technical Requirements:
1. Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark (see the DAG sketch after this list).
2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
7. Unix/Linux: Strong command-line skills in Unix-like environments.
8. SQL: Solid understanding of SQL for data ingestion and analysis.
9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software
11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics or related field, or equivalent work experience.
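As a rough sketch of the Python + PySpark + Airflow stack required above (items 1 and 2), the DAG below wraps a small batch aggregation in a PythonOperator. The DAG id, schedule, and storage paths are assumptions for illustration, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Hedged sketch of an Airflow DAG orchestrating a PySpark batch transform.
# DAG id, bucket paths, and column names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_batch_transform():
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_transform").getOrCreate()
    orders = spark.read.parquet("gs://example-bucket/raw/orders/")  # hypothetical GCS path
    daily = (
        orders.groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.count("*").alias("order_count"), F.sum("amount").alias("total_amount"))
    )
    daily.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_orders/")
    spark.stop()


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=run_batch_transform)
```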
Senior Global Data Warehouse Architect
New York, NY
Job Title: Senior Global Data Warehouse Architect
Duration: Full-Time/Permanent
We are seeking a Senior Global Data Warehouse Architect to design and deliver modern, scalable, and high-performing data platforms that enable advanced financial and operational analytics. The role focuses on defining enterprise data architecture strategy, optimizing data flows, and ensuring data structures are reliable, efficient, and built for analytical depth and scalability.
The architect will develop data platforms that deliver reconciled financial and operational metrics, reduce time-to-insight, and establish standards for data governance, lineage, and model reusability to elevate data quality across the enterprise.
LAW FIRM EXPERIENCE REQUIRED.
Responsibilities
Design and implement enterprise-grade data architectures on Databricks, Snowflake, or similar platforms for real-time and batch analytical workloads.
Build efficient ELT/ETL workflows using Spark, Python, and SQL to ingest, transform, and curate large volumes of structured and unstructured financial and operational data.
Create and maintain dimensional, normalized, and semantic data models to support financial reporting, performance analytics, and forecasting (see the star-schema sketch after this list).
Lead integration design for financial and HRIS systems, ensuring accurate reconciliation and alignment with enterprise data warehouse schemas.
Enforce data governance policies, lineage tracking, and metadata standards to ensure data accuracy, consistency, and regulatory compliance.
Tune query performance, optimize compute and storage costs, and implement data caching and partitioning strategies to improve processing speed and efficiency.
Collaborate with data engineers, BI developers, finance analysts, and technology teams to translate business requirements into scalable data solutions.
Evaluate new tools, frameworks, and architectures to enhance data platform performance and scalability.
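To make the dimensional-modeling responsibility above a bit more concrete, here is a hedged PySpark sketch that assembles a simple star-schema fact table from staged invoices and conformed dimensions. The table names, keys, and billing columns are all invented for illustration and are not tied to any specific practice management system.

```python
# Hedged sketch: curating a star-schema fact table with PySpark.
# "staging.invoices", "dw.dim_client", "dw.dim_matter", and all columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fact_billing_build").getOrCreate()

invoices = spark.table("staging.invoices")
dim_client = spark.table("dw.dim_client")   # conformed client dimension with surrogate key client_key
dim_matter = spark.table("dw.dim_matter")   # conformed matter dimension with surrogate key matter_key

fact_billing = (
    invoices.join(dim_client, "client_id")
    .join(dim_matter, "matter_id")
    .select(
        F.col("client_key"),
        F.col("matter_key"),
        F.to_date("invoice_date").alias("invoice_date"),
        F.col("billed_amount").cast("decimal(18,2)").alias("billed_amount"),
    )
    .withColumn("invoice_month", F.date_format("invoice_date", "yyyy-MM"))
)

# Partitioning by month keeps reporting queries pruned without creating a partition per day.
fact_billing.write.mode("overwrite").partitionBy("invoice_month").saveAsTable("dw.fact_billing")
```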
Qualifications
8+ years of experience in data architecture, data engineering, or BI engineering roles.
Hands-on experience architecting solutions on Databricks and/or Snowflake.
Strong proficiency in SQL, Python, and Spark.
Deep understanding of data modeling principles (3NF, star/snowflake schemas, data vault).
Proven experience designing data lakes and lakehouse architectures.
Familiarity with financial data structures, accounting schemas, and practice management systems such as Elite 3E or Aderant.
Experience supporting financial planning, billing, profitability, and time & expense analytics.
Experience with cloud platforms (AWS, Azure, or GCP), data platform tools (Databricks, Snowflake, S3, Delta Lake), and BI/visualization tools (Power BI, Tableau, or equivalent).
Strong analytical, problem-solving, and communication skills.
Ability to influence and align technical direction across teams.
Relevant certifications preferred: Databricks Certified Data Engineer/Architect, Snowflake SnowPro Advanced: Architect, TOGAF or other enterprise data architecture credentials.
Data Architect
Alpharetta, GA
Strong knowledge of data platforms (Snowflake, Databricks, Azure Synapse, BigQuery, etc.).
Experience with ETL/ELT tools, data modeling, and data governance frameworks.
Familiarity with BI tools (Power BI, Tableau, Qlik) and advanced analytics concepts.
Presales Skills
Excellent communication and presentation skills for both technical and business audiences.
Ability to create solution proposals, architecture diagrams, and cost models.
Experience
15+ years in data engineering, analytics, or related roles, with at least 3 years in presales or solution consulting.
Exposure to cloud platforms (AWS, Azure, GCP) and hybrid architectures.
Education
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Data Architect
Blue Ash, OH
Since 2006, CoStrategix has defined and implemented digital transformation initiatives, data and analytics capabilities, and digital commerce solutions for Fortune 500 and mid-market customers.
CoStrategix provides thought leadership, strategy, and comprehensive end-to-end technology execution to help organizations transform and stay competitive in today's digital world. As a Gixer (employee) at CoStrategix, you will have broad exposure to diverse industries and technologies.
You will work on leading-edge digital projects in areas of Data Engineering, Data Governance, Data Strategy, AI, and Cloud. Gixers operate at the leading edge of technologies, and our projects require compelling human interfaces and modern data platforms.
This role is based at our culture hub in Blue Ash, Ohio.
About this role:
As a Data Architect at CoStrategix, you will define, orchestrate, and implement modern data platforms and architectures. This role is about understanding the current state of data ecosystems, mapping existing data flows and structures, creating an architectural blueprint, and then implementing data strategies and governance frameworks in rapid cycles.
In this role, you will provide the following:
Strategic & Consultative Responsibilities
Act as a trusted data advisor to client stakeholders, clearly communicating trade-offs, guiding decision-making, and influencing the adoption of modern data practices.
Lead stakeholder interviews and working sessions to elicit requirements, clarify use cases, and align on priorities, scope, and success metrics.
Create phased data roadmaps with clear milestones, dependencies, and value outcomes (e.g., time-to-insight, cost reduction, risk reduction) and track progress against them.
Provide architectural input into scoping and pricing of data engagements; ensure solutions balance value, risk, and cost, and support delivery teams in staying aligned to scope and architectural guardrails.
Work closely with sales and account teams to understand customer objectives and translate them into practical, scalable data architecture and solution designs.
Participate in pre-sales engagements, discovery workshops, proposal development, client presentations, and proof-of-concept activities to showcase solution feasibility and value.
Data Governance, Quality & Operating Model
Bring consultative competencies around data governance and data quality, helping clients define guiding principles, policies, and operating models.
Lead development of comprehensive data strategies for clients that align with their business priorities.
Design processes for metadata management, lineage tracking, and master data management (MDM), including stewardship roles and workflows.
Establish and maintain data quality standards, metrics, and monitoring processes to ensure accurate, complete, and timely data across critical domains.
Develop semantic models and curated datasets, and guide adoption of data cataloging and data literacy programs.
Enterprise & Solution Architecture
Design and maintain conceptual, logical, and physical data architectures to support enterprise, analytical, and operational systems.
Assess and recommend data platforms, cloud services, and emerging technologies to meet business needs, while collaborating with Cloud, DevOps, and Security Architects to ensure architectural alignment.
Partner with Data Analysts, BI Developers, and Data Scientists to ensure data architectures enable analytics, visualization, and AI/ML initiatives.
Define non-functional requirements (performance, scalability, resilience, cost, security, and compliance) for data solutions, and ensure they are addressed in the architecture and design.
Maintain architecture decision records, reference architectures, and reusable patterns; define and promote standards and best practices for data modeling, integration, and consumption across teams.
Implementation, Delivery & Enablement
Lead the implementation of scalable, secure, and high-performing data and transformation frameworks that unify data across platforms and enable real-time, batch, and event-driven use cases.
Define and enforce data design standards, patterns, and best practices during implementation to ensure consistency, maintainability, and performance.
Mentor and coach engineering and analytics teams in data design principles, governance frameworks, and architectural discipline to ensure consistency and quality in delivery.
Qualifications:
Bachelor's Degree in Math, Statistics, Computer Science, Information Technology, or a related field
8+ years of experience in data management and architecture roles
3 to 5 years of leading data strategy, governance, or modernization efforts
3 to 5 years of pre-sales, client solutioning, and/or consulting engagement in Data Management
Experience designing and implementing modern data architectures
Current understanding of best practices regarding data security, governance, and regulatory compliance
Experience in data modeling, data engineering, and analytics platform architecture
Experience with data engineering tooling such as Databricks, Snowflake, Synapse, BigQuery, Kafka, and dbt
Experience with software development, DevOps best practices, and automation methodologies
Excellent leadership and negotiation skills are necessary to work effectively with colleagues at various levels of the organization and across multiple locations
Communicate complex issues crisply and concisely to various levels of management
Coaching and mentoring skills - ability to adapt to all levels of the organization
Strong collaboration skills and excellent verbal and written communication skills
About CoStrategix
We make CoStrategix an awesome place to work, offering a total rewards package that includes comprehensive benefits starting on day one. Benefits include medical, dental, vision, disability, and life insurance, as well as an EAP and 401(k) retirement plan. We are a flexible hybrid workplace committed to a culture of curiosity, collaboration, learning, self-improvement, and, above all, fun. We have been named a finalist for the Cincinnati Business Courier's Best Places to Work Awards for 4 consecutive years.
Do the Right Thing. Always.
At CoStrategix, we are passionate about our core values. Diversity, equity & inclusion (DE&I) are part of our core values. Every Gixer (employee) has an opportunity for success regardless of their race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. Creating an environment where everyone, from any background, can do their best work is the right thing to do.
Data Analyst
Piscataway, NJ
Role: HEDIS Data Analyst
Key Responsibilities
Must have HEDIS Analytics & Reporting
Develop, validate, and maintain HEDIS measure calculations based on NCQA specifications.
Extract, transform, and analyse data from claims, eligibility, EHR, pharmacy, and lab systems.
Support annual HEDIS submissions, including numerator/denominator validation, audit documentation, and data quality checks (an illustrative rate calculation follows this section).
Conduct trending, gap analysis, and performance monitoring for all HEDIS measures.
Generate weekly/monthly dashboards for stakeholder consumption.
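As a purely illustrative sketch of the numerator/denominator validation mentioned above, and not an NCQA measure specification, the snippet below computes a generic rate from two hypothetical extracts.

```python
# Hedged sketch: a generic numerator/denominator rate check.
# File and column names are hypothetical; real HEDIS logic follows NCQA specifications.
import pandas as pd

members = pd.read_csv("eligible_members.csv")      # hypothetical denominator extract
events = pd.read_csv("qualifying_services.csv")    # hypothetical numerator events (claims/EHR)

denominator = members["member_id"].nunique()
numerator = events.loc[events["member_id"].isin(members["member_id"]), "member_id"].nunique()

rate = numerator / denominator if denominator else 0.0
print(f"numerator={numerator}, denominator={denominator}, rate={rate:.1%}")
```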
Data Engineering & ETL Support
Develop and optimize SQL queries, ETL pipelines, and data transformations.
Collaborate with IT/data engineering teams to improve data quality and resolve inconsistencies.
Create field mappings, data lineage documentation, and technical specifications.
Chart Review / Hybrid Measures
Produce and manage provider retrieval lists for medical record collection.
Validate chart abstraction output for accuracy and completeness.
Support provider outreach related to missing documentation and care gaps.
Cross-Functional Collaboration
Partner with Quality Improvement, Clinical, Provider Relations, and Compliance teams to support initiatives.
Provide guidance on data anomalies, measure interpretation, and technical HEDIS questions.
Present insights and trend analyses to leadership.
Audit & Compliance
Assist with HEDIS Compliance Audit preparation and documentation.
Ensure adherence to NCQA guidelines and regulatory requirements.
Maintain audit trails and detailed reporting artifacts.
Technical Skills
Advanced SQL (required).
Experience working with healthcare payer datasets (claims, eligibility, provider, EMR, pharmacy, lab).
Understanding of healthcare coding standards (ICD-10, CPT, HCPCS, LOINC, NDC).
HEDIS / Healthcare Domain Knowledge
Deep understanding of NCQA HEDIS technical specifications.
Experience with hybrid and administrative measures.
Knowledge of HEDIS audit processes and regulatory timelines.
Familiarity with care gap workflows and quality improvement strategies.
Cloud Data Architect
McLean, VA
Purpose:
As a Cloud Data Architect, you'll be at the forefront of innovation - guiding clients and teams through the design and implementation of cutting-edge solutions using Databricks, modern data platforms, and cloud-native technologies. In this role, you won't just architect solutions - you'll help grow a thriving Analytics & Data Management practice, act as a trusted Databricks SME, and bring a business-first mindset to every challenge. You'll have the opportunity to lead delivery efforts, build transformative data solutions, and cultivate strategic relationships with Fortune 500 organizations.
Key Result Areas and Activities:
Architect and deliver scalable, cloud-native data solutions across various industries.
Lead data strategy workshops and AI/ML readiness assessments.
Develop solution blueprints leveraging Databricks (Lakehouse, Delta Lake, MLflow, Unity Catalog).
Conduct architecture reviews and build proof-of-concept (PoC) prototypes on platforms like Databricks, AWS, Azure, and Snowflake.
Engage with stakeholders to define and align future-state data strategies with business outcomes.
Mentor and lead data engineering and architecture teams.
Drive innovation and thought leadership across client engagements and internal practice areas.
Promote FinOps practices, ensuring cost optimization within multi-cloud deployments.
Support client relationship management and engagement expansion through consulting excellence.
Roles & Responsibilities
Essential Skills:
10+ years of experience designing and delivering scalable data architecture and solutions.
5+ years in consulting, with demonstrated client-facing leadership.
Expertise in Databricks ecosystem including Delta Lake, Lakehouse, Unity Catalog, and MLflow.
Strong hands-on knowledge of cloud platforms (Azure, AWS, Databricks, and Snowflake).
Proficiency in Spark and Python for data engineering and processing tasks.
Solid grasp of enterprise data architecture frameworks such as TOGAF and DAMA.
Demonstrated ability to lead and mentor teams, manage multiple projects, and drive delivery excellence.
Excellent communication skills with proven ability to consult and influence executive stakeholders.
Desirable Skills
Recognized thought leadership in emerging data and AI technologies.
Experience with FinOps in multi-cloud environments, particularly with Databricks and AWS cost optimization.
Familiarity with data governance and data quality best practices at the enterprise level.
Knowledge of DevOps and MLOps pipelines in cloud environments.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or related fields.
Professional certifications in Databricks, AWS, Azure, or Snowflake preferred.
TOGAF, DAMA, or other architecture framework certifications are a plus.
Qualities:
Self-motivated and focused on delivering outcomes for a fast growing team and firm
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work with teams and clients in different time zones
Research focused mindset
"Infocepts is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law."
Data Analyst
Deerfield, IL
Our client is currently seeking a Data Analyst
Hybrid, based in the northern suburbs of Chicago
W2 Only
In this role, you'll evaluate usage, adoption, and performance of IIN systems, identify data mining opportunities, and integrate findings with key performance indicators. You'll collaborate with cross-functional teams to deliver dashboards, reports, and scorecards that inform strategy.
What You'll Do
Analyze usage, adoption, and efficacy of IIN systems.
Identify and execute data mining opportunities.
Integrate insights with traditional KPIs.
Provide analytical support for projects and stakeholders.
Develop and share dashboards, reports, and scorecards.
Ensure data accuracy, format, and availability.
What We're Looking For
Bachelor's degree in Statistics, Data Science, Mathematics, or related field.
Strong analytical skills and ability to interpret complex datasets.
Advanced Excel skills (pivot tables, VLOOKUP, Power Query, visualization).
Expertise in Tableau for dashboard design and optimization.
Hands-on experience with Snowflake and SQL for data warehousing and analysis.
Strong business acumen and ability to align data strategies with goals.
Excellent communication and collaboration skills.
Nice to Have
Experience with KPI reporting and BI tools.
Familiarity with advanced data visualization techniques.
Rate: $65-90/HR
Data Analyst - Statistician
Chicago, IL
We're looking for a data-driven professional to join our team as a Data Analyst - Statistician, responsible for turning complex data into actionable insights that drive business decisions. In this role, you'll evaluate usage, adoption, and efficacy of systems, uncover data mining opportunities, and integrate findings with traditional KPIs. You'll also support analytical projects, guide data formatting and availability, and deliver impactful reports, scorecards, and dashboards.
What You'll Do:
Analyze and interpret large datasets to identify trends and opportunities.
Evaluate system performance and effectiveness, integrating insights with KPIs.
Provide analytical support for cross-functional projects.
Design and share reports, dashboards, and scorecards to inform stakeholders.
Collaborate with teams to ensure data accessibility and quality.
Perform additional duties as assigned by your manager.
Core Competencies:
Data Analysis & Interpretation: Ability to transform complex data into actionable insights.
Excel Expertise: Advanced skills in pivot tables, VLOOKUP, Power Query, and data visualization.
Tableau: Skilled in creating interactive dashboards and optimizing performance.
Snowflake: Hands-on experience with data warehousing and efficient SQL queries.
SQL Proficiency: Strong ability to write and optimize queries for data extraction and analysis.
Business Acumen: Align data strategies with organizational goals.
Collaboration & Communication: Work effectively with product owners, engineers, and stakeholders.
Data Analyst with Pyspark & AB Testing
Sunnyvale, CA
In the role of Data Analyst with PySpark & A/B Testing experience, you will be responsible for solving business problems for our Retail/CPG clients through data-driven insights. Your role will combine a judicious and tactful blend of Hi-Tech domain knowledge, analytical experience, client-interfacing skills, solution design, and business acumen, so your insights not only enlighten the clients but also pave the way for launching deeper into future analysis. You will advise clients and internal teams through short-burst, high-impact engagements on identifying business problems and solving them through suitable approaches and techniques pertaining to learning and technology. You will effectively communicate data-derived insights to non-technical audiences and mentor junior or aspiring consultants/data scientists. You will play a key role in building components of a framework or product while addressing practical business problems. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
Required Qualifications:
Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
Candidate must be located within commuting distance of Sunnyvale, CA or be willing to relocate to this location. This position may require travel within the US.
Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
At least 4 years of experience in Information Technology
Proven applied experience in exploratory data analysis and in devising, deploying, and servicing statistical models
Strong hands-on experience with data mining and data visualization (Tableau), A/B testing, and SQL for developing data pipelines to source and transform data (an illustrative A/B test appears after this list)
Strong experience using Python, Advanced SQL and PySpark
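As an illustrative example of the A/B testing experience listed above, the snippet below runs a two-proportion z-test on made-up conversion counts using statsmodels; the figures are not client data.

```python
# Hedged sketch: two-proportion z-test for an A/B conversion experiment.
# Counts are invented illustration data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 470]     # converted users in control, treatment
visitors = [10000, 10000]    # exposed users in each arm

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
lift = conversions[1] / visitors[1] - conversions[0] / visitors[0]
print(f"z={z_stat:.2f}, p={p_value:.4f}, absolute lift={lift:.2%}")
# A small p-value (e.g. < 0.05) suggests the lift is unlikely under equal conversion rates;
# practical significance and experiment design still need separate review.
```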
Preferred Qualifications:
Advanced degree with Master's or above in area of quantitative discipline such as Statistics, Applied Math, Operations Research, Computer Science, Engineering or Physics or a related field
Marketing domain background (web analytics, clickstream data analysis, and other KPIs for marketing campaigns)
Knowledge of Machine Learning techniques
The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email or face to face. Travel may be required as per the job requirements.
EEO/About Us :
About Us
Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.
EEO
Infosys provides equal employment opportunities to applicants and employees without regard to race; color; sex; gender identity; sexual orientation; religious practices and observances; national origin; pregnancy, childbirth, or related medical conditions; status as a protected veteran or spouse/family member of a protected veteran; or disability.
Data Architect
Sunrise, FL
Job Description:
14+ years of overall IT experience with expertise in Data landscape - Data Warehouse, Data lake etc.
Hands on experience in Big Data and Hadoop ecosystem; Strong skills in SQL, Python or Spark
Proficient in Data Warehousing concepts and Customer Data Management (Customer 360)
Experience in GCP platform - Dataflow, Dataproc, Kubernetes containers etc.
Expertise in deep Data exploration and Data analysis
Excellent communication and interpersonal skills
Senior Data Architect
Edison, NJ
Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
Troubleshoot complex data and performance issues and propose long-term architectural solutions.
Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
Integration Architect (B2Bi/EDI)
Dearborn, MI
Miracle Software Systems, Inc. is actively looking for an Architect Senior to fill a W2 position with one of our direct clients in Dearborn, MI.
Architect Senior
Dearborn, MI
Long term
• Responsible for application design, installation, configuration and development support activities for B2Bi, Web and Mainframe applications
• Demonstrate a good level of expertise in one or more technical environments (software packages, programming languages, databases, operating systems), and a good knowledge of EDI solutions
• Responsible for quality of the solution in each environment and for each trading partner
• Coordinate with related work streams/teams, including infrastructure teams, Customer Service Division, and Manufacturing
• Perform systems tuning and diagnostics on UNIX, LINUX, and Windows platforms
Skills Required: Solution Architecture, Unix, Network Protocols & Standards, Session Initiation Protocols, GCP
Skills Preferred: Mainframe Systems
Experience Required:
• 5+ years of experience in a wide range of supply chain management business areas, including system development, business intelligence, business processes, and customer relations
• 5+ years of experience with Enterprise Application Integration, middleware, or other integration technologies
• 5+ years of experience with architecture diagrams, sequence diagrams, UML diagrams, network connectivity, and communication protocols
• Proficiency with EDI mapping tools, ANSI X12, EDIFACT, UNIX, LINUX, Windows, SQL Server, MQ Series
Experience Preferred: Familiarity with web and mainframe applications
Education Required: Bachelor's Degree
Education Preferred:
Additional Safety Training/Licensing/Personal Protection Requirements:
Additional Information:
Hybrid with 4 days a week expected onsite.