Job title: Data Specialist (PIM)
Only green card holders or U.S. citizens are eligible.
JOB RESPONSIBILITIES
DATA MANAGEMENT:
· Manage and maintain product data within Stibo STEP and PDX platforms, including data entry, enrichment, and validation.
· Collaborate with insulation product managers and technical experts to gather and validate product information, including specifications, certifications, and compliance data.
· Implement data governance policies and standards to maintain data quality and integrity within PIM systems.
· Coordinate with external partners and customer portals to ensure accurate and timely delivery of product information.
· Monitor and analyze data syndication performance metrics, identifying opportunities for optimization and improvement.
· Provide support and training to internal stakeholders on PIM systems and data management best practices.
· Troubleshoot technical issues and escalate to IT or vendor partners as needed for resolution.
· Stay informed of industry trends and best practices in PIM systems and data management, incorporating new techniques and technologies to enhance efficiency and effectiveness.
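The data entry, enrichment, and validation duties above can be loosely illustrated with a small governance check. This is a sketch only: the attribute names and rules are hypothetical, and nothing here reflects the actual Stibo STEP or PDX APIs.

```python
# Hypothetical product-attribute completeness check, the kind of rule a PIM
# data governance policy might enforce before syndication. Illustrative only;
# not the Stibo STEP API.

REQUIRED = ("sku", "name", "r_value", "fire_rating")

def validate_product(record):
    """Return a list of governance issues for one product record."""
    issues = [f"missing attribute: {a}" for a in REQUIRED if not record.get(a)]
    if record.get("r_value"):
        try:
            if float(record["r_value"]) <= 0:
                issues.append("r_value must be positive")
        except (TypeError, ValueError):
            issues.append("r_value must be numeric")
    return issues

good = {"sku": "INS-001", "name": "Batt R-13", "r_value": "13", "fire_rating": "A"}
bad  = {"sku": "INS-002", "name": "", "r_value": "x", "fire_rating": "A"}

print(validate_product(good))  # []
print(validate_product(bad))   # missing name, non-numeric r_value
```

In practice such rules would live in the PIM platform's own validation layer; the sketch just shows the shape of the checks.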
COLLABORATE AND FOSTER TEAMWORK:
· Collaborate cross-functionally across IT, business, and digital teams within a matrix organization.
JOB REQUIREMENTS
MINIMUM QUALIFICATIONS:
· Bachelor's degree in business administration, information systems, or related field.
· At least 2 years of experience in product information management, data analysis, or related field.
· Proficiency in PIM systems, particularly Stibo STEP and PDX, with experience in data entry, enrichment, and validation.
· Strong understanding of data governance principles and best practices, including data quality management and compliance.
· Excellent communication and interpersonal skills, with the ability to collaborate effectively with internal and external stakeholders.
· Detail-oriented with a focus on accuracy and precision.
· Ability to prioritize and manage multiple tasks in a fast-paced environment.
· Proficiency in Microsoft Office Suite, particularly Excel and PowerPoint.
$58k-74k yearly est. 2d ago
3DEXPERIENCE Senior Solution Architect
I3 Infotek Inc. 3.9
Akron, OH
We are seeking a highly experienced 3DEXPERIENCE Senior Solution Architect to lead a greenfield PLM implementation for automotive customers. The ideal candidate will drive PLM strategy, solution architecture, and business transformation using the 3DEXPERIENCE platform, working closely with business stakeholders and technical teams to deliver scalable, future-ready PLM solutions.
Key Responsibilities
Lead greenfield implementation programs for Automotive PLM customers
Define and build the PLM roadmap in collaboration with business stakeholders
Drive requirement elicitation, analysis, and mapping to PLM solutions
Architect and deliver end-to-end PLM solutions on the 3DEXPERIENCE platform
Act as a primary customer-facing solution architect, ensuring alignment with business goals
Lead and manage Organizational Change Management (OCM) initiatives
Provide strategic guidance across engineering, manufacturing, and after-sales PLM processes
Mentor and guide technical and functional teams during implementation
PLM Process Expertise Required
End-to-end PLM lifecycle: Requirements → Engineering → Manufacturing → After Sales
Engineering processes:
Design management
BOM (EBOM/MBOM)
Requirement Management
Variant & Configuration Management
Project & Program Management
Manufacturing processes:
MBOM
Manufacturing planning and execution in PLM context
Technical Skills & Experience
Hands-on experience with 3DEXPERIENCE platform (V6 2022x or later preferred)
Strong exposure to CAD / CAM / CAE tools and integrations
Experience with Cloud migration and SaaS adoption
Familiarity with AWS cloud platform (preferred)
Strong understanding of PLM architecture, integrations, and data models
Experience & Qualifications
15+ years of experience in PLM / 3DEXPERIENCE ecosystem
10+ years of consulting experience in solution architecture and PLM implementations
Proven experience leading large-scale PLM transformation programs
Strong stakeholder management and communication skills
Ability to bridge business requirements with technical solutions
Preferred Industry Experience
Automotive or Manufacturing domain experience
Experience working with global enterprise customers
$107k-148k yearly est. 3d ago
PLM Solution Architect / Next Generation Production System Specialist
I3 Infotek Inc. 3.9
Marysville, OH
We are implementing a Global Enterprise PLM (Product Lifecycle Management) system and are seeking an experienced PLM Solution Architect to support Next Generation Production System planning and execution. The role focuses on defining requirements, designing future-state PLM architecture, and supporting implementation across engineering and manufacturing domains in a global environment.
Key Responsibilities
Collaborate with business application owners and stakeholders to gather, analyze, and document requirements
Define and maintain PLM system architecture, process flows, and technical documentation
Guide and supervise setup of development, QA, and production environments
Author and configure processes within Teamcenter / 3DEXPERIENCE
Develop and execute use cases and POCs, analyze results, and recommend solutions
Work with PLM software vendors to resolve technical issues and define best practices
Manage project schedules, coordinate evaluations, and prepare status and progress reports
Participate in weekly global meetings and cross-functional discussions
Support system enhancement proposals, Jira backlog creation, and solution clarifications with offshore teams and Honda Japan
Conduct testing, user training, and change management to ensure smooth adoption
PLM Process Areas
Engineering Bill of Materials (EBOM)
Manufacturing Bill of Materials (MBOM)
Bill of Process (BOP)
Bill of Equipment (BOE)
Prototype Production
Volume / Mass Production
Required Skills & Qualifications
Strong hands-on experience with Teamcenter or 3DEXPERIENCE architecture and configuration
Proven expertise in BOP and BOE management
Solid understanding of PLM system landscapes and integration with manufacturing/production systems
Experience in process authoring within PLM platforms
Ability to create technical documentation, architecture diagrams, and process models
Strong background in requirements gathering, use case development, and POC execution
Experience using Jira for backlog and issue management
Strong project coordination and stakeholder management skills in global environments
Preferred Qualifications
Exposure to prototype and volume production processes
Experience working with global teams and offshore delivery models
Knowledge of automotive manufacturing and production systems
$85k-125k yearly est. 3d ago
Data Architect
Newfire Global Partners 3.8
Boston, MA
Department
Data
Employment Type
Full Time
Location
US
Workplace type
Fully remote
Compensation
$201,300 - $236,500 / year
Care Lumen is a visionary healthcare technology company on a mission to connect people, care, and data, so every decision leads to better outcomes. Join a mission-driven organization where your work directly impacts patient outcomes while being supported by a culture built on five core values: being purpose-driven and user-obsessed, maintaining curiosity with accountability, building fast while learning faster, upholding uncompromising integrity, and working together to win together.
As a rapidly growing healthcare technology company, you'll have the opportunity to shape the future of clinical workflow platforms that make a real difference in healthcare delivery. Join our team where curiosity, accountability, and collaboration drive innovation while comprehensive support programs ensure you can thrive both personally and professionally.
$201.3k-236.5k yearly 12d ago
Data Analyst - REMOTE
PTP 3.9
Austin, TX
PTP is a fast-growing system integrator that offers strategic Customer Experience (CX) solutions to our clients. We are looking for a Data Analyst to help us design and deliver CX solutions that provide our clients with a beautiful customer journey that achieves results. At PTP we value aptitude and creativity as well as experience. We are a diverse organization and are looking for bright, passionate and committed professionals who strive to be the best at what they do.
Responsibilities
Analyze caller behavior data from IVR applications using SQL and Microsoft Excel
Design and develop custom Tableau reports to visualize caller behavior data, identify trends and areas for improvement
Collaborate with internal and external stakeholders to identify opportunities for improving the IVR experience based on analysis findings
Develop and maintain a deep understanding of IVR application functionality and user flows
Provide data-driven insights and recommendations to inform design decisions and improve overall caller experience
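The responsibilities above center on querying IVR caller-behavior data with SQL and surfacing trends. A toy sketch of that kind of analysis, using an in-memory SQLite database with a hypothetical event schema (table and column names are illustrative, not from any real IVR platform):

```python
import sqlite3

# Hypothetical IVR event log: one row per caller step (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ivr_events (call_id TEXT, node TEXT, outcome TEXT);
INSERT INTO ivr_events VALUES
  ('c1', 'main_menu', 'continued'),
  ('c1', 'billing',   'completed'),
  ('c2', 'main_menu', 'continued'),
  ('c2', 'billing',   'abandoned'),
  ('c3', 'main_menu', 'abandoned');
""")

# Abandonment rate per IVR node -- the kind of trend a Tableau
# report built on this data would visualize.
rows = conn.execute("""
    SELECT node,
           ROUND(AVG(outcome = 'abandoned'), 2) AS abandon_rate,
           COUNT(*) AS calls
    FROM ivr_events
    GROUP BY node
    ORDER BY abandon_rate DESC
""").fetchall()

for node, rate, calls in rows:
    print(node, rate, calls)
```

The `AVG` over a boolean expression is a compact SQLite idiom for a rate; in production the same query would typically run against the contact-center platform's reporting database.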
Requirements
Bachelor's degree in Information Systems, Computer Science, Statistics, or related field
2+ years of experience in data analysis, preferably in an IVR or contact center environment
Experience with Tableau or other business intelligence tools is highly desirable
Proficiency in Microsoft Excel and SQL
Strong analytical and problem-solving skills
Excellent communication and collaboration skills
Desired Experience
Familiarity with IVR platforms and technologies (e.g., Genesys, Avaya, Nuance)
Knowledge of user experience (UX) design principles and human-computer interaction
$74k-95k yearly est. 60d+ ago
Distinguished Data Systems Architect, Data Engineering
Gitlab 4.3
Remote
GitLab is an open-core software company that develops the most comprehensive AI-powered DevSecOps Platform, used by more than 100,000 organizations. Our mission is to enable everyone to contribute to and co-create the software that powers our world. When everyone can contribute, consumers become contributors, significantly accelerating human progress. Our platform unites teams and organizations, breaking down barriers and redefining what's possible in software development. Thanks to products like Duo Enterprise and Duo Agent Platform, customers get AI benefits at every stage of the SDLC.
The same principles built into our products are reflected in how our team works: we embrace AI as a core productivity multiplier, with all team members expected to incorporate AI into their daily workflows to drive efficiency, innovation, and impact. GitLab is where careers accelerate, innovation flourishes, and every voice is valued. Our high-performance culture is driven by our values and continuous knowledge exchange, enabling our team members to reach their full potential while collaborating with industry leaders to solve complex problems. Co-create the future with us as we build technology that transforms how the world develops software.
An overview of this role
Join GitLab as a Distinguished Data Systems Architect to drive our strategic data platform evolution. You'll architect scalable, distributed solutions that transform how we manage and leverage data across our SaaS and self-managed deployments, supporting enterprise-scale growth and innovation.
Some examples of our projects:
Creating a unified architectural blueprint for GitLab's data ecosystem that aligns SaaS and self-managed platforms on shared patterns and standards
Designing monetizable data services and APIs with strong governance and observability to support new product offerings and revenue streams
What you'll do
Drive architectural vision for scalable, distributed data systems across SaaS and self-managed deployments, designing database solutions that balance OLTP/OLAP performance, scalability, and cost-efficiency
Establish enterprise data governance frameworks including lineage, quality controls, versioning, and compliance practices that meet regulatory requirements across global markets
Architect monetizable data services and APIs with semantic models serving internal analytics and external product offerings, enabling new revenue streams while maintaining security and performance SLAs
Create a cohesive architectural blueprint of GitLab's data ecosystem, identifying gaps against modern platforms and establishing opinionated design principles grounded in proven cloud-native patterns
Design event-driven architectures and end-to-end data lifecycle systems spanning ingestion, orchestration (Argo, Airflow, Kubernetes), transformation workflows, and unified metadata management with comprehensive observability
Partner with product and engineering leadership to embed AI-driven patterns into data infrastructure and align senior engineering leaders on common design tenets and platform standards
Transform ambiguous business challenges into strategic technical roadmaps, leading high-stakes architectural engagements where data platforms create measurable competitive differentiation
What you'll bring
Experience architecting large-scale distributed data systems in complex, regulated domains with unified platforms integrating cloud-native compute, orchestration, and semantic modeling
Demonstrated leadership building multi-modal data services with strong developer experience principles, focusing on monetization, governance, and data product lifecycle management
Hands-on expertise with modern data stack technologies including Python, Docker, Airflow, Trino, Postgres, distributed query engines, and graph-based metadata systems
Advanced knowledge bridging cloud and on-premises deployments with automation, developer self-service focus, and data integration through connector marketplaces
Deep understanding of data processing paradigms and standards including synchronous vs. asynchronous processing, schema management, logical data modeling, and formats like OpenTelemetry, OpenMetadata, and OpenLineage
Experience with AI-driven architectures and emerging technologies including model orchestration, agentic patterns, and standards like MCP (Model Context Protocol)
Strong architectural opinions on cost-aware, resilient solutions that optimize entire data lifecycle decisions with focus on scalability and performance trade-offs
Passion for open source platforms, team mentorship, and collaborative values with ability to build scalable solutions that align with organizational culture and technical excellence
Design and implement Model Driven Architecture (MDA) framework to establish clear separation between logical/conceptual data models and platform-specific physical implementations, enabling agility and reducing technical debt across enterprise data systems
About the team
Data Engineering and Monetization is a newly formed function within the Engineering Org with a mission to build a comprehensive foundation of data platforms with responsible data architecture that scales.
The base salary range for this role's listed level is currently for residents of the United States only. This range is intended to reflect the role's base salary rate in locations throughout the US. Grade level and salary ranges are determined through interviews and a review of education, experience, knowledge, skills, abilities of the applicant, equity with other team members, alignment with market data, and geographic location. The base salary range does not include any bonuses, equity, or benefits. See more information on our benefits and equity. Sales roles are also eligible for incentive pay targeted at up to 100% of the offered base salary.
United States Salary Range: $219,100 - $328,700 USD
How GitLab will support you
Benefits to support your health, finances, and well-being
Flexible Paid Time Off
Team Member Resource Groups
Equity Compensation & Employee Stock Purchase Plan
Growth and Development Fund
Parental leave
Home office support
Please note that we welcome interest from candidates with varying levels of experience; many successful candidates do not meet every single requirement. Additionally, studies have shown that people from underrepresented groups are less likely to apply to a job unless they meet every single qualification. If you're excited about this role, please apply and allow our recruiters to assess your application.
Country Hiring Guidelines: GitLab hires new team members in countries around the world. All of our roles are remote, however some roles may carry specific location-based eligibility requirements. Our Talent Acquisition team can help answer any questions about location after starting the recruiting process.
Privacy Policy: Please review our Recruitment Privacy Policy. Your privacy is important to us.
GitLab is proud to be an equal opportunity workplace and is an affirmative action employer. GitLab's policies and practices relating to recruitment, employment, career development and advancement, promotion, and retirement are based solely on merit, regardless of race, color, religion, ancestry, sex (including pregnancy, lactation, sexual orientation, gender identity, or gender expression), national origin, age, citizenship, marital status, mental or physical disability, genetic information (including family medical history), discharge status from the military, protected veteran status (which includes disabled veterans, recently separated veterans, active duty wartime or campaign badge veterans, and Armed Forces service medal veterans), or any other basis protected by law. GitLab will not tolerate discrimination or harassment based on any of these characteristics. See also GitLab's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know during the recruiting process.
$90k-131k yearly est. 11d ago
Data Migration Specialist
Intralinks 4.7
Remote
As a leading financial services and healthcare technology company based on revenue, SS&C is headquartered in Windsor, Connecticut, and has 27,000+ employees in 35 countries. Some 20,000 financial services and healthcare organizations, from the world's largest companies to small and mid-market firms, rely on SS&C for expertise, scale, and technology.
Job Description
Data Migration Specialist
Locations: Remote
Get To Know Us:
The Intralinks Alts Services team is the strategic growth lever for the company. By enabling Intralinks customers, both existing and new, to upgrade to the latest Intralinks products, you will be the tip of the spear for the company's growth in 2026 and beyond. In this role you will be responsible for leading, directing, and providing delivery of Intralinks data projects from a variety of sources. You will act as the primary point of contact in dealing with customer historical data. You will help retrieve their historical data, transform it, and review it with them prior to their transition into the Intralinks ecosystem.
Why You Will Love It Here!
Flexibility: Hybrid Work Model and Business Casual Dress Code, including jeans
Your Future: 401k Matching Program, Professional Development Reimbursement
Work/Life Balance: Flexible Personal/Vacation Time Off, Sick Leave, Paid Holidays
Your Wellbeing: Medical, Dental, Vision, Employee Assistance Program, Parental Leave
Wide Ranging Perspectives: Committed to Celebrating the Variety of Backgrounds, Talents and Experiences of Our Employees
Training: Hands-On, Team-Customized, including SS&C University
Extra Perks: Discounts on fitness clubs, travel and more!
What You Will Get To Do:
Work with customer subject matter experts and Intralinks project team to identify, define, collate, document, and communicate data migration requirements
Conduct deep dive data analysis of the customer current state to validate customer requirements and define the scope of the migration
Strategize and plan the entire legacy system to new Intralinks product migration considering risks, timelines, and potential impacts
Work with the customer to map legacy data to new Intralinks product.
Analyze and cleanse data where necessary
Oversee the direct migration of data, which may require unexpected adjustments to the process and schedule
Provide regular status updates to customer and Intralinks migration teams
Oversee the quality control process to ensure all data has been migrated and accounted for
Document everything from the strategies used to the exact migration processes put in place, including any fixes or adjustments made
Report any issues encountered to Intralinks support
Conduct regular meetings with the product management team to prioritize and resolve issues that are critical to the success of the migration process
Develop best practices, processes, and standards to continuously improve the Intralinks data migration process
Ensure compliance with regulatory requirements and guidelines for all migrated data
What You Will Bring:
Bachelor's degree in information management systems, computer science, or related field, or 3 years of related work experience
Relevant experience in either software implementation or data migration
Exceptional attention to detail in data
Strong data skills - analysis, transformation, validation
Ability to maintain data integrity and evaluate logical cohesion during complex data transformations
Strong Excel skills (XLookups, Pivots, Data Sources, Queries)
Working knowledge of Python scripting - setting up environments, modifying, and testing code
Familiarity with operation of SQL databases and query structure
Experience working with clients as a technical resource and communicating difficult concepts
Experience working with clients to keep projects focused, on track, and on time
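The Python, transformation, and validation skills listed above might be exercised in a task like the following sketch. The record shapes and field names are hypothetical, invented for illustration; they do not come from Intralinks.

```python
# Minimal sketch of a migration transform-and-validate step, assuming
# hypothetical legacy and target record shapes (field names are illustrative).

def transform(legacy_row):
    """Map a legacy record to the target schema, normalizing fields."""
    return {
        "fund_id": legacy_row["FundID"].strip().upper(),
        "amount": round(float(legacy_row["Amt"]), 2),
    }

def validate(legacy_rows, migrated_rows):
    """Check row counts match and no fund identifier was lost in transit."""
    issues = []
    if len(legacy_rows) != len(migrated_rows):
        issues.append("row count mismatch")
    legacy_ids = {r["FundID"].strip().upper() for r in legacy_rows}
    migrated_ids = {r["fund_id"] for r in migrated_rows}
    for missing in sorted(legacy_ids - migrated_ids):
        issues.append(f"missing fund: {missing}")
    return issues

legacy = [{"FundID": " f-101 ", "Amt": "250.509"}, {"FundID": "F-102", "Amt": "99"}]
migrated = [transform(r) for r in legacy]
print(validate(legacy, migrated))  # an empty list means the checks passed
```

Real migrations add many more integrity checks (referential links, value-level reconciliation), but the transform/validate split shown here is the common backbone.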
Thank you for your interest in SS&C! If applicable, to further explore this opportunity, please apply directly with us through our Careers page on our corporate website: ************************
Unless explicitly requested or approached by SS&C Technologies, Inc. or any of its affiliated companies, the company will not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services.
SS&C offers excellent benefits including health, dental, 401k plan, tuition and professional development reimbursement plan.
SS&C Technologies is an Equal Employment Opportunity employer and does not discriminate against any applicant for employment or employee on the basis of race, color, religious creed, gender, age, marital status, sexual orientation, national origin, disability, veteran status or any other classification protected by applicable discrimination laws.
Salary is determined by various factors including, but not limited to, relevant work experience, job-related knowledge, skills, abilities, business needs, and geographic regions. NY: Salary range for the position: $100,000 to $110,000 USD.
$76k-95k yearly est. 13d ago
Data Architect - SME (Remote)
Ishpi Information Technologies 4.4
North Charleston, SC
NIWC Atlantic Software Services requires ISHPI to provide emerging Department of Defense (DoD) and Department of Navy (DON) information technology and to design, develop, engineer, and maintain systems that will improve customer organizational efficiency. ISHPI provides support in the areas of Administrative Services, Applications Integration Management, Corporate Strategy Planning and Execution, Information Assurance (IA), Information Resource Management, and Information Technology (IT) Operations. Further tasks include new standards engineering, prototype installation, application development, data interoperability, system design, system management and maintenance, data collection, analysis, and other management and implementation efforts in support of data translation, data mediation, and data mapping.
Responsibilities
ISHPI is seeking a Data Architect who can leverage experience and expertise in data exploration, engineering, and ETL to explore, architect, develop, and deploy scripts for processing structured and unstructured data into usable data formats for long-term storage, search, and analysis.
Qualifications
Education: Technical training in data design, database architecture, or metadata management platforms.
Experience: Eighteen (18) years of hands-on experience architecting software solutions for IT projects, including these areas: designed and built relational databases in a data warehouse environment, which includes data design, database architecture, and metadata repository creation; performed data access analysis and design, and archive/recovery design and implementation.
Key Skills:
Advanced understanding of and ability to apply principles, theories, and concepts of all-things data - databases, data integration, data governance, data science, advanced analytics & reporting, cloud-based data management, and artificial intelligence/machine learning.
The ability to understand business objectives for data and engineer business solutions that leverage current and innovative data technology capabilities and provide valuable insights to relevant stakeholders.
The ability to set architectural direction and standards while applying analytical, engineering, hardware, and software design theories to develop/maintain organizational strategy.
Experience building application and/or data architecture roadmaps.
Experience designing, integrating, and managing complex applications & data solutions.
Experience working in a fast-paced, competitive information technology organization.
NIWC Specific Skills/Responsibilities:
Experience engineering ELT pipelines for data ingest and egress processes in a Databricks Lakehouse Architecture, with consistent performance and scalability using the established Data Framework
Experience with the Medallion Architecture, utilizing a Landing Zone and Bronze, Silver, and Gold delta tables
Expert experience with Spark, Python, and Databricks SQL in ELT pipeline development, including knowledge of Databricks and PySpark
Experience writing and tuning complex SQL queries used in ELT development
Experience architecting, implementing, and enhancing the ELT Framework, including Structured Streaming
Experience with AWS (preferably GovCloud) and data ingest/transformation/settings processes
Experience with AWS services such as Relational Database Service (RDS), Data Migration Service (DMS), CloudFormation, Terraform, S3 buckets, Lambda functions, SNS, and Event Viewer
Experience architecting, implementing, and supporting software and data technology in the cloud
Have worked on teams that practice Agile methodologies.
Experience working with Databricks Workspace Notebooks
Perform Data Engineering on Databricks Development and Production Environments.
Understanding of programming and data engineering concepts and best practices
Experience working with structured, semi-structured, and unstructured data, including data parsing, transformation, schema definition, and query/analysis
Create Standard Operating Procedures (SOPs) for onboarding new application solutions
Provide assistance and guidance on implementing current Data Management Solutions
Provide technical advice to Data Engineering and BI Modernization efforts
Provide management feedback from research and investigation into the latest trends in Data Management Systems
Troubleshoot, fix, tune, and enhance the end-to-end ELT pipeline processes
Ability to work both independently and collaboratively.
Experience using GIT, JIRA/ServiceNow, and Microsoft tools (365 Office, Excel, Teams).
Consulting with leadership, peers, teammates, and business partners to appreciate current and anticipate future software and data technology needs.
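The Medallion Architecture mentioned in the skills above layers data as Bronze (raw), Silver (cleaned), and Gold (business aggregates). A toy illustration of that layering in plain Python; a real Databricks pipeline would use PySpark and Delta tables, and every name here is hypothetical:

```python
# Toy Bronze -> Silver -> Gold layering sketch. Real pipelines use PySpark
# DataFrames and Delta tables; the layering logic shown is the same idea.

bronze = [  # raw ingested records, kept as-is (landing -> bronze)
    {"sensor": "a", "reading": "12.5", "ts": "2024-01-01"},
    {"sensor": "a", "reading": "bad",  "ts": "2024-01-01"},
    {"sensor": "b", "reading": "7.0",  "ts": "2024-01-01"},
]

def to_silver(rows):
    """Silver layer: typed and cleaned; invalid readings dropped."""
    out = []
    for r in rows:
        try:
            out.append({"sensor": r["sensor"], "reading": float(r["reading"])})
        except ValueError:
            continue  # a real pipeline would quarantine these rows
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (mean reading per sensor)."""
    totals = {}
    for r in rows:
        s, n = totals.get(r["sensor"], (0.0, 0))
        totals[r["sensor"]] = (s + r["reading"], n + 1)
    return {k: s / n for k, (s, n) in totals.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)
```

Keeping the raw Bronze copy untouched is the key design choice: Silver and Gold can always be rebuilt from it when cleaning rules or aggregates change.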
Security Clearance: Must be a U.S. Citizen with an active security clearance and the ability to obtain T5 eligibility
Certification: Must have at least one of the following: CCNA (Cisco Certified Network Associate Security), CySA+ (CompTIA Cybersecurity Analyst), GICSP (Global Industrial Cyber Security Professional), GSEC (GIAC Security Essentials Certification), CompTIA Security+ CE, CND (Certified Network Defender), or SSCP (Systems Security Certified Practitioner) or the ability to obtain one within 3 months of hire date.
Ishpi Information Technologies, Inc. is an Equal Opportunity Employer. All qualified candidates will be considered without regard to legally protected characteristics.
Pay Rate:
The annual base salary range for this position is $120,000 - $130,000. Please note that any salary information disclosed is a general guideline only. Ishpi considers factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience, education/training, and key skills, as well as market and business considerations, when extending an offer.
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.
$120k-130k yearly 60d+ ago
Data Architect(ETL, SQL exp)_Columbus OH
360 It Professionals 3.6
Columbus, OH
360 IT Professionals is a Software Development Company based in Fremont, California that offers complete technology services in Mobile development, Web development, Cloud computing and IT staffing. Merging Information Technology skills in all its services and operations, the company caters to its globally positioned clients by providing dynamic feasible IT solutions. 360 IT Professionals work along with its clients to deliver high-performance results, based exclusively on the one of a kind requirement.
Job Description
We are looking to fill a position for a Data Architect in Columbus, OH.
Qualifications
Develop data architecture models.
Develop business process / workflow models.
Create SQL queries to support application interfaces, data transfers (ETLs), and data extracts.
Troubleshoot data-related production problems.
Provide status of work and deliverable estimate to Team Lead.
Additional Information
In-person interviews are acceptable.
$91k-118k yearly est. 60d+ ago
Immediate Interview for Data Architect (SQL and ETL Exp)
360 It Professionals 3.6
Columbus, OH
We are 360 IT Professionals, staffing specialists working directly with US state, local, and commercial clients. We are known for our IT services, mobile development, web development, and cloud computing, and we work with clients to deliver high-performance results.
Job Description
• Develop data architecture models.
• Develop data flow models.
• Develop business process / workflow models.
• Design, develop, and implement database tables/views.
• Create SQL queries to support application interfaces, data transfers (ETLs), and data extracts.
• Verify accuracy and completeness of application data validation procedures.
Additional Information
Thanks & Regards
Preeti Joshi
510-254-3300 Ext 142
preeti@360itpro.com
$91k-118k yearly est. 60d+ ago
ETL Architect
E Pro Consulting 3.8
Columbus, OH
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Title : ETL Architect
Location : Columbus, OH
Type : Fulltime Permanent
Work Status : US Citizen / GC / EAD (GC)
Required Skills:
• Responsible for Architecture, Design and Implementation of Data Integration/ETL, Data Quality, Metadata Management and Data Migration solutions using Informatica tools
• Execute engagements as Data Integration-ETL Architect and define Solution Strategy, Architecture, Design and Implementation approach
• Expertise in implementing Data Integration-ETL solutions which include components such as ETL, Data Migration, Replication, Consolidation, Data Quality, Metadata Management etc. using Informatica products (e.g. Power Center, Power Exchange, IDQ, Metadata Manager)
• Responsible for Detailed ETL design, Data Mapping, Transformation Rules, Interfaces, Database schema, Scheduling, Performance Tuning, etc
• Lead a team of designers/developers and guide them throughout the implementation life cycle and perform Code review
• Engage client Architects, SMEs and other stakeholders throughout Architecture, Design and implementation lifecycle and recommend effective solutions
• Experience in multiple databases such as Oracle, DB2, SQL Server, Mainframe, etc.
• Experience in industry models such as IIW, IAA, ACORD, HL7, etc. and insurance products (e.g. Guidewire) will be a plus
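As a rough illustration of the "detailed ETL design, data mapping, transformation rules" deliverable above, here is a minimal sketch in plain Python. A real engagement would express this as an Informatica PowerCenter mapping rather than hand-written code, and every field name below is invented.

```python
# Source rows as they might arrive from an upstream extract.
source_rows = [
    {"CUST_NM": " alice ", "DOB": "1990-01-05", "BAL": "1200.50"},
    {"CUST_NM": "BOB",     "DOB": "1985-11-30", "BAL": "300"},
]

# Source-to-target mapping with a per-field transformation rule.
mapping = {
    "customer_name": ("CUST_NM", lambda v: v.strip().title()),
    "birth_date":    ("DOB",     lambda v: v),  # pass-through
    "balance_cents": ("BAL",     lambda v: int(round(float(v) * 100))),
}

def transform(row):
    """Apply every transformation rule to one source row."""
    return {tgt: rule(row[src]) for tgt, (src, rule) in mapping.items()}

target_rows = [transform(r) for r in source_rows]
print(target_rows[0])
# {'customer_name': 'Alice', 'birth_date': '1990-01-05', 'balance_cents': 120050}
```

Keeping the mapping as data (rather than inline code) mirrors how ETL tools separate the mapping spec from the execution engine.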
Additional Information
All your information will be kept confidential according to EEO guidelines.
$92k-122k yearly est. 60d+ ago
Principal Data Architect
Egen 4.2
Remote
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. If this describes you, we want you on our team.
Want to learn more about life at Egen? Check out these resources in addition to the job description.
Meet Egen | Life at Egen | Culture and Values at Egen | Career Development at Egen | Benefits at Egen
Responsibilities:
Lead the end-to-end architecture, design, and implementation of scalable Data Lakehouse solutions on Google Cloud Platform (GCP) using BigQuery, GCS, BigLake, and Dataplex
Collaborate directly with customers to understand business goals, data challenges, and technical requirements; translate them into robust architectural blueprints and actionable plans
Design and implement data pipelines supporting both real-time and batch ingestion using modern orchestration and streaming frameworks
Establish and enforce best practices for data cataloging, metadata management, lineage, and data quality across multiple systems
Define and implement data security, access control, and governance models in compliance with enterprise and regulatory standards
Serve as the technical lead for project teams - mentoring engineers, reviewing solutions, and ensuring architectural consistency across deliverables
Balance strategic architecture discussions with hands-on solutioning, POCs, and deep dives into data pipelines or performance tuning
Partner with stakeholders, cloud architects, and delivery leads to drive solution adoption, scalability, and long-term maintainability
Represent the company as a trusted technical advisor in client engagements - clearly articulating trade-offs, best practices, and recommendations
Qualifications:
8-10 years of progressive experience in Software Engineering and Data Platform development, with 5+ years architecting data platforms on GCP and/or Databricks
Proven hands-on experience designing and deploying Data Lakehouse platforms with data products and medallion architectures
Strong understanding of data ingestion patterns (real-time and batch), ETL/ELT pipeline design, and data orchestration using tools such as Airflow, Pub/Sub, or similar frameworks
Expertise in data modeling, storage optimization, partitioning, and performance tuning for large-scale analytical workloads
Experience implementing data governance, security, and cataloging solutions (Dataplex, Data Catalog, IAM, or equivalent)
Excellent communication and presentation skills - able to confidently engage with technical and non-technical stakeholders and guide clients through solution decisions
Demonstrated ability to lead by example in mixed teams of engineers, analysts, and architects, balancing architectural vision with hands-on delivery
Nice to have: Experience with Databricks (Delta Lake, Unity Catalog) and hybrid GCP-Databricks data architectures
Strong problem-solving mindset, curiosity to explore new technologies, and ability to “zoom out” for architecture discussions and “zoom in” for code-level troubleshooting
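The medallion architecture named in the qualifications can be pictured with a toy bronze/silver/gold flow. This sketch uses plain Python lists in place of BigQuery/BigLake or Delta tables, and all records are invented; it only shows the layering idea, not a production pipeline.

```python
# Bronze: raw ingested events, warts and all.
bronze = [
    {"user": "u1", "amount": "10.0"},
    {"user": "u1", "amount": "5.5"},
    {"user": None, "amount": "3.0"},   # bad record: missing key
    {"user": "u2", "amount": "7.25"},
]

# Silver: cleaned and typed; records failing quality rules are dropped.
silver = [
    {"user": r["user"], "amount": float(r["amount"])}
    for r in bronze
    if r["user"] is not None
]

# Gold: business-level aggregate, ready for analytics consumers.
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0.0) + r["amount"]

print(gold)  # {'u1': 15.5, 'u2': 7.25}
```

The key property is that each layer is derived from the one below it, so a bad rule can be fixed and the downstream layers rebuilt from retained raw data.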
Compensation & Benefits:
This role is eligible for our competitive salary and comprehensive benefits package to support your well-being:- Comprehensive Health Insurance- Paid Leave (Vacation/PTO)- Paid Holidays- Sick Leave- Parental Leave - Bereavement Leave- 401 (k) Employer Match- Employee Referral Bonuses
Check out our complete list of benefits here: ********************************
Important: All roles are subject to standard hiring verification practices, which may include background checks, employment verification, and other relevant checks.
EEO and Accommodations:
Egen is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Egen will also consider qualified applications with criminal histories, consistent with legal requirements. Egen welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
$82k-112k yearly est. Auto-Apply 54d ago
Need for Data Warehouse Architect @ Columbus, OH on W2
Xperttech 3.8
Columbus, OH jobs
Job Title: Project Lead Data Architect
Duration: 12+ Months
COMPLETE SKILL MATRIX:
1. Data Architecture
2. Leading teams
3. Data integration/Provisioning
4. DW architecture
5. Lead Architecture
6. PLUS - Big Data Architecture solutions
7. PLUS - Retirement Solutions Business Acumen
Description:
We are seeking a Project Lead Data Architect to fulfill a contract opportunity. He or she will execute an integrated data solution to support a strategic initiative involving enterprise data solutions. Responsibilities include providing architecture oversight and governance through formal and informal interactions with the project technical teams.
Additional Responsibilities:
1. Complete work products for project architect, transition work to technical leads, participate in design and implementation, capture technical debt.
2. The resource will be working across multiple teams creating and/or documenting architecture work products. This will include solution approach documents, solution architecture documents, architecture decisions, and application roadmaps.
3. To complete this activity will require the resource to participate in or lead sessions to understand the business requirements and facilitate architecture design sessions to communicate potential solutions.
4. Responsible for the architecture, design and implementation of an end-to-end solution for a project or a program, supporting a line of business or infrastructure domain.
5. Serves as expert in IT systems architecture.
6. Undertakes complex projects requiring additional specialized technical knowledge.
7. Makes well thought-out decisions on complex or ambiguous IT architecture issues.
8. Coordinates with users to determine requirements.
9. Ensures that system improvements are successfully implemented and monitored to increase efficiency.
10. Establishes and communicates common goal and direction for team.
11. Acts as a resource for direction, training, and guidance for less experienced staff.
12. Monitors projects schedules and costs.
Requirements:
1. 8 years of Data Architecture experience
2. 8 years of experience leading teams
3. 8 years of data integration/Provisioning experience
4. 8 years of experience in DW architecture
5. 8 years of Lead Architecture experience
Desired:
1. 3 years of Big Data Architecture solutions
2. Retirement Solutions Business Acumen
3. Influencing others
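The "DW architecture" requirement above typically implies dimensional (star-schema) design: a central fact table keyed to dimension tables. Below is a minimal, hypothetical sketch using SQLite; the table names and data are invented for illustration only.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Dimensions: descriptive attributes, one row per entity.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    -- Fact: measures keyed to the dimensions.
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key    INTEGER REFERENCES dim_date(date_key),
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Annuity'), (2, 'IRA');
    INSERT INTO dim_date VALUES (20240101, '2024-01-01');
    INSERT INTO fact_sales VALUES (1, 20240101, 100.0),
                                  (2, 20240101, 50.0),
                                  (1, 20240101, 25.0);
""")

# Typical analytical query: facts joined to a dimension, grouped by attribute.
totals = db.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(totals)  # [('Annuity', 125.0), ('IRA', 50.0)]
```

In a real warehouse the same shape holds at scale; the architect's work is choosing the grain of the fact table and the conformed dimensions shared across subject areas.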
Additional Information
$85k-122k yearly est. 3h ago
Data Architect or Data Modeller
Devcare Solutions 4.1
Columbus, OH jobs
As a support to the Data Science and Decision Analytics teams within the Enterprise Data & Analytics Organization, Data Architects will produce multi-purpose, pre-prepared modeling data structures. These structures will allow the Data Science team to construct lead-generation models in an expedited fashion, with confirmation that the data lineage and definitions are sound. Data Architects will also help facilitate the delivery of leads as output against these pre-prepared data structures. Additionally, they will support data strategy and requirements building on behalf of the Enterprise Data & Analytics function.
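The "pre-prepared modeling data structure" idea can be sketched as collapsing raw activity into one row per customer, so modelers start from ready-made features. The event types and feature names below are invented for illustration.

```python
# Raw activity events from operational systems (invented data).
events = [
    {"cust": "A", "type": "web_visit"},
    {"cust": "A", "type": "quote"},
    {"cust": "B", "type": "web_visit"},
    {"cust": "A", "type": "web_visit"},
]

# One row per customer, with pre-computed counts a lead-generation
# model can consume directly.
features = {}
for e in events:
    row = features.setdefault(e["cust"], {"web_visits": 0, "quotes": 0})
    if e["type"] == "web_visit":
        row["web_visits"] += 1
    elif e["type"] == "quote":
        row["quotes"] += 1

print(features)
# {'A': {'web_visits': 2, 'quotes': 1}, 'B': {'web_visits': 1, 'quotes': 0}}
```

Because the aggregation logic lives in one shared place, every model built on the table inherits the same lineage and definitions.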
$84k-121k yearly est. 60d+ ago
Principal Enterprise Data Architect
Smarsh 4.6
Remote
Who are we? Smarsh empowers its customers to manage risk and unleash intelligence in their digital communications. Our growing community of over 6500 organizations in regulated industries counts on Smarsh every day to help them spot compliance, legal or reputational risks in 80+ communication channels before those risks become regulatory fines or headlines. Relentless innovation has fueled our journey to consistent leadership recognition from analysts like Gartner and Forrester, and our sustained, aggressive growth has landed Smarsh in the annual Inc. 5000 list of fastest-growing American companies since 2008.
Summary
We are seeking a highly experienced enterprise data architect to own and drive our company's data architecture, data model, master data governance, and analytics platform. You will be responsible for defining the enterprise data model and entity relationships across our business systems (including COTS ERPs such as Salesforce and NetSuite, our proprietary telemetry/usage systems, and internal software) and for governing master data and enterprise metadata. You will lead the architecture and implementation of our data lake / data-warehouse environment (e.g., Snowflake) and be an expert in leveraging tools (including AI offerings such as Cortex or other advanced AI platforms). You will also own enterprise operational and financial reporting and analytics, leading a team to deliver high-quality insights and visualizations to drive business decisions. This is a strategic role in a fast-growing SaaS company (~$500M revenue) and you will be a key leader for data-driven growth.
How will you contribute?
Define, maintain and evolve the Enterprise Data Model and entity-relationship architecture across all business systems: CRM/ERP (Salesforce), ERP (NetSuite), Atlassian tools (Jira/Confluence), proprietary product telemetry/usage data, as well as other internal systems.
Own master data governance: establish policies, standards, ownership/responsibility, definitions, data quality metrics, and lifecycle management for master entities (e.g., Customer, Account, Product, Usage, Finance, Vendor, Employee) and establishing “golden data”.
Lead and manage the data lake / data warehouse architecture, with primary platform Snowflake and supporting data-ingestion / ETL/ELT pipelines, streaming usage data, telemetry, business system feeds, and operational/financial data.
Stay on the cutting-edge of AI, SQL, analytics: evaluate and adopt AI-SQL tools (e.g., Cortex or similar), support augmented analytics, facilitate self-service data/analytics for business users, and embed AI/workbench capabilities into the data platform.
Own and lead a team responsible for enterprise operational and financial reporting & analytics: define KPIs, build dashboards, develop visualization strategy (e.g., Power BI, Tableau, Looker), ensure data accuracy, timeliness and scalability.
Partner cross-functionally with business systems owners (Salesforce, NetSuite, product telemetry, Atlassian), IT, data engineering, finance, sales, operations and product leadership to ensure alignment of data strategy with business strategy.
Enforce data governance, metadata management, data lineage, usage tracking, data cataloging, and ensure compliance with applicable standards/regulations (e.g., GDPR, CCPA, SOX where applicable).
Define and monitor data quality, data security, access controls, and ensure platform scalability, performance and reliability.
Provide thought leadership on data architecture best practices, mentor senior data & analytics engineers, and promote a data-driven culture across the company.
Develop roadmaps for data architecture, analytics platform evolution, cost control, and performance optimization.
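The "golden data" bullet above can be illustrated with a toy field-level survivorship merge: duplicate records from several systems are consolidated, preferring the most trusted source per field. The source-priority order and record fields below are assumptions for the sketch, not Smarsh's actual rules.

```python
# Lower number = more trusted source for this toy example.
SOURCE_PRIORITY = {"netsuite": 0, "salesforce": 1, "telemetry": 2}

records = [
    {"source": "salesforce", "customer_id": "C1", "name": "Acme Inc",   "tier": None},
    {"source": "netsuite",   "customer_id": "C1", "name": "Acme, Inc.", "tier": "gold"},
]

def golden_record(recs):
    """Field-level survivorship: take each value from the highest-priority
    source that actually has a non-null value for it."""
    recs = sorted(recs, key=lambda r: SOURCE_PRIORITY[r["source"]])
    merged = {}
    for r in recs:
        for field, value in r.items():
            if field != "source" and value is not None:
                merged.setdefault(field, value)
    return merged

print(golden_record(records))
# {'customer_id': 'C1', 'name': 'Acme, Inc.', 'tier': 'gold'}
```

Real MDM platforms add match/merge rules, stewardship workflows, and lineage on top, but the survivorship decision per field is the same shape.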
What will you bring?
10-15+ years' experience in data architecture / enterprise data management roles (in mid-to-large organizations).
Proven experience working in a SaaS software company (ideally ~$200M-$1B revenue) or similar high-growth technology company.
Deep experience architecting enterprise data models and entity-relationship designs across multiple systems and domains (customer, product, usage/telemetry, finance, operations).
Hands-on and strategic experience with data lake / data warehouse platforms; experience with Snowflake is required.
Experience ingesting and integrating data from CRM systems (Salesforce), ERP systems (NetSuite), Atlassian tools (Jira/Confluence), and product telemetry / usage data systems.
Strong experience in master data management (MDM) and governance, metadata management, data quality frameworks, data cataloging and data lineage.
Expertise in analytics and visualization leadership: delivering enterprise reporting, dashboards, KPI frameworks, business partner engagement.
Experience evaluating and deploying AI/SQL tools, augmented analytics and working with data science/AI teams to operationalize insights (experience with Cortex or comparable tools is a plus).
Excellent leadership, communication and stakeholder management skills: able to translate business needs into data architecture and analytics solutions and lead a team.
A Bachelor's Degree in Computer Science, Information Systems, Data Science, or related field (Master's preferred) and relevant certifications (e.g., CDMP, TOGAF, or Snowflake certifications) are helpful.
Strong understanding of security, compliance and data governance in enterprise contexts.
About our culture
Smarsh hires lifelong learners with a passion for innovating with purpose, humility and humor. Collaboration is at the heart of everything we do. We work closely with the most popular communications platforms and the world's leading cloud infrastructure platforms. We use the latest in AI/ML technology to help our customers break new ground at scale. We are a global organization that values diversity, and we believe that providing opportunities for everyone to be their authentic self is key to our success. Smarsh leadership, culture, and commitment to developing our people have all garnered Comparably.com Best Places to Work Awards. Come join us and find out what the best work of your career looks like.
$108k-150k yearly est. Auto-Apply 43d ago
Workday Data Conversion Consultant
Erp Analysts 4.3
Remote
It's fun to work in a company where people truly BELIEVE in what they're doing!
We're committed to bringing passion and customer focus to the business.
Are you looking to join a dynamic company that truly values their employees, offers great benefits, and has a “people first” culture? At ERPA, we encourage our employees to be innovative and welcome new ideas. Empathy, responsibility, passion, and agility are the values that ERPA emulates in the workplace and seeks in our employees.
ERPA is a client-centered technology services firm, modernizing and maximizing customers' investment in Workday. Our team delivers holistic solutions for customers looking to maximize their Workday investment and elevate the user experience by offering ongoing application management services, Phase X and follow-on solutions, analytics, and overall optimization.
Position Summary:
ERPA is seeking a Sr. Workday Data Consultant who is passionate about helping Workday clients achieve, visualize, and quantify their software investment. You will help develop and lead a team of top-notch Workday Data Consultants with innovative ideas to disrupt the market. If you're interested in the cutting edge of Workday, we're interested in you!
The Senior Workday Data Consultant will work remotely and will be responsible for supporting data conversion and reporting for Phase X and post-production projects. This role will actively contribute to the development of ERPA's Workday AMS practice and should showcase innovation, strategic thinking and have the drive to make ERPA a Workday partner of choice.
Key Responsibilities:
Act as a lead consultant on multiple client engagements with limited direction
Lead, build, validate and conduct data conversion and strategy for third-party systems into Workday.
Design, architect, develop, and deploy functional business process, data conversion and reporting solutions.
Understand client business requirements and provide guidance throughout design, configuration and prototype, and assist clients with testing and move to production efforts
Partner with Engagement Managers to keep them informed of project status, changes, etc.
Collaborate with cross-functional counterparts to ensure clear lines of communication and project alignment
Accurately maintain forecast in a timely manner
Assist Sales team with orals, demos, and LOEs
Stay up to date on industry knowledge and Workday enhancements, and be able to advise on Workday best practices
Provide exceptional customer service to build lasting relationships with clients
Experience and Education Requirements:
Workday Partner HCM certification is required
Preferred Workday Partner certifications: Data Conversion and Reporting
2+ years of consulting experience leading, building, validating and conducting data conversion in Workday for multiple functional areas, such as HCM, Payroll, Financials, Time Tracking, etc.
2+ years of Workday reporting experience required, including experience gathering business requirements, leading design sessions, configuring, testing, and move to production activities
2+ years of experience troubleshooting and navigating Workday Security
Previous AMS/Post-Production Support experience preferred
Experience with ticketing systems such as Jira, ServiceNow, SalesForce, ZenDesk, etc. required
Must demonstrate a detailed understanding of Workday reporting processes and best practices.
Experience in Prism, and/or Peakon strongly preferred.
Must be proficient with SQL. Will also consider candidates who are proficient with SSMS, Postgres, and/or Access
Demonstrate strong, agile communication and presentation skills in order to adapt to various audiences
Advanced knowledge of Microsoft Office Suite, specifically Microsoft Excel and PowerPoint
Must demonstrate exceptional customer service skills
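A common SQL-adjacent task in data conversion work like this is reconciling source extracts against what actually landed in the target tenant. This toy sketch, with invented employee IDs, shows the idea using plain Python sets.

```python
# IDs exported from the legacy system vs. IDs confirmed loaded in the target.
source_ids = {"E001", "E002", "E003", "E004"}
loaded_ids = {"E001", "E002", "E004"}

missing = sorted(source_ids - loaded_ids)  # failed to convert
extra   = sorted(loaded_ids - source_ids)  # unexpectedly present

print(missing)  # ['E003']
print(extra)    # []
```

The same reconciliation is usually run per functional area (HCM, Payroll, etc.) after each conversion cycle, with the discrepancy lists driving remediation.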
Nothing in this job description restricts management's right to assign or reassign duties and responsibilities of this job.
Applicants are considered for all positions in accordance with statutes and regulations concerning non-discrimination on the basis of race, ancestry, age, color, religion, sex, national origin, sexual orientation, gender identity, non-disqualifying disability, veteran status, or other protected classification.
ERPA is an equal opportunity employer, as well as a substance and tobacco free workplace. All offers of employment are contingent on successfully passing the pre-employment drug screen and background investigation which may include reference checks, criminal background investigation, and when applicable licensing verification.
Applicants must be legally authorized to work in the United States on a full-time basis. We will not consider any applicants that require sponsorship for employment visa status either now or in the future.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
$76k-101k yearly est. Auto-Apply 60d+ ago
Hiring Experienced Data Analyst to work in USA
Cygnus Professionals 3.2
Ohio jobs
Headquartered in New Jersey (U.S.), Cygnus Professionals Inc. is a next-generation global information technology solution and consulting company powered by a strong management and leadership team with over 30 person-years of experience. We strive to extend our presence across industries and geographies with our industry-focused business excellence.
Cygnus's vision is to become a global leader in information technology and consulting by delivering excellence to its customers. We understand that we cannot achieve this without our people; hence, they are the most integral part of our organization. People at Cygnus are committed to helping their customers achieve their goals, and they exhibit a sense of ownership in each step while serving those customers. We at Cygnus possess a strong value system, which is the core of our organization. It helps us stay ahead of the evolution curve and retain quality across the value chain.
Job Description
We are inviting candidates to join our IT Solutions Division or to provide onsite services to our customers. We are looking to hire candidates in the following technologies:
· Java
· UI /UX Designer
· ETL/DWH
· Web Development
· Network Engineer
· SharePoint
· Project Management
· Business Analyst
· Data Analyst/Analytics/SAS/R/Data Scientist/Analytics
· IOS/Android
· SQL BI
· QA(Automation, Manual)
· Big Data Analytics
· Hadoop
· Salesforce Developer
· .Net
· Oracle DBA
· Tableau Report Developer
· Qlikview Report Developer
Qualifications
Bachelors and Masters
$64k-83k yearly est. 60d+ ago
Analytics Data science and IOT Lead
Hexaware Technologies 4.2
Remote
We are looking for a highly skilled Machine Learning Engineer to design, build, and deploy scalable machine learning solutions. This role requires strong software engineering skills, deep understanding of ML algorithms, and the ability to take models from experimentation to production. You will partner closely with data engineers, data scientists, and product teams to deliver impactful, data-driven outcomes.
Key Responsibilities: -
Model Development & Optimization
Build, train, and optimize ML and deep learning models for real-world applications.
Implement feature engineering, data preprocessing, and model refinement.
Conduct rigorous experimentation using modern ML frameworks.
Productization of ML Models
Deploy models to production using MLOps practices.
Develop scalable APIs or batch pipelines for model inference.
Implement model versioning, automated retraining, and monitoring.
Data Engineering Collaboration
Work closely with data engineering teams to develop robust data pipelines.
Ensure data quality, consistency, and availability for ML workloads.
Optimize data workflows for performance and scalability.
System Design & Architecture
Contribute to architecture of ML systems, including storage, orchestration, and serving layers.
Evaluate infrastructure needs, including GPUs, feature stores, and CI/CD.
Model Monitoring & Governance
Monitor models for drift, degradation, and bias.
Create dashboards, alerts, and automated evaluation procedures.
Ensure compliance with data privacy and responsible AI guidelines.
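The drift-monitoring responsibility above can be sketched as a simple comparison of live data against a training-time baseline. The threshold and values below are invented, and production systems would use richer statistics (PSI, KS tests, per-feature distributions); this only shows the shape of the check.

```python
import statistics

# Baseline captured when the model was trained (invented numbers).
TRAIN_MEAN, TRAIN_STDEV = 50.0, 10.0
DRIFT_Z_THRESHOLD = 3.0

def drifted(live_values):
    """True if the live mean sits more than N training stdevs from baseline."""
    z = abs(statistics.mean(live_values) - TRAIN_MEAN) / TRAIN_STDEV
    return z > DRIFT_Z_THRESHOLD

print(drifted([48, 52, 50, 49]))   # False: close to baseline
print(drifted([95, 102, 99, 97]))  # True: distribution has shifted
```

A monitoring job would run this per feature on a schedule and raise an alert (or trigger retraining) when the check fires.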
Cross-functional Communication
$78k-103k yearly est. Auto-Apply 14d ago
Big Data Lead
Hexaware Technologies 4.2
Remote
JD:
Primary: Azure, Databricks, ADF, Pyspark/Python
Secondary: Datawarehouse
Must Have:
• 12+ years of IT experience in Datawarehouse and ETL
• Hands-on data experience on Cloud Technologies on Azure, ADF, Synapse, Pyspark/Python
• Ability to understand Design, Source to target mapping (STTM) and create specifications documents
• Flexibility to operate from client office locations
• Able to mentor and guide junior resources, as needed
Nice to Have:
• Any relevant certifications
• Banking experience on RISK & Regulatory OR Commercial OR Credit Cards/Retail
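The Source-To-Target Mapping (STTM) artifact this JD mentions can be sketched as a small spec plus an interpreter. Real STTMs usually live in spreadsheets and drive PySpark/ADF pipelines; the column names and rules below are invented for illustration.

```python
# STTM spec: (source_column, target_column, transformation rule).
sttm = [
    ("cust_no", "customer_id",  "direct move"),
    ("open_dt", "account_open", "format YYYYMMDD -> YYYY-MM-DD"),
    ("bal_amt", "balance",      "cast string -> decimal"),
]

def apply_sttm(row):
    """Interpret each mapping rule against one source row."""
    out = {}
    for src, tgt, rule in sttm:
        v = row[src]
        if "YYYYMMDD" in rule:
            v = f"{v[:4]}-{v[4:6]}-{v[6:]}"
        elif "decimal" in rule:
            v = float(v)
        out[tgt] = v
    return out

print(apply_sttm({"cust_no": "42", "open_dt": "20240115", "bal_amt": "99.5"}))
# {'customer_id': '42', 'account_open': '2024-01-15', 'balance': 99.5}
```

Keeping the mapping declarative is what lets the same spec double as the specifications document the JD asks candidates to produce.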