Data warehousing specialist jobs near me - 107 jobs
Entry-Level Python Data Warehousing Specialist
Arsenault
Remote data warehousing specialist job
Arsenault is looking for a motivated entry-level Python data warehousing specialist to join our growing development team. This position is ideal for someone with some programming experience and excellent communication skills who is interested in gaining experience with software development and data warehousing. This is currently a work-from-home position.
As a member of the quality assurance team, you will maintain rules written in Python which map business data to our production database using an in-house data warehousing tool. Other responsibilities include working with the database, archiving raw data, and pushing and pulling data from data stores.
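For illustration only: the posting includes no code, but a mapping rule of the kind described might look roughly like the sketch below, assuming a raw CSV export staged in Amazon S3. The bucket, key, and column names are hypothetical and do not come from the posting.

```python
# Hypothetical sketch of a Python mapping rule: pull a raw CSV from S3 and
# map vendor column names onto an internal schema. All names are illustrative.
import csv
import io

import boto3

# Hypothetical mapping from a vendor's column names to internal field names.
COLUMN_MAP = {"SKU Number": "sku", "Item Name": "product_name", "List Price": "price_usd"}

def load_raw_rows(bucket: str, key: str) -> list[dict]:
    """Download a raw CSV export from S3 and return it as a list of dicts."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))

def map_row(raw: dict) -> dict:
    """Apply the column mapping and light cleanup before loading downstream."""
    mapped = {target: raw.get(source, "").strip() for source, target in COLUMN_MAP.items()}
    mapped["price_usd"] = float(mapped["price_usd"] or 0)
    return mapped

if __name__ == "__main__":
    rows = [map_row(r) for r in load_raw_rows("example-raw-bucket", "exports/catalog.csv")]
    print(f"Mapped {len(rows)} rows")
```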
Requirements:
1 year of experience in software development, preferably with Python
Excellent written and spoken English
Familiarity with Excel
Familiarity with FTP/SFTP
Familiarity with Amazon S3
Nice to have:
Web scraping experience
Experience with Django or Flask
About Us
Arsenault Inc. is a digital eCommerce furniture retailer specializing in interior design and home decor. We work with more than 200 brands and offer over 400,000 items to a national customer base, but we are a small business with a dedicated team, and each employee has a direct impact on the success of the company.
$70k-95k yearly est. 60d+ ago
Full-Stack Developer, Workflow and Data Visualization
Nvidia 4.9
Remote data warehousing specialist job
NVIDIA's Silicon Solutions Group, Efficiency (SSGE) team is seeking a full-stack developer to help develop and maintain automated workflows, build robust ETL pipelines, and create dynamic, interactive dashboards that empower better business decisions. You will play a key role in transforming raw datasets into clean, reliable information and visual experiences that drive insight and action across the organization. You will partner closely with other teams to define intuitive, robust solutions using modern frameworks and technologies. At NVIDIA, we strive for perfection, encourage innovation, and provide opportunities to explore new ways to succeed!
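The posting does not spell out the pipeline work in code; the following is a generic, minimal extract-clean-load sketch in Python with pandas and SQLite, included only to illustrate the kind of ETL step described. The file name, columns, and target table are assumptions, not NVIDIA's systems.

```python
# Minimal, generic ETL sketch: extract a raw CSV, clean it, and load it into a
# local SQLite table that a dashboard could query. All names are hypothetical.
import sqlite3

import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def clean(df: pd.DataFrame) -> pd.DataFrame:
    # Drop exact duplicates, normalize column names, and fill missing values.
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df.fillna({"status": "unknown"})

def load(df: pd.DataFrame, db_path: str, table: str) -> None:
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load(clean(extract("raw_metrics.csv")), "analytics.db", "daily_metrics")
```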
What you'll be doing:
Compose, implement, and optimize ETL pipelines to collect, clean, and prepare data from different sources.
Build end-to-end web applications that visualize data and streamline workflows.
Design and develop APIs and backend services that integrate data with front-end dashboards and tools, ensuring seamless integration and functionality.
Collaborate with internal teams and stakeholders to understand their workflows and methodologies and transform them into technical solutions.
Monitor, debug, and maintain data infrastructure for accuracy, speed, and reliability.
Ensure scalability, performance, and security across applications and data pipelines.
What we need to see:
MS or equivalent experience, 5+ years of full-stack development experience, with proven skill in both front-end and back-end frameworks (e.g., React, Vue, Node.js, Django, or Flask).
Experience building and maintaining ETL pipelines using tools such as Airflow, Prefect, Dagster, or custom Python-based solutions.
Proven skills in Python, SQL, and working with relational or NoSQL databases.
Familiarity with data visualization libraries (e.g., Power BI, Grafana).
Understanding of data modeling, API design, and workflow automation.
Ability to translate complex technical concepts into intuitive, user-friendly solutions.
Excellent problem-solving skills, attention to detail, and collaborative mindset.
Ways to stand out from the crowd:
Experience working with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes).
Experience implementing real-time or near-real-time data streaming solutions (Kafka, Kinesis, etc.).
Familiarity with version control (Git), CI/CD pipelines, and agile development practices.
Familiarity with UI/UX design principles and a passion for clean, intuitive interfaces.
Background using AI tools (e.g., Cursor, Copilot) in automation development.
NVIDIA is widely considered to be one of the world's most desirable employers in the technology field. We have some of the most forward-thinking and hardworking people in the world working for us. If you're creative and autonomous, we want to hear from you!
#LI-Hybrid
Your base salary will be determined based on your location, experience, and the pay of employees in similar positions. The base salary range is 152,000 USD - 218,500 USD for Level 3, and 184,000 USD - 287,500 USD for Level 4.
You will also be eligible for equity and benefits.
Applications for this job will be accepted at least until January 17, 2026.
This posting is for an existing vacancy.
NVIDIA uses AI tools in its recruiting processes.
NVIDIA is committed to fostering a diverse work environment and proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law.
$109k-137k yearly est. Auto-Apply 10d ago
Data Architect- REMOTE
Catasys Health 4.1
Remote data warehousing specialist job
Catasys is making a positive impact on people's lives every day. We use predictive analytics to identify health plan members with unaddressed behavioral health conditions that worsen chronic disease, then engage, support and guide these members to better health with a personalized, human-centered approach. This has led us to where we are today: growing fast and saving lives as we do.
To support our explosive growth, we're looking for compassionate, hard-working people-lovers to join our team. If innovating in the field of patient care is something you're passionate about, we encourage you to join our mission to improve the health and save the lives of as many people as possible.
Impact lives in so many ways
You'll be an integral part in supporting people coping with their unique life challenges. Every member of the Catasys team contributes to accomplishing our goals and upholding our people-centric values.
The new face of mental health
Our model is research-based, and we are invested in staying on the leading edge of treatment. You'll help us break down barriers and stigmas associated with mental health.
Career options
With our ongoing strong growth and evolution, we are looking for people who want to do their best at work. Join our team and take your career to the next level with Catasys. We are committed to promoting from within.
Excellent compensation
Job Description
In this key role, you will be a data architect defining and growing the data infrastructure and data supported by applications used by Catasys colleagues who work daily to improve and save the lives of those suffering from the medical consequences of untreated behavioral health conditions.
You will work in a highly autonomous and low supervision environment with a geographically distributed team creating and delivering successful applications from whiteboard to market scale. You will live in a highly collaborative, delivery-focused team environment and will be equally at home designing an MDM strategy, troubleshooting DB performance, building out a data strategy or helping a developer implement a new document collection.
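Purely as an illustration of the "new document collection" part of that description (and not of Catasys' actual systems), a sketch like the following shows the flavor of the work using pymongo; the database, collection, and field names are all invented.

```python
# Hypothetical sketch: stand up a new document collection with a unique index
# so duplicate member records cannot be inserted. All names are illustrative.
from pymongo import ASCENDING, MongoClient
from pymongo.errors import DuplicateKeyError

client = MongoClient("mongodb://localhost:27017")
members = client["engagement_demo"]["member_profiles"]

# A unique index on member_id acts as a simple guard for master-data hygiene.
members.create_index([("member_id", ASCENDING)], unique=True)

try:
    members.insert_one({"member_id": "M-1001", "program": "behavioral_health", "active": True})
except DuplicateKeyError:
    print("member_id already exists; record skipped")
```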
You will display the ability to be a critical thinker and tackle problems by first evaluating the problem and then thinking of several potential solutions. You will be responsible for all phases of new data solution implementation, displaying the ability to lead in the delivery of the solutions.
To you, balancing security and accessibility, system design and architecture, reliability engineering and fault diagnosis are not esoteric terms, but rather fuel for the obsession that drives your daily war against mediocrity. Basically, your qualification for this job is proven experience operating in a high-performing, fault-intolerant environment. Your objective is to anticipate the needs of the organization and work to ensure that the data architecture provides value for the entire organization.
Qualifications
Bachelor's degree in Computer Science or "STEM" majors (science, technology, engineering, or math)
5 or more years of data architecture experience
Excellent communication both written and verbal
Experience with relational and NoSQL data structures
Experience with Data Lakes and technologies
Experience in deploying a Master Data Management (MDM) solution
Familiar with Python scripting language
Experience with data warehouse implementations
Data visualization experience
Additional Information
All your information will be kept confidential according to EEO guidelines.
$110k-154k yearly est. 2d ago
Data Architect
Decisiveinstincts
Remote data warehousing specialist job
DecisiveInstincts is hiring a Data Architect (SME). This role is remote, but candidates must be within 50 miles of Clarksburg, WV. Secret Clearance is required.
Responsibilities:
Participates in planning, definition, and high-level design of data solutions, exploring alternatives and evaluating new technologies.
Develops and maintains scalable cloud-based data infrastructure, ensuring alignment with the organization's decentralized data management strategy.
Designs and implements ETL pipelines using AWS services (e.g., S3, Redshift, Glue, Lake Formation, Lambda) to support data domain requirements and self-service analytics (a brief sketch follows this list).
Collaborates with data domain teams to design and deploy domain-specific data products, adhering to organizational standards for schema design, data transformations, and storage solutions.
Establishes and enforces data governance practices, including compliance with data privacy, access controls, and lineage requirements, across the organization's data assets.
Maintains a comprehensive data catalog and governance tools, automating workflows to uphold data integrity and ensure discoverability across all domains.
Leads the implementation and ongoing support of data mesh architecture, ensuring seamless integration and data flow across multiple domains.
Ensures the availability and reliability of the organization's Data Mesh software, managing operations and maintenance of the framework.
Provides guidance and leadership on the cultural adoption of data mesh principles, including organizational training initiatives for data stewards and users.
Communicates effectively with stakeholders to align technical data architecture with business goals, ensuring long-term support for analytics, insights, and innovation.
Experience working in an Agile organization using Scrum, Kanban, Jira, Confluence, and SAFe.
Provide team specific training as needed.
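As a rough illustration of the Glue-based pipeline responsibilities above (only the triggering and monitoring of a job, not the pipeline itself), the sketch below starts an existing AWS Glue job from Python with boto3 and polls its status. The job name, argument, and region are placeholders, not resources from this program.

```python
# Hypothetical sketch: kick off an existing AWS Glue ETL job and poll until it
# finishes. The job name, argument, and region are placeholders only.
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is a placeholder

run = glue.start_job_run(
    JobName="example-domain-ingest",                   # assumed job defined elsewhere
    Arguments={"--target_database": "analytics_demo"},
)
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="example-domain-ingest", RunId=run_id)["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```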
Requirements:
Secret Clearance
Minimum of 10 years' experience recommended. In absence of years of experience, certifications or past work may be used to show the level of experience needed to perform at this level.
Any of these certifications are preferred - AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect - Professional, DAMA Certified Data Management Professional (CDMP), Google Professional Data Engineer (cross-cloud expertise), Informatica Data Governance Specialist (or equivalent tool-based cert), SAFe Agilist (SA), Certified Information Privacy Professional (CIPP/US) (for data privacy governance)
Minimum of 8 years' experience designing and implementing data architectures, including data lakes, warehouses, and data mesh frameworks in cloud environments.
At least 5 years' experience working with AWS data services (S3, Redshift, Glue, Lake Formation, Lambda) for scalable, cloud-native data solutions.
Proven leadership in data governance, including privacy, access control, and metadata management across decentralized data domains.
Experience managing and maintaining data catalogs and automating governance workflows using specialized tools.
Ability to lead organizational adoption of data mesh principles through training initiatives, stakeholder engagement, and cultural transformation.
Strong collaboration skills to align technical data architecture with business objectives and ensure interoperability across systems.
Experience overseeing the operations and maintenance of enterprise Data Mesh software, ensuring reliability and scalability for analytics and innovation
ActioNet is a CMMI-DEV Level 4, CMMI-SVC Level 4, ISO 20000, ISO 27001, ISO 9001, HDI-certified, woman-owned IT Solutions Provider with strong qualifications and expertise in Agile Software Engineering, Cloud Solutions, Cyber Security and IT Managed Services. With 26+ years of stellar past performance, ActioNet is the premier Trusted Innogrator!
Core Capabilities:
Advanced and Managed IT Services
Agile Software Development
DevSecOps
Cybersecurity
Health IT
C4ISR & SIGINT
Data Center Engineering & Operations
Engineering & Installation
Why ActioNet?
At ActioNet, our Passion for Quality is at the heart of everything we do:
Commitment to Employees: We are committed to making ActioNet a great place to work and continue to invest in our ActioNeters.
Commitment to Customers: We are committed to our customers by driving and sustaining Service Delivery Excellence.
Commitment to Community: We are committed to giving back to our community, helping others, and making the world a better place for our next generation.
ActioNet is proud to be named a Top Workplace for the twelfth year in a row (2014 - 2025). We have a 98% customer retention rate. We are passionate about the inspirational missions of our customers, and we entrust our employees and teams to deliver exceptional performance to enable the safety, security, health, and well-being of our nation.
What's in It For You?
As an ActioNeter, you get to be part of an exceptional team and a corporate culture that nurtures mutual success for our customers, employees, and communities. We give you the tools to be successful; all you need to do is bring your best ideas, your energy, and a desire to develop your skills, experience, and career. Are you ready to make a difference?
ActioNet is an equal-opportunity employer and values inclusion at our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Full-Time Employees are eligible to participate in ActioNet's Benefits Program:
Medical Insurance
Vision Insurance
Dental Insurance
Life and AD&D Insurance
401(k) Savings Plan
Education and Professional Training
Flexible Spending Accounts (FSA)
Employee Referral and Merit Recognition Programs
Employee Assistance and Identity Theft Protection
Paid Holidays: 11 per year
Paid Time Off (PTO)
Disability Insurance
********Direct Applicants, only. No Agencies, No third-party recruiters, please********
$97k-132k yearly est. Auto-Apply 60d+ ago
Data Architect - Azure, Databricks
Ness Digital Engineering
Remote data warehousing specialist job
at Ness Canada, Inc.
Job Description: Data Architect (Azure & Databricks Technology Stack)
We are seeking a highly skilled Data Architect with deep expertise in Azure, Databricks, SQL, and Data Modeling. The ideal candidate will have extensive experience in both traditional data warehouse architectures and modern data platform paradigms (Data Lake, Lakehouse, and Azure Synapse). This role requires a proven track record in designing and implementing enterprise-scale data solutions, integrating multiple data sources, and enabling analytics at scale.
Key Responsibilities
Architecture & Design
Lead the architecture, design, and implementation of enterprise data warehouses, data lakes, and lakehouses on Azure and Databricks.
Define and enforce data modeling standards, best practices, and guidelines across transactional and analytical workloads.
Architect end-to-end modern data platforms leveraging Azure Synapse, Databricks, Delta Lake, and related Azure services (ADF, ADLS, Purview, etc.).
Solution Delivery
Design scalable ETL/ELT pipelines and orchestrate workflows for ingestion, transformation, and consumption.
Partner with business stakeholders, product teams, and data engineers to deliver high-quality, business-driven data solutions.
Ensure solutions are secure, performant, and compliant with enterprise governance and regulatory standards.
Strategy & Leadership
Define data strategy and reference architecture for modernization from legacy/traditional data platforms (Teradata, Oracle, SQL Server, etc.) to modern architectures.
Provide technical leadership and mentoring to engineering teams.
Collaborate with cloud, analytics, and business teams to align data architecture with organizational goals.
Required Skills & Experience
Core Expertise
Proven experience as a Data Architect or Senior Data Engineer/Lead designing and implementing enterprise data warehouses and modern data platforms.
Hands-on expertise with Azure services: Azure Synapse Analytics (Dedicated SQL Pools, Serverless Pools), Azure Data Factory (ADF), Azure Data Lake Storage (ADLS Gen2), Azure Purview.
Strong expertise with Databricks: Delta Lake, Lakehouse, Spark (PySpark/Scala), MLflow.
Very strong SQL development and optimization skills.
Strong data modeling (3NF, Dimensional, Data Vault, Canonical Models, etc.).
Complementary Skills
Knowledge of traditional data platforms (Oracle, Teradata, SQL Server, Informatica, etc.) along with modern ELT/ETL frameworks.
Solid understanding of data governance, metadata management, data quality, and security.
Experience in building real-time/streaming data pipelines (Kafka, Event Hub, etc.) is a plus.
Soft Skills
Excellent communication and ability to engage with business SMEs and senior stakeholders.
Strong analytical and problem-solving skills with an enterprise-scale mindset.
Experience in mentoring and guiding teams on modern data practices.
Qualifications
Bachelor's/Master's degree in Computer Science, Information Systems, or related field.
15+ years of experience in Data Engineering/Architecture, with at least 5+ years in Azure & Databricks ecosystem.
Proven track record in enterprise-scale data warehouse and lakehouse implementations.
Certifications in Azure Data Engineer/Architect or Databricks preferred.
$97k-132k yearly est. Auto-Apply 60d+ ago
Principal Data Architect
Tidal Wave Auto Spa
Remote data warehousing specialist job
Tidal Wave Auto Spa is one of the fastest growing car wash chains in the country and is a recognized leader in the industry with locations nationwide. Our wave of success began in 2004 in the small town of Thomaston, GA, which is where Tidal Wave Headquarters calls home. Tidal Wave Auto Spa is a national brand that is forecasted to grow at a rapid rate for years to come, so we are aggressively pursuing individuals with exceptional talent and leadership qualities. Our goal is to redefine the car wash industry with the latest technology, top-notch friendly service, and unwavering dedication to our employees!
POSITION SUMMARY
Tidal Wave Auto Spa is seeking a Principal Data Architect who will design, own, and evolve the centralized data architecture that powers the entire company's intelligence, reporting, operational visibility, and future technology capabilities.
This role is the keystone technical position within the data organization.
You will architect and build an enterprise-grade, centralized data ecosystem used by Operations, Marketing, Finance, Executive Leadership, and our Data Partner team. Your work will enable scalable growth, data accuracy, real-time visibility, multi-entity modeling, and high-performance analytics across thousands of employees and millions of transactions.
This position blends:
Data Architecture
Software Engineering
Data Warehousing
Backend Engineering
Systems Integration
Performance Optimization
Data Governance
You must be able to build end-to-end systems - from raw ingestion to dimensional modeling, from centralized data architecture to real-time analytics, from schema design to repeatable validation frameworks.
This role sits at the foundation of data reliability, speed, and trust across the enterprise.
Every insight and every operational decision relies on the systems you build.
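To make the "raw ingestion to dimensional modeling" and "repeatable validation frameworks" phrasing above concrete, here is a deliberately tiny illustrative sketch in Python with SQLite: a one-dimension star schema plus a row-count reconciliation check. The table and column names are invented and do not describe Tidal Wave's actual warehouse.

```python
# Tiny illustrative sketch: a minimal star schema (one fact, one dimension) in
# SQLite plus a reconciliation check comparing staged rows to loaded fact rows.
# All table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_washes (wash_id INTEGER, site_code TEXT, amount REAL);
    CREATE TABLE dim_site (site_key INTEGER PRIMARY KEY, site_code TEXT UNIQUE);
    CREATE TABLE fact_wash (wash_id INTEGER, site_key INTEGER, amount REAL);
""")
conn.executemany("INSERT INTO stg_washes VALUES (?, ?, ?)",
                 [(1, "GA-01", 25.0), (2, "GA-01", 30.0), (3, "TX-07", 20.0)])

# Populate the dimension, then load the fact table by resolving surrogate keys.
conn.execute("INSERT INTO dim_site (site_code) SELECT DISTINCT site_code FROM stg_washes")
conn.execute("""
    INSERT INTO fact_wash
    SELECT s.wash_id, d.site_key, s.amount
    FROM stg_washes s JOIN dim_site d ON d.site_code = s.site_code
""")

# Repeatable validation: staged and loaded row counts must reconcile exactly.
staged = conn.execute("SELECT COUNT(*) FROM stg_washes").fetchone()[0]
loaded = conn.execute("SELECT COUNT(*) FROM fact_wash").fetchone()[0]
assert staged == loaded, f"reconciliation failed: {staged} staged vs {loaded} loaded"
print("reconciliation passed:", loaded, "rows")
```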
WHAT YOU WILL OWN
1. Centralized Enterprise Data Architecture (End-to-End Ownership)
Architect, build, and maintain the company's centralized data warehouse used for all analytics.
Implement dimensional and semantic models based on established architectural and modeling frameworks.
Build secure, multi-entity data structures to support segmentation and controlled access
Own schema design, data lineage, metadata strategy, and governance standards
Establish modeling, naming, and data quality practices for the entire organization
2. Collaboration With Data Partners (KPI Logic & Requirements)
Partner with Data Partners to define KPI logic, business rules, and metric consistency
Translate business requirements into scalable technical architecture
Validate that models support analytical needs across Operations, Finance, Marketing, and Executive leadership
Ensure the architecture anticipates and supports future reporting, analytics needs, and software solutions
Your work enables the Data Partners to transform data into insights with speed, accuracy, and confidence.
3. Data Engineering Leadership
Lead the Data Engineer(s), providing technical direction, standards, and mentorship
Own ETL/ELT workflows, orchestration patterns, error-handling, and monitoring
Build scalable ingestion frameworks for high-volume, high-frequency data
Develop high-performance transformation logic optimized for speed and reliability
4. Integration & Source System Engineering
Integrate systems via APIs, webhooks, streaming events, and secure batch processes
Create resilient, auditable ingestion frameworks with fault tolerance
Reconcile and validate data across all systems with automated checks
Handle structured, semi-structured, and unstructured data with ease
5. Software & Engineering Craftsmanship
You must be fluent and deeply experienced in:
C#/.NET, Python, and similar languages
Expert SQL (tuning, indexing, partitioning, caching, compression)
API design and consumption
Cloud data warehousing (Snowflake strongly preferred)
DevOps, CI/CD pipelines, Git, and deployment automation
Real-time and near-real-time data delivery patterns
Your code must be optimized, elegant, maintainable, and fast - capable of handling complex queries in under a second.
6. Reporting Foundation & Semantic Modeling
Build and maintain the enterprise semantic layer used by BI developers and Data Partners.
Build models that balance performance, clarity, and business usability
Support both on-demand and real-time reporting scenarios
Ensure data structures scale as the business grows
7. Data Governance & Validation Engineering
Build and enforce enterprise data standards and documentation
Define rules, logic, and source-of-truth structures for all major KPIs
Implement automated data validation and reconciliation tools
Establish continuous monitoring and anomaly detection systems
8. High-Performance System Optimization
Deliver sub-second query performance on highly complex datasets
Select optimal storage strategies, compute sizing, and cost-efficient warehouse design
Continuously refine models, transforms, and query patterns
9. Strategic System Architecture for Future Growth
Design architectures that scale with rapid expansion
Build advanced frameworks such as householding/entity resolution
Support fraud detection, customer segmentation, and operational optimization logic
Enable write-back frameworks where curated data feeds operational systems
Develop systems capable of powering advanced analytics and machine learning
WHAT GREAT LOOKS LIKE
You are someone who:
Has built centralized data architectures for high-growth companies
Knows data theory deeply, but prioritizes real-world performance
Writes code and SQL that is fast, clean, and optimized to near perfection
Can architect large-scale systems with clarity and elegance
Has an entrepreneurial mindset and treats architecture like a product
Thrives in extreme ownership - everything that touches data quality, performance, or design is your responsibility
Operates with precision, discipline, and craftsmanship
Is energized by complexity and can see the entire data ecosystem end-to-end
Enjoys collaborating with cross-functional teams and elevating analysts
Is rigid in holding the organization accountable to our data governance standards
KEY RESPONSIBILITIES
Architecture & Engineering
Architect centralized, scalable data systems
Build high-performance dimensional models
Create secure, multi-entity data frameworks
Collaboration
Partner with Data Partners on KPI definitions, business rules, and modeling needs
Work with engineering teams for integrations and ingestion patterns
Ensure analysts and downstream systems receive fast, accurate, consistent data
Leadership
Mentor and lead the Data Engineer(s)
Influence BI, analytics, and business leaders
Serve as the ultimate authority on data quality and architecture
Governance
Define and enforce data standards
Build automation that ensures accuracy, validity, and consistency
Own documentation, lineage, and architectural direction
Innovation
Architect systems for real-time analytics and operational intelligence
Enable high-value analytics use cases (fraud detection, segmentation, forecasting)
Create reusable frameworks that accelerate analytics company-wide
IDEAL EXPERIENCE & SKILLS
Technical Mastery
10+ years in enterprise data architecture, engineering, or full-stack software development
Deep expertise in C#/.NET, Python, SQL, APIs, and cloud data warehousing
Proven success architecting centralized data systems
Experience designing multi-entity or multi-tenant architecture
Strong background in modeling, ingestion, orchestration, and optimization
Architectural Depth
Understanding of distributed systems and scalable warehousing
Experience selecting storage strategies, compute sizing, and indexing techniques
Ability to evaluate and improve performance at every layer of the stack
Personal Attributes
Extreme ownership - you hold yourself accountable for the entire data ecosystem
Extremely confident in your work and an excellent team member
Competitive nature that drives you to create “the best” engineering solutions
High standards of accuracy and engineering craftsmanship
Curious, innovative, and excited to build from scratch
Thrives in fast-paced environments with complex challenges
REQUIREMENTS
Can pass a drug test and criminal background check.
Are legally eligible to work in the United States.
As a Tidal Wave Auto Spa Team Member, you will enjoy our Benefits Program to help secure your financial future and preserve your health and well-being, including:
PTO is based on the company's PTO policy.
Eligibility for health, dental, and vision coverage subject to 30 day waiting period.
Eligibility for 401(K), subject to plan terms.
Eligibility for benefits such as life insurance, short- and long-term disability, hospital indemnity, critical illness, and accident insurance, subject to a 30-day waiting period.
Company-paid holidays.
**Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.
The equal employment opportunity policy of Tidal Wave Auto Spa provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. Tidal Wave Auto Spa hires and promotes individuals solely on the basis of their qualifications for the job to be filled.
$97k-132k yearly est. Auto-Apply 8d ago
SME Data Warehouse Specialist/Reporting Specialist -- Databricks
Teracore 4.2
Remote data warehousing specialist job
US Citizenship Required / Background Investigation required to attain favorable entry on duty (EOD)
Currently remote opportunity.
Teracore is a Service Disabled Veteran Owned Small Business (SDVOSB) classified management consulting and information technology services firm. We are committed to creating and maintaining a corporate environment and culture that promotes long-term employment. Diverse talents help us to achieve the missions and objectives of our customers. We hope we can partner together to achieve those goals.
Project Background:
Provide the client with enterprise Business Intelligence reporting. We utilize data from the Oracle system combined with data from other applications (for example, Travel Management / Direct Access for employee information) and files / crosswalks to create dashboards, reports, and data extracts.
Position Description:
This role will involve use of the Databricks enterprise Business Intelligence reporting tool. This will include hands-on functional development of dashboards, visualizations, reports, and extracts within the Databricks tool, encompassing the complete Report Development Life Cycle (RDLC) of requirements, design, development, testing, implementation, and operations & maintenance.
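The posting includes no sample code; purely for flavor, a report extract inside a Databricks notebook might look something like the sketch below. The table, columns, and output path are placeholders, not the client's actual data.

```python
# Hypothetical sketch of a Databricks report extract: aggregate a curated table
# with Spark SQL and write the result out for downstream reporting.
# The table name, columns, and output path are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

summary = spark.sql("""
    SELECT fiscal_year, fund_code, SUM(obligated_amount) AS total_obligated
    FROM   finance_demo.obligations
    GROUP  BY fiscal_year, fund_code
    ORDER  BY fiscal_year, fund_code
""")

summary.show(10)                                   # quick visual check
summary.write.mode("overwrite").csv("/tmp/obligation_summary_extract", header=True)
```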
Tasks/Responsibilities:
Create dashboards, visualizations, reports, and data extracts using Databricks
Report Development Life Cycle (RDLC)
Working individually and as part of the team with the client staff to gather and finalize requirements
Working individually and as part of the team on the design of dashboards, visualizations, reports, and data extracts
Working individually and as part of the team on the development of dashboards, visualizations, reports, and data extracts
Working individually and as part of the team on the testing and validation of dashboards, visualizations, reports, and data extracts
Working individually and as part of the team on the operations & maintenance of dashboards, visualizations, reports, and data extracts
Create Functional Design Documents (FDDs)
Create Technical Design Documents (TDDs)
Lead and assist in providing training and demonstrations
Support user base
Review / Update documentation
Internal team cross-review
Status tracking
Required Skills:
US Citizenship Required / Background Investigation required to attain favorable entry on duty (EOD)
BS Degree
2-5 years of experience (within the last 7 years) working with Federal Financials within a Federal Agency
Databricks experience (must have)
Experience with big data
Excellent writing skills - ability to write professional quality documents
Excellent oral communications skills and the ability to make a positive contribution in meetings
Desired Skills:
Databricks coding - including Artificial Intelligence
Experience performing reporting tasks on a full-lifecycle Federal reporting project of similar size and scope
At Teracore, we support, depend and thrive on differences for the benefit of our associates and customers. Teracore is an equal opportunity employer. Employment decisions are based solely on a person's merit and professional qualifications directly related to job competence.
$73k-95k yearly est. Auto-Apply 7d ago
Data Migration Specialist
Intralinks 4.7
Remote data warehousing specialist job
As a leading financial services and healthcare technology company based on revenue, SS&C is headquartered in Windsor, Connecticut, and has 27,000+ employees in 35 countries. Some 20,000 financial services and healthcare organizations, from the world's largest companies to small and mid-market firms, rely on SS&C for expertise, scale, and technology.
Job Description
Data Migration Specialist
Locations: Remote
Get To Know Us:
The Intralinks Alts Services team is the strategic growth lever for the company. By enabling Intralinks customers, both existing and new, to upgrade to the latest Intralinks products, you will be the tip of the spear for the company's growth in 2026 and beyond. In this role you will be responsible for leading, directing, and providing delivery of Intralinks data projects from a variety of sources. You will act as the primary point of contact in dealing with customer historical data. You will help retrieve their historical data, transform it, and help review it with them prior to their transition into the Intralinks ecosystem.
Why You Will Love It Here!
Flexibility: Hybrid Work Model and Business Casual Dress Code, including jeans
Your Future: 401k Matching Program, Professional Development Reimbursement
Work/Life Balance: Flexible Personal/Vacation Time Off, Sick Leave, Paid Holidays
Your Wellbeing: Medical, Dental, Vision, Employee Assistance Program, Parental Leave
Wide Ranging Perspectives: Committed to Celebrating the Variety of Backgrounds, Talents and Experiences of Our Employees
Training: Hands-On, Team-Customized, including SS&C University
Extra Perks: Discounts on fitness clubs, travel and more!
What You Will Get To Do:
Work with customer subject matter experts and Intralinks project team to identify, define, collate, document, and communicate data migration requirements
Conduct deep dive data analysis of the customer current state to validate customer requirements and define the scope of the migration
Strategize and plan the entire migration from the legacy system to the new Intralinks product, considering risks, timelines, and potential impacts
Work with the customer to map legacy data to new Intralinks product.
Analyze and cleanse data where necessary
Oversee the direct migration of data, which may require unexpected adjustments to the process and schedule
Provide regular status updates to customer and Intralinks migration teams
Oversee the quality control process to ensure all data has been migrated and accounted for
Document everything from the strategies used to the exact migration processes put in place-including documenting any fixes or adjustments made
Report any issues encountered to Intralinks support
Conduct regular meetings with the product management team to prioritize and resolve issues that are critical to the success of the migration process
Develop best practices, processes, and standards to continuously improve the Intralinks data migration process
Ensure compliance with regulatory requirements and guidelines for all migrated data
What You Will Bring:
Bachelor's degree in information management systems, computer science, or related field, or 3 years of related work experience
Relevant experience in either software implementation or data migration
Exceptional attention to detail in data
Strong data skills - analysis, transformation, validation
Ability to maintain data integrity and evaluate logical cohesion during complex data transformations
Strong Excel skills (XLookups, Pivots, Data Sources, Queries)
Working knowledge of Python scripting - setting up environments, modifying, and testing code
Familiarity with operation of SQL databases and query structure
Experience working with clients as a technical resource and communicating difficult concepts
Experience working with clients to keep projects focused, on track, and on time
Thank you for your interest in SS&C! If applicable, to further explore this opportunity, please apply directly with us through our Careers page on our corporate website: ************************
#LI-Intralinks
#LI-MB3
#CA-MB
Unless explicitly requested or approached by SS&C Technologies, Inc. or any of its affiliated companies, the company will not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services.
SS&C offers excellent benefits including health, dental, 401k plan, tuition and professional development reimbursement plan.
SS&C Technologies is an Equal Employment Opportunity employer and does not discriminate against any applicant for employment or employee on the basis of race, color, religious creed, gender, age, marital status, sexual orientation, national origin, disability, veteran status or any other classification protected by applicable discrimination laws.
Salary is determined by various factors including, but not limited to, relevant work experience, job-related knowledge, skills, abilities, business needs, and geographic regions. NY: Salary range for the position: 100,000 USD to 110,000 USD.
$76k-95k yearly est. Auto-Apply 36d ago
AI & Data Strategy Architect
Phdata 4.3
Remote data warehousing specialist job
Join phData, a dynamic and innovative leader in the modern data stack. We partner with major cloud data platforms like Snowflake, AWS, Azure, GCP, Fivetran, Pinecone, Glean, and dbt to deliver cutting-edge services and solutions. We're committed to helping global enterprises overcome their toughest data challenges.
phData is a remote-first global company with employees based in the United States, Latin America, and India. We celebrate the culture of each of our team members and foster a community of technological curiosity, ownership, and trust. Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and allow you the autonomy to deliver results.
6x Snowflake Partner of the Year (2020, 2021, 2022, 2023, 2024, 2025)
Fivetran, dbt, Alation, and AWS Partner of the Year
#1 Partner in Snowflake Advanced Certifications
600+ Expert Cloud Certifications (Sigma, AWS, Azure, Dataiku, etc)
Recognized as an award-winning workplace in the US, India, and LATAM
Role: AI & Data Strategy Architect
Target Levels: Senior Consultant or Lead
Our AI & Data Strategy Architects help clients use data and AI to drive business transformation and competitive advantage. You'll align strategy and architecture, design implementation roadmaps, and guide organizations through adopting modern data platforms, machine learning solutions, and responsible AI practices.
Core Responsibilities
Advise clients on AI & data strategy-linking business goals to concrete data/AI capabilities, use cases, and value.
Design implementation roadmaps and future-state operating models (roles, processes, governance, and platform direction).
Make architecture and technology recommendations (cloud, data platforms, integration, data quality, governance, privacy, security, regulatory) clearly tied to business outcomes.
Facilitate discovery and “art of the possible” sessions with business and technical stakeholders, including senior executives.
Communicate complex technical and strategic concepts in clear, executive-ready language and partner with cross-functional teams to embed strategy into delivery.
Own key workstreams within AI & data strategy engagements (e.g., use case discovery, roadmap design, operating model, governance approach).
Perform analysis and synthesis (interviews, research, frameworks) and turn insights into clear recommendations and client-ready deliverables.
Support workshop design and facilitation and mentor junior team members on consulting basics and communication.
At the Lead level, you also:
Lead full AI & data strategy engagements or major programs, often acting as the day-to-day Engagement Lead for focused strategy projects.
Own stakeholder alignment-including C‑suite-and translate goals into scoped plans, workstreams, milestones, and resourcing.
Guide technology rationalization and operating model decisions, weighing TCO, risk, and regulatory needs, and ensuring alignment across business and technical leaders.
Lead and coach multi-disciplinary teams (business analysts, architects, data engineers, change leaders) and help shape offerings and pursuits for new opportunities.
Qualifications
Experience as a hands-on AI & data strategy or architecture leader (strategy engagements, roadmaps, or major workstreams).
Consulting experience with external clients, managing multiple priorities and stakeholders.
Proven experience with ownership of complex workstreams.
Strong understanding of modern cloud/data architectures and AI capabilities, and how to apply them to increase revenue, improve customer experience, or enable new products.
Experience engaging senior stakeholders (up to C‑suite) and using strategy frameworks (e.g., opportunity prioritization, SWOT, multi‑year roadmaps, business cases/ROI).
Ability to make technical recommendations tied to business outcomes across cloud, data platforms, integration, data quality, data governance, privacy/security, and regulatory needs (e.g., GDPR, CCPA).
Proven facilitation and communication skills-workshops, “art of the possible” sessions, and executive-ready presentations.
Track record of collaboration and ownership across client stakeholders, partners, and global cross-functional teams.
Clear point of view on where data and AI are heading and how organizations can capture that value.
Willingness to travel as required by clients.
Additional expectations at the Lead level:
2+ years of consulting leadership experience, repeatedly leading full AI & data strategy engagements or multi-workstream programs.
Proven ability to coach and mentor others, communicate concepts in clear business language, and drive change and adoption.
phData celebrates diversity and is committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at phData. We are proud to be an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at People Operations.
$88k-128k yearly est. Auto-Apply 16d ago
Sr Data Warehouse Lakehouse Developer
Lumen 3.4
Data warehousing specialist job in Columbus, OH
Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress.
We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future.
**The Role**
We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing.
**Location**
This is a work-from-home position available from any US-based location. You must be a US Citizen or Permanent Resident/Green Card holder for consideration.
**The Main Responsibilities**
**Design & Development**
+ Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments.
+ Create and optimize complex SQL queries, stored procedures, and data transformations.
+ Build and enhance source-to-target mapping documents.
+ Assist with UAT build and data loading for User Acceptance Testing.
+ Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks.
**Technical Leadership**
+ Provide technical leadership and mentorship to team members.
+ Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models.
+ Participate in and consult on integrated application and regression testing.
+ Conduct training sessions for system operators, programmers, and end users.
**Analytical Expertise**
+ Analyze programming requests to ensure seamless integration with current applications.
+ Perform data analysis and mapping to ensure accuracy and consistency.
+ Generate test plans and test cases for quality assurance.
+ Research and evaluate problems, recommend solutions, and implement decisions.
**Continuous Improvement**
+ Monitor and optimize data pipelines for performance and reliability.
+ Stay current with emerging technologies and recommend improvements to architecture and processes.
+ Adapt to changing priorities and aggressive project timelines while managing multiple complex projects.
**What We Look For in a Candidate**
**Technical Skills**
+ Proficiency in SQL and at least one programming language (Python, Java, Scala).
+ Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark).
+ Familiarity with databases (Databricks, Oracle, SQL Server).
+ Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver).
+ Strong understanding of data warehousing concepts and Lakehouse architecture.
**Analytical & Problem-Solving**
+ Ability to translate business requirements into technical solutions.
+ Strong troubleshooting and performance tuning skills.
+ Demonstrated organizational, oral, and written communication skills.
**Experience**
+ 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree.
+ Proven ability to lead technical teams and manage projects.
+ Experience in applications development and systems analysis.
**Preferred Qualifications**
+ Project management experience.
+ Familiarity with CI/CD pipelines and version control (Git).
+ Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP).
**Compensation**
This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors.
Location Based Pay Ranges
$82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY
$87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI
$91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA
Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process.
Learn more about Lumen's:
Benefits (****************************************************
Bonus Structure
#LI-Remote
#LI-PS
Requisition #: 340407
**Background Screening**
If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page (************************************* . Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
**Equal Employment Opportunities**
We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training.
**Disclaimer**
The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions.
In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.
Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
$91.3k-121.7k yearly 48d ago
Data Migration Specialist
Buildout 3.8
Remote data warehousing specialist job
Buildout is the AI deal engine for CRE brokerages, automating every step from first contact to commission. While brokers focus on relationships and winning listings, Buildout handles the workflows behind the scenes, turning manual processes into intelligent, scalable systems. Trusted by over 50,000 brokers, Buildout powers more profitable deals from lead to close. Learn more at *****************
The Opportunity
We're hiring a Data Migration Specialist who will be the go‑to data expert to turn a customer's complex export into clean and usable data in Buildout. You'll partner with customers at pivotal moments across the customer journey-from pre‑sales scoping calls, to onboarding implementations, to the occasional post‑launch data request-ensuring customers start strong and stay successful. Your work translates messy spreadsheets into meaningful records, shortens time‑to‑value, unblocks implementations, and prevents churn.
This role is a unique blend of customer consultation and technical execution. You'll spend time working directly with customers to guide them through their data journey, while also independently performing the data migrations that ensure their success.
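As a rough, non-authoritative illustration of the spreadsheet-to-import work described above, the sketch below cleans a hypothetical contact export with pandas before it would be loaded. Buildout's actual import formats and field names are not described in this posting, so every name here is invented.

```python
# Illustrative sketch: tidy a messy contact export so it is ready for import.
# Column names, file names, and cleanup rules are all hypothetical.
import pandas as pd

df = pd.read_csv("legacy_crm_export.csv")

# Normalize headers, trim whitespace, and drop rows without an email address.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df["email"] = df["email"].str.strip().str.lower()
df = df.dropna(subset=["email"]).drop_duplicates(subset=["email"], keep="first")

# Standardize phone numbers to digits only for consistent matching on import.
df["phone"] = df["phone"].astype(str).str.replace(r"\D", "", regex=True)

df.to_csv("contacts_ready_for_import.csv", index=False)
print(f"{len(df)} records ready for import")
```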
How You'll Contribute
You will play an active role in your customers' onboarding journey by attending kick-off calls and ongoing check-ins, acting as the SME on data quality, and collaborating with internal teams to set customers up for success
You will facilitate the movement of Customer data from their own home-grown spreadsheets and other CRMs/systems into Buildout
Clean-up and manipulate customer data so it is ready for import
Schedule calls with customers as needed to review and clarify data
Import the data into the Buildout system
QA the data that was imported & deliver to customer
You will help to define the project scope, goals and deliverables to ensure both the Customer and internal teams are aligned
You will collaborate with other departments on behalf of your Customer to resolve issues and coordinate requests as needed
You will monitor your Customers' progress to ensure their project stays on track and escalate potential blockers internally
What Makes a Great Candidate
You have experience migrating and/or importing data into a CRM (Salesforce experience preferred)
You are skilled in data manipulation using tools like Microsoft Excel, Google Sheets or .CSV files
You are passionate about working with customers directly and ensuring their success
You have clear, customer‑friendly communication and are able to explain technical topics simply and set expectations with confidence.
You have strong time management and organization skills to manage parallel customer requests and timelines
You have the ability to identify potential roadblocks and take initiative to swiftly resolve
Nice to have:
Experience working in a B2B SaaS organization
Experience with Atlassian (Jira & Confluence), and screen sharing tools
Experience in Commercial Real Estate (CRE) industry
We know there are great candidates who won't check all of these boxes, and we also know you might bring important skills that we haven't considered. If that's you, don't hesitate to apply and tell us about yourself.
Location: This is a fully remote role open across most of the US.
Compensation: The compensation range for this position is $65,000 - $75,000.
Reporting To: Jason Loeffler, our Senior Manager of Implementation
Perks & Benefits
This program includes:
Impactful insurance and benefit options, including 2 medical plans to choose from, 100% coverage of employee dental and vision insurance premiums, HSA seed, company-paid STD, LTD, life insurance, and telemedicine, and a wellness benefit of $400/year.
Policies that support a healthy work/life harmony, including Flexible PTO, 14 paid company holidays, paid parental leave, and give back days
401(k) with 4% company match and immediate vesting
A fully remote work culture with a monthly remote work reimbursement ($600/year) to support our distributed team and an annual, in-person company kickoff
Challenging problems to solve with a committed and supportive team who are invested in your growth and development
A wonderfully quirky culture where you're encouraged to bring your whole self to work
Buildout is proud to be an Equal Opportunity Employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, marital status, order of protection status, citizenship status, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.
If you need a reasonable accommodation for any part of the employment process, please contact us by email at accommodations@buildout.com and let us know the nature of your request and your contact information and we will consider your request.
Below, you will be asked to complete identity information for the Equal Employment Opportunity Commission (EEOC). It is required by law that we ask these questions using the format provided by the EEOC. However, we want you to know that at Buildout, we understand that gender is not binary and welcome people of all identities.
For more information about our privacy practices please visit our Privacy Policy. By submitting your application, California residents consent to Buildout processing your personal information for the purpose of assessing your candidacy for this position in accordance of our Privacy Notice for Prospective California Employees.
$65k-75k yearly Auto-Apply 41d ago
M-4/1 - 4939 - Full Stack Engineer/Data Architect - Remote & Phoenix, AZ (LOCAL Candidates Only)
FHR 3.6
Remote data warehousing specialist job
** Position is hybrid. Mainly remote but will need to come into the office in Phoenix, AZ periodically for meetings. Local to AZ Candidates only - no relocation allowed. Candidate MUST be able to attend an in-person interview in Phoenix, AZ. **
Our direct client has an opening for a Full Stack Engineer/Data Architect # 4939. This position is for 6-12+ months, with option of extension, and will be worked hybrid - mainly remote but will need to come into the office in Phoenix, AZ periodically for meetings.
If you are interested, please submit the following:
YOUR CURRENT RESUME
YOUR HOURLY RATE
Below is the job description - Resumes due ASAP -
Description:
The client is developing a centralized portal to serve Arizonans as a user-friendly entry point for available health and human services. This single point of entry will uniquely identify each individual, making it easier for Arizona residents to access prioritized services provided by state health and human service agencies.
In support of this broad initiative, DHS is seeking assistance with the assessment, prioritization, and future state technical design of their most critical citizen-centric services.
Description: The client is seeking a Full Stack Engineer/Data Architect with knowledge of the design, governance, and implementation of the organization's data, and of integrating master data across all business units and systems. This role will require experience in MDM solutions, data modeling, integration, and governance practices. The scope of work entails an understanding of the current MDM architecture and an assessment of planned MDM functionality, integration needs, and desired architecture.
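The Technical Experience list below names MDM tools such as Informatica and TAMR; purely as an illustration of the record-matching problem those tools address (and not of any specific product's API), here is a toy fuzzy-match sketch in plain Python. The records, fields, and threshold are invented.

```python
# Toy illustration of the record-matching step at the heart of MDM: score
# candidate pairs with a simple string similarity and flag likely duplicates.
# This stands in for what dedicated MDM tools do at scale; names are invented.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Jane A. Doe",    "dob": "1985-02-11"},
    {"id": 2, "name": "Jane Doe",       "dob": "1985-02-11"},
    {"id": 3, "name": "John Q. Public", "dob": "1990-07-30"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.8  # arbitrary illustrative cutoff

for i, left in enumerate(records):
    for right in records[i + 1:]:
        score = similarity(left["name"], right["name"])
        if left["dob"] == right["dob"] and score >= THRESHOLD:
            print(f"likely match: {left['id']} <-> {right['id']} (name score {score:.2f})")
```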
Technical Experience:
5+ years of experience in data architecture, MDM, and data governance (e.g., Informatica, TAMR, SQL)
5+ years of experience in application and integration services (e.g., Salesforce, .NET Framework, ASP.NET, WebAPI, REST APIs, SOAP)
5+ years of experience in programming and scripting languages (e.g., Python)
5+ years of experience with data management frameworks, data integration, and ETL processes (e.g., Matillion, AWS Glue)
3+ years of experience in data warehouses, data lakes, and analytics platforms (e.g., Snowflake, Databricks)
Experience in cloud platforms (e.g., Azure, AWS, Google Cloud)
By replying to this job advertisement, I agree that I want to receive additional job advertisements from FHR, including by email, phone, and mail to the contact information I am submitting. I consent to Focused HR Solutions, its affiliates, third parties, and partners processing my personal data for these purposes and as described in the Privacy Policy. I understand that I can withdraw my consent at any time.
$89k-127k yearly est. 26d ago
Data Architect
Ayr Global It Solutions 3.4
Data warehousing specialist job in Columbus, OH
AYR Global IT Solutions is a national staffing firm focused on cloud, cyber security, web application services, ERP, and BI implementations by providing proven and experienced consultants to our clients. Our competitive, transparent pricing model and industry experience make us a top choice of Global System Integrators and enterprise customers with federal and commercial projects supported nationwide.
Job Description
Data Architect
COLUMBUS, OHIO
Duration: 12+ Months
F2F Interview will be conducted during the week of Aug. 14th
Local candidates only and onsite interviews are REQUIRED
Qualifications
Mandatory Requirements
4-year college degree or equivalent technical study.
Advanced SQL Skills in MS SQL Server required.
ETL or Data transformation experience.
Data Extraction tools.
MS Office Products.
Additional Information
If you are interested, please share your resume at
***************************
or you can directly contact me at
************
$87k-122k yearly est. Easy Apply 2d ago
Data Architect or Data Modeller
Devcare Solutions 4.1
Data warehousing specialist job in Columbus, OH
As a support to the Data Science and Decision Analytics teams within the Enterprise Data & Analytics organization, Data Architects will produce multi-purpose, pre-prepared modeling data structures. These structures will allow the Data Science team to construct lead-generation models in an expedited fashion, along with confirmation that the data lineage and definitions are sound. Data Architects will also help facilitate the delivery of leads as output against these pre-prepared data structures. Additionally, they will support data strategy and requirements building on behalf of the Enterprise Data & Analytics function.
$84k-121k yearly est. 60d+ ago
Big Data Developer
Sai Software Solutions LLC 4.1
Data warehousing specialist job in Columbus, OH
Required Skills/Experience:
* Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real time
* Create metrics and apply business logic using Spark, Scala, R, Python, and/or Java (a minimal sketch follows this list)
* Model, design, develop, code, test, debug, document, and deploy applications to production through standard processes
* Harmonize, transform, and move data from a raw format to consumable, curated views
* Analyze, design, develop, and test applications
* Contribute to the maturation of Data Engineering practices, which may include providing training and mentoring to others
* Live the State Auto cultural values with a strong sense of teamwork
* Strong hands-on experience in Spark, Scala, R, Python, and/or Java
* Programming experience with the Hadoop ecosystem of applications and a functional understanding of distributed data processing systems architecture (Data Lake / Big Data / Hadoop / Spark / Hive, etc.)
* Amazon Big Data ecosystem (EMR, Kinesis, Aurora) experience
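For illustration only, here is a minimal PySpark sketch of the kind of pipeline described above: it ingests raw order events, harmonizes them, and publishes a curated daily metric. The paths, column names, and business rule are hypothetical placeholders, not part of the client's actual stack.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical pipeline: raw order events land as JSON and are curated into a daily revenue metric.
spark = SparkSession.builder.appName("curated-order-metrics").getOrCreate()

# Ingest the raw data (path is illustrative).
raw_orders = spark.read.json("s3a://example-bucket/raw/orders/")

# Harmonize and transform: fix types, drop malformed rows, derive the business metrics.
curated = (
    raw_orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("amount").alias("daily_revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Publish the consumable, curated view (format and path are illustrative).
curated.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_order_metrics/")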
$71k-96k yearly est. 60d+ ago
Data Architect (Remote)
Francisco Partners 3.6
Remote data warehousing specialist job
First San Francisco Partners is a business advisory and enterprise information management (EIM) consultancy dedicated to helping companies leverage their data to improve strategic decision-making, reduce risk, create operational efficiencies and fuel unprecedented business success. Our services span data governance, data quality strategies, data management architecture, master data management strategy and implementation, analytics and big data.
Job Responsibilities and Duties
We have an immediate opening for a Data Architect who will have a hands-on role with responsibility to develop data architecture and modeling strategies and to design, implement, and support data architecture deliverables for multiple data integration, data management, data warehousing, business intelligence, and analytics projects. They will also deliver solid, extensible, highly available data models and data environments that support current and future business and technical requirements.
Develops and maintains architectures for the high-level data environments of the enterprise, at the reference, conceptual, and logical levels, ensuring that these align to overall business strategy
Ensures that database and data storage technologies support the data management needs of the enterprise
Develops, communicates, supports, and monitors compliance with Data Modeling standards
Evaluates proposals for development projects to ensure they adhere to all data architecture standards
Develops and maintains standard patterns for data layers, data stores, and utility data management processes (e.g. data movement, data integration) for application across the enterprise
Assists development projects, either directly or indirectly by liaising with a solution architect, to ensure that good data architecture is implemented in these projects.
Evaluates currently implemented systems to determine their viability in terms of data architecture
Participates in the oversight of setting data standards for reference data, data formats, and similar needs.
Identifies standard metadata for describing data assets.
Develops standards for the semantic needs of data, including different kinds of models (e.g., subject area models, data classification schemes, standards for ontologies).
Ensures all documentation for data architecture is of high quality and properly curated
Skills and Qualifications:
Excellent communication, presentation, and interpersonal skills are required.
Ability to communicate clearly with both business and technical resources.
A demonstrated track record of making a difference and adding value
Strong organizational skills. Able to multi-task
Ability to think creatively, highly-driven and self-motivated
Ability to work and adjust to changing deadlines
Ability to quickly adapt to changes, enhancements and new technologies
Able to perform in a fast-paced, dynamic, and innovative work environment and meet aggressive deadlines
Creative problem-solving skills.
Must be able to develop relationships across the organization, working cross functionally to get results
Ability to present complex information in a simplified fashion to facilitate understanding
Can effectively manipulate and analyze large amounts of data
The ability to understand data relationships, write and execute SQL queries
Proficient with MS Office products
Proficiency with SQL
Bachelor's degree in Business Administration, Computer Science, CIS, or a related field
3-5 years of experience with data projects
Additional qualifications:
Experience with data and enterprise modeling tools, such as Erwin.
Experience with ETL/Data Quality tools such as Informatica IDQ and Trillium
Technical expertise with analytical tools including SQL, SAS, SPSS, R, and Tableau preferable.
3+ years of experience in the design, development, modification, and testing of Hadoop solutions.
Experience in Oozie, Hive, Pig, Impala, Sqoop, Flume, Hbase & Solr a plus.
Minimum of 7-10 years of experience with Oracle databases
5 or more years of experience in developing complex SQL queries using tools such as Oracle and MySQL.
Understanding of Pentaho or another ETL tool.
Experience with RedHat Enterprise Linux preferred.
Experience with designing, developing, and administering SQL Server databases
3+ years of experience with developing databases in an Agile framework with constantly changing technical requirements
3+ years of experience with designing, developing, and administering a data warehouse preferred.
3+ years of experience with designing, developing, and administering Microsoft Access databases
3+ years of experience with T-SQL and writing stored procedures, functions, and triggers.
1+ years of experience with migrating a Microsoft Access database to a Microsoft SQL Server
1+ years of experience with connecting a Microsoft Access front end to a Microsoft SQL Server back end
Experience with NoSQL databases, including MongoDB, Neo4j, Cassandra, or others.
Experience with at least one scripting language, including Python, Perl, or others
$130k-174k yearly est. 60d+ ago
Principal Data Architect
Egen 4.2
Remote data warehousing specialist job
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. If this describes you, we want you on our team.
Want to learn more about life at Egen? Check out these resources in addition to the job description.
Meet Egen | Life at Egen | Culture and Values at Egen | Career Development at Egen | Benefits at Egen
Responsibilities:
Lead the end-to-end architecture, design, and implementation of scalable Data Lakehouse solutions on Google Cloud Platform (GCP) using BigQuery, GCS, BigLake, and Dataplex
Collaborate directly with customers to understand business goals, data challenges, and technical requirements; translate them into robust architectural blueprints and actionable plans
Design and implement data pipelines supporting both real-time and batch ingestion using modern orchestration and streaming frameworks
Establish and enforce best practices for data cataloging, metadata management, lineage, and data quality across multiple systems
Define and implement data security, access control, and governance models in compliance with enterprise and regulatory standards
Serve as the technical lead for project teams - mentoring engineers, reviewing solutions, and ensuring architectural consistency across deliverables
Balance strategic architecture discussions with hands-on solutioning, POCs, and deep dives into data pipelines or performance tuning
Partner with stakeholders, cloud architects, and delivery leads to drive solution adoption, scalability, and long-term maintainability
Represent the company as a trusted technical advisor in client engagements - clearly articulating trade-offs, best practices, and recommendations
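As a rough, hypothetical sketch of the batch side of such a pipeline (real deployments would typically be orchestrated through Airflow or a similar framework, per the responsibilities above), the following Python snippet loads Parquet files from a GCS landing zone into a BigQuery table using the google-cloud-bigquery client. The project, bucket, dataset, and table names are placeholders.

from google.cloud import bigquery

# Hypothetical batch-ingestion step: load Parquet files from a GCS landing zone into BigQuery.
client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/raw/events/*.parquet",  # GCS landing zone (illustrative)
    "example-project.analytics_raw.events",              # target BigQuery table (illustrative)
    job_config=job_config,
)
load_job.result()  # block until the batch load completes
print(f"Loaded {load_job.output_rows} rows")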
Qualifications:
8-10 years of progressive experience in Software Engineering and Data Platform development, with 5+ years architecting data platforms on GCP and/or Databricks
Proven hands-on experience designing and deploying Data Lakehouse platforms with data products and medallion architectures
Strong understanding of data ingestion patterns (real-time and batch), ETL/ELT pipeline design, and data orchestration using tools such as Airflow, Pub/Sub, or similar frameworks
Expertise in data modeling, storage optimization, partitioning, and performance tuning for large-scale analytical workloads
Experience implementing data governance, security, and cataloging solutions (Dataplex, Data Catalog, IAM, or equivalent)
Excellent communication and presentation skills - able to confidently engage with technical and non-technical stakeholders and guide clients through solution decisions
Demonstrated ability to lead by example in mixed teams of engineers, analysts, and architects, balancing architectural vision with hands-on delivery
Nice to have: Experience with Databricks (Delta Lake, Unity Catalog) and hybrid GCP-Databricks data architectures
Strong problem-solving mindset, curiosity to explore new technologies, and ability to “zoom out” for architecture discussions and “zoom in” for code-level troubleshooting
Compensation & Benefits:
This role is eligible for our competitive salary and comprehensive benefits package to support your well-being: Comprehensive Health Insurance, Paid Leave (Vacation/PTO), Paid Holidays, Sick Leave, Parental Leave, Bereavement Leave, 401(k) Employer Match, and Employee Referral Bonuses.
Check out our complete list of benefits here: ********************************
Important: All roles are subject to standard hiring verification practices, which may include background checks, employment verification, and other relevant checks.
EEO and Accommodations:
Egen is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Egen will also consider qualified applications with criminal histories, consistent with legal requirements. Egen welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
$82k-112k yearly est. Auto-Apply 60d+ ago
Domain Data Architect - Finance Data Mart
Jpmorgan Chase & Co 4.8
Data warehousing specialist job in Columbus, OH
JobID: 210669784 | Job Schedule: Full time
As part of our team, you'll help modernize our data environment and unlock new opportunities for career growth and skill development.
As a Domain Data Architect in the Finance Data Mart team, you will partner with Technology and Product teams to design and deliver data domains using Databricks. You will enable the Finance function to access and analyze essential data, supporting business needs across banking, wealth management, credit cards, auto lending, and home lending. You will collaborate closely with stakeholders to make data discoverable and actionable, fostering a culture of informed decision-making and continuous improvement.
You will work within a dynamic, cross-functional team, contributing to the transformation of our data architecture and integration processes. Your expertise will help ensure our data solutions remain robust, scalable, and aligned with industry best practices, supporting the analytical and reporting needs of the Finance organization.
Job Responsibilities
* Implement and optimize the Finance Data Mart using Databricks and ThoughtSpot for analytics and reporting
* Design efficient data mart schemas consolidating key data categories from multiple source systems
* Collaborate with data engineers, analysts, and technical specialists to gather and analyze business requirements
* Develop and optimize ETL processes and data pipelines using Databricks
* Partner with Area Product teams and stakeholders to design reporting and analytics solutions
* Apply design-led thinking to make data discoverable and accessible for analytical needs
* Maintain documentation on data mart architecture, data models, ETL processes, and governance policies
* Stay current on industry best practices and emerging trends in data management
* Ensure alignment of data solutions with organizational objectives
* Support the Finance function across diverse business areas
* Foster a collaborative and innovative team environment
Required Qualifications, Capabilities, and Skills
* Minimum seven years of experience in data architecture, data warehousing, and data integration in financial services
* Bachelor's degree in Computer Science, Information Systems, or related discipline
* Expertise in designing scalable data mart architectures, including star and snowflake schemas
* Strong knowledge of data management, data lineage, and data dictionaries
* Proven track record in managing and delivering complex data projects
* Strong written and verbal communication skills
* Proficiency in SQL, Data Modeling, and ERWIN
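For context only, here is a minimal sketch of what a star-schema load on Databricks might look like, with one dimension and one fact table written as Delta tables. It assumes a Databricks notebook where spark is already defined; the source tables, target schema, and column names are hypothetical.

from pyspark.sql import functions as F

# Hypothetical star-schema load: one dimension plus one fact table, written as Delta tables.
accounts = spark.table("raw.core_banking_accounts")
transactions = spark.table("raw.core_banking_transactions")

# Dimension: one row per account, with a simple surrogate key.
# (monotonically_increasing_id is used only for brevity; it is not stable across reloads.)
dim_account = (
    accounts
    .select("account_id", "product_type", "open_date", "branch_code")
    .dropDuplicates(["account_id"])
    .withColumn("account_key", F.monotonically_increasing_id())
)
dim_account.write.format("delta").mode("overwrite").saveAsTable("finance_mart.dim_account")

# Fact: transaction grain, carrying the dimension's surrogate key and the measures.
fact_transaction = (
    transactions
    .join(dim_account.select("account_id", "account_key"), "account_id")
    .select(
        "account_key",
        F.to_date("txn_ts").alias("txn_date"),
        F.col("amount").cast("decimal(18,2)").alias("amount"),
    )
)
fact_transaction.write.format("delta").mode("append").saveAsTable("finance_mart.fact_transaction")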
Preferred Qualifications, Capabilities, and Skills
* Strong knowledge of Amazon Web Services; AWS Certifications preferred
* Experience with Databricks, Snowflake, or other cloud data warehouses
* Experience with market-leading data catalog systems
* Experience with ThoughtSpot, Sigma, Tableau, Alteryx, or Essbase a plus
$101k-126k yearly est. Auto-Apply 60d+ ago
Immediate Interview for Data Architect (SQL and ETL Exp)
360 It Professionals 3.6
Data warehousing specialist job in Columbus, OH
360 IT Professionals is a staffing specialist working directly with US state, local, and commercial clients. We are known for our IT services, mobile development, web development, and cloud computing, and we work with clients to deliver high-performance results.
Job Description
• Develop data architecture models.
• Develop data flow models.
• Develop business process / workflow models.
• Design, develop, and implement database tables/views.
• Create SQL queries to support application interfaces, data transfers (ETLs), and data extracts.
• Verify accuracy and completeness of application data validation procedures.
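As a small, hypothetical illustration of the work described above, the sketch below creates a view to support an application interface and writes a data extract to CSV. SQLite stands in for the client's actual database, and every table, view, and column name is invented for the example.

import csv
import sqlite3

# Illustrative sketch only: SQLite is a stand-in for whatever RDBMS the client uses.
conn = sqlite3.connect("example.db")
cur = conn.cursor()

# Base table (created here only so the example runs end to end).
cur.execute("""
    CREATE TABLE IF NOT EXISTS clients (
        client_id INTEGER PRIMARY KEY,
        client_name TEXT,
        status TEXT,
        enrollment_date TEXT
    )
""")

# A view supporting an application interface.
cur.execute("""
    CREATE VIEW IF NOT EXISTS v_active_clients AS
    SELECT client_id, client_name, enrollment_date
    FROM clients
    WHERE status = 'ACTIVE'
""")

# A data-extract query whose results are written out for a downstream transfer.
cur.execute("SELECT client_id, client_name, enrollment_date FROM v_active_clients")
with open("active_clients_extract.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row from cursor metadata
    writer.writerows(cur.fetchall())

conn.commit()
conn.close()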
Additional Information
Thanks & Regards
Preeti Joshi
510-254-3300 Ext 142
preeti@360itpro.com
$91k-118k yearly est. 60d+ ago
Salesforce Data 360 Architect
Slalom 4.6
Data warehousing specialist job in Columbus, OH
Who You'll Work With In our Salesforce business, we help our clients bring the most impactful customer experiences to life and we do that in a way that makes our clients the hero of their transformation story. We are passionate about and dedicated to building a diverse and inclusive team, recognizing that diverse team members who are celebrated for bringing their authentic selves to their work build solutions that reach more diverse populations in innovative and impactful ways. Our team is comprised of customer strategy experts, Salesforce-certified experts across all Salesforce capabilities, industry experts, organizational and cultural change consultants, and project delivery leaders. As the 3rd largest Salesforce partner globally and in North America, we are committed to growing and developing our Salesforce talent, offering continued growth opportunities, and exposing our people to meaningful work that aligns to their personal and professional goals.
We're looking for individuals who have experience implementing Salesforce Data Cloud or similar platforms and are passionate about customer data. The ideal candidate has a desire for continuous professional growth and can deliver complex, end-to-end Data Cloud implementations from strategy and design, through to data ingestion, segment creation, and activation; all while working alongside both our clients and other delivery disciplines. Our Global Salesforce team is looking to add a passionate Principal or Senior Principal to take on the role of Data Cloud Architect within our Salesforce practice.
What You'll Do:
Responsible for business requirements gathering, architecture design, data ingestion and modeling, identity resolution setup, calculated insight configuration, segment creation and activation, end-user training, and support procedures
Lead technical conversations with both business and technical client teams; translate those outcomes into well-architected solutions that best utilize Salesforce Data Cloud and the wider Salesforce ecosystem
Ability to direct technical teams, both internal and client-side
Provide subject matter expertise as warranted via customer needs and business demands
Build lasting relationships with key client stakeholders and sponsors
Collaborate with digital specialists across disciplines to innovate and build premier solutions
Participate in compiling industry research, thought leadership and proposal materials for business development activities
Experience with scoping client work
Experience with hyperscale data platforms (ex: Snowflake), robust database modeling and data governance is a plus.
What You'll Bring:
Have been part of at least one Salesforce Data Cloud implementation
Familiarity with Salesforce's technical architecture: APIs, Standard and Custom Objects, Apex. Proficient with ANSI SQL and supported functions in Salesforce Data Cloud
Strong proficiency in presenting complex business and technical concepts using visualization aids
Ability to conceptualize and craft sophisticated wireframes, workflows, and diagrams
Strong understanding of data management concepts, including data quality, data distribution, data modeling and data governance
Detailed understanding of the fundamentals of digital marketing and complementary Salesforce products that organizations may use to run their business. Experience defining strategy, developing requirements, and implementing practical business solutions.
Experience in delivering projects using Agile-based methodologies
Salesforce Data Cloud certification preferred
Additional Salesforce certifications like Administrator are a plus
Strong interpersonal skills
Bachelor's degree in a related field preferred, but not required
Open to travel (up to 50%)
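To illustrate the identity-resolution concept mentioned in the responsibilities above in general terms (this is not Salesforce Data Cloud code), here is a small pandas sketch that unifies records from two hypothetical sources on a normalized email match rule and assigns one ID per resolved individual.

import pandas as pd

# Conceptual identity-resolution sketch; sources, columns, and match rule are hypothetical.
crm = pd.DataFrame({
    "source": "crm",
    "email": ["Ana@Example.com", "bo@example.com"],
    "first_name": ["Ana", "Bo"],
})
web = pd.DataFrame({
    "source": "web",
    "email": ["ana@example.com", "cy@example.com"],
    "last_seen": ["2024-05-01", "2024-05-03"],
})

# Normalized match rule: case-insensitive, trimmed email.
for df in (crm, web):
    df["match_key"] = df["email"].str.strip().str.lower()

# Match-and-merge across sources, then assign one unified ID per resolved individual.
unified = crm.merge(web, on="match_key", how="outer", suffixes=("_crm", "_web"))
unified["unified_id"] = pd.factorize(unified["match_key"])[0]
print(unified[["unified_id", "match_key", "first_name", "last_seen"]])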
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and salary ranges:
East Bay, San Francisco, Silicon Valley:
Principal: $184,000-$225,000
San Diego, Los Angeles, Orange County, Seattle, Boston, Houston, New Jersey, New York City, Washington DC, Westchester:
Principal: $169,000-$206,000
All other locations:
Principal: $155,000-$189,000
In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************.
We will accept applications until January 30, 2025 or until the position is filled.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process.