Data Engineer
Data engineer job at Codeworks
Job Description
Codeworks is an IT Services firm headquartered in SE Wisconsin, known for our strong commitment to quality and for our direct client relationships. Our client is seeking a skilled Data Engineer to join our remote team. This role is ideal for professionals experienced in designing and implementing data ingestion and pipeline processes on the Google Cloud Platform (GCP). You will work closely with both technical and business teams to transform requirements into efficient, scalable data workflows using industry-standard practices and tools such as BigQuery, Pub/Sub, Cloud Composer, and Dataform. This is a great opportunity for someone looking to contribute to impactful data solutions in a collaborative and innovative environment.
This is a 100% Remote, Contract position.
Responsibilities
Design, develop, and implement data ingestion and pipeline workflows based on business and technical requirements (a minimal pipeline sketch follows this list).
Independently and collaboratively contribute to team efforts, keeping stakeholders informed throughout.
Translate business requirements into detailed technical specifications.
Develop, configure, test, debug, and document scalable data solutions.
Write complex SQL queries to extract, analyze, and troubleshoot data issues.
Execute unit, system, integration, and user acceptance testing with thorough documentation.
Participate in project planning, potentially taking ownership of projects with limited scope.
Estimate development time for budgeting and track progress for project visibility.
Conduct peer reviews of designs, code, and configurations.
Provide ongoing technical support and ensure adherence to data security and development standards.
Participate in an on-call rotation for production support (every 4-6 weeks, depending on staffing).
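To make the day-to-day concrete, here is a minimal sketch of the kind of ingestion workflow described above: a Cloud Composer (Airflow) DAG that loads newline-delimited JSON files from Cloud Storage into BigQuery. The project, bucket, dataset, table, and schedule are hypothetical placeholders, not details of the client's environment.

```python
# Minimal Cloud Composer (Airflow) DAG sketch; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_ingest",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's landed files from Cloud Storage into a BigQuery table.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="example-landing-bucket",                 # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.json"],       # templated date path
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.raw.orders",
        write_disposition="WRITE_APPEND",
    )
```

A production version would add schema management, data-quality checks, and alerting, which is where tools like Dataform and the documentation and testing duties above come in.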
Qualifications
Bachelor's Degree (4-Year) in Computer Science, Engineering, or a related field.
3+ years of professional experience in data engineering, with hands-on experience in the following:
Google Cloud Platform (GCP)
Google Pub/Sub
BigQuery
Google Dataform
Google Cloud Storage
Cloud Composer
SQL (Structured Query Language)
GitHub
Preferred/Good to Have:
Cloud Data Fusion
Experience with database management systems (SQL Server, DB2, Oracle)
RESTful API and SOAP Web Service integration
Experience with Jira and Confluence
Experience integrating with other Google Cloud resources and services
About Codeworks: Codeworks has over 25 years of experience serving Fortune 1000 companies in Wisconsin as well as our clients' national locations. Our recruiting team excels at evaluating, advising, and connecting IT professionals with new opportunities that will satisfy their expectations regarding income and opportunity for growth. At Codeworks, we're committed to diversity, equity, and inclusion in our workforce and beyond. We believe in equal opportunities and value the unique perspectives that every individual brings to our team. Join us in creating an inclusive, innovative, and collaborative workplace where your talents can thrive.
Codeworks is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Codeworks, LLC discloses that the anticipated hourly pay range for this position is between $35 and $40. This range is subject to change based on job-related factors, including client requirements where applicable.
All full-time Codeworks employees are eligible to enroll in the company's medical, dental, vision, and life insurance plans. Additionally, employees can participate in Codeworks' 401(k) retirement plan.
Senior Software Engineer - C++ & Linux (Onsite in Milwaukee area)
Senior Software Engineer job at Codeworks
Job Description
Codeworks is a locally owned and operated IT Services firm in SE Wisconsin, known for our strong commitment to quality and for our direct client relationships. Our client, located in Waukesha, WI, is looking to bring aboard a C++ Software Engineer on a 12-month contract basis, with a strong likelihood of extension. This individual will develop, analyze, and review specifications and produce software implementations aligned with the overall product software architecture and technology stack, following the Software Development Life Cycle (SDLC) process. This individual will work with various technologies including C++, Python, and Linux. Other responsibilities include:
Be responsible for defining, developing, and evolving software in a fast-paced and agile development environment using the latest software development technologies and infrastructure
Work with a cross-functional team of engineers, scientists, and applications experts to translate high-level application needs that demand new reconstruction capabilities into component-level requirements
Design and implement solutions to complex data management and distributed processing software problems in the reconstruction platform domain in accordance with established software development practices and processes.
Plan and perform integration activities at component, sub-system and system levels. Document designs and verification activities; perform component & subsystem level verifications, participate in system level verifications and validations as necessary.
Drive increased efficiency across the teams by eliminating duplication and leveraging product and technology reuse
Support process improvements which guide the development, sustaining & support activities
Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, Software and Product Security, Scalability, Documentation Practices, refactoring and Testing Techniques
Write code that meets standards and delivers desired functionality using the technology selected for the project
Understand performance parameters and assess application performance
Work on core data structures and algorithms and implement them using the chosen technology
The ideal candidate will have professional experience with the following skills:
Bachelor's Degree in Computer Science, Electrical Engineering, or Computer Engineering and a minimum of 5 years of experience
Experience with Linux-based programming on x86 systems.
Demonstrated proficiency in C++ programming and object-oriented programming concepts applied in a production software environment.
Working knowledge of configuration management tools such as Git
Experience with enterprise databases to store and retrieve large volumes of data efficiently
Demonstrated expertise with MATLAB or equivalent scientific modeling tools & packages is a plus
Experience with parallel computing concepts and tools (MPI, OpenMP) is preferred
Experience designing and architecting high performance systems.
Experience working with C++ and Python IDEs (Eclipse, CLion, Visual Studio, XCode, PyCharm)
5+ years of familiarity with requirements management and troubleshooting
Demonstrated strong communication and collaboration skills in a global team setting.
For immediate consideration, qualified candidates should send their resumes (Attn: Laura). Candidates must be willing to work onsite in the Milwaukee, WI area.
About Codeworks: Codeworks has over 25 years of experience serving Fortune 1000 companies in Wisconsin as well as our clients' national locations. Our recruiting team excels at evaluating, advising, and connecting IT professionals with new opportunities that will satisfy their expectations regarding income and opportunity for growth. At Codeworks, we're committed to diversity, equity, and inclusion in our workforce and beyond. We believe in equal opportunities and value the unique perspectives that every individual brings to our team. Join us in creating an inclusive, innovative, and collaborative workplace where your talents can thrive.
Codeworks is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Codeworks, LLC discloses that the anticipated hourly pay range for this position is between $50/hr and $65/hr. This range is subject to change based on job-related factors, including client requirements where applicable.
All full-time Codeworks employees are eligible to enroll in the company's medical, dental, vision, and life insurance plans. Additionally, employees can participate in Codeworks' 401(k) retirement plan.
GCP Data Architect
Dearborn, MI
Title: GCP Data Architect
Description: STG is a fast-growing Digital Transformation services company providing Fortune 500 companies with Digital Transformation, Mobility, Analytics and Cloud Integration services in both information technology and engineering product lines. STG has a 98% repeat business rate from existing clients and has achieved industry awards and recognition for our services.
Responsibilities:
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers (a brief star-schema sketch follows this list).
Metadata Management: Partner with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
Data Quality Assurance: Establish master data management data modeling that records how customer, provider, and other party data is consolidated into a single version of the truth.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
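As a hedged illustration of the star-schema work named above, the sketch below creates a small dimension/fact pair through the BigQuery Python client; the project, dataset, and column names are invented placeholders, not STG or client specifics.

```python
# Tiny star-schema sketch via the BigQuery client; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

ddl_statements = [
    # Conformed dimension table (the "gold" layer of a medallion architecture).
    """
    CREATE TABLE IF NOT EXISTS gold.dim_customer (
      customer_key INT64,
      customer_name STRING,
      region STRING
    )
    """,
    # Fact table keyed to the dimension, for BI and ML consumers.
    """
    CREATE TABLE IF NOT EXISTS gold.fact_orders (
      order_id STRING,
      customer_key INT64,   -- joins to dim_customer.customer_key
      order_date DATE,
      amount NUMERIC
    )
    """,
]
for stmt in ddl_statements:
    client.query(stmt).result()  # run each DDL statement and wait
```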
Experience Required:
Specialist Exp: 10+ yrs in IT; 7+ yrs as Data Architect
PowerBuilder
PostgreSQL
GCP
BigQuery
The GCP Data Architect position is based at our corporate office located in Dearborn, Michigan. This is a great opportunity to experience a corporate environment while growing your career.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Ms. Shweta Huria at ********************** and/or contact at ************. In the subject line of the email, please include: First and Last Name (GCP Data Architect).
For more information about STG, please visit us at **************
GCP Data Architect (only W2 Position - No C2C Accepted) 11.18.2025
Dearborn, MI
Description: STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a GCP Data Architect.
Please note that this project assignment is with our own direct clients. We do not go through any vendors. STG only does business with direct end clients. This is expected to be a long-term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.
Job Description:
Employees in this job function are responsible for designing, building and maintaining reliable, efficient and scalable data architecture and data models that serve as a foundation for all data solutions. They also closely collaborate with senior leadership and IT teams to ensure alignment of data strategy with overall business goals.
Key Responsibilities:
Align data strategy to business goals to support a mix of business strategy, improved decision-making, operations efficiency and risk management
Ensure data assets are available, consumable and secure for end users across the enterprise - applications, platforms and infrastructure - within the confines of enterprise and security architecture
Design and build reliable, efficient and scalable data architecture to be used by the organization for all data solutions
Implement and maintain scalable architectural data patterns, solutions and tooling to support business strategy
Design, build, and launch shared data services and APIs to support and expose data-driven solutions in line with enterprise architecture standards
Research and optimize data architecture technologies to enhance and support enterprise technology and data strategy
Skills Required:
PowerBuilder, PostgreSQL, GCP, BigQuery
Senior Specialist Exp.: 10+ years in IT; 7+ years in concentration
Must have experience presenting technical material to business users
Must be able to envision larger strategies and anticipate where possible synergies can be realized
Experience acting as the voice of the architecture/data model and defending its relevancy to ensure adherence to its principles and purpose
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture which includes relational, star schema and denormalized data sets for BI and ML data consumers.
Metadata Management: Team with the data governance team so detailed documentation on data definitions, data lineage, and data quality statistics are available to data consumers.
Data Quality Assurance: Establish master data management data modeling so the history of how customer, provider and other party data is consolidated into a single version of the truth.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Continuous Improvement: Stay abreast of emerging trends in data modeling, analytics platforms, and big data technologies. Recommend enhancements to existing data models and approaches.
Performance Optimization: Monitor and optimize data models for query performance and scalability. Troubleshoot and resolve performance bottlenecks in collaboration with database administrators (a brief partitioning sketch follows this list).
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
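To ground the performance-optimization item above, here is one common BigQuery pattern, sketched with placeholder names: partition a large table by date and cluster it by a frequently filtered column so queries prune data instead of scanning everything.

```python
# Hypothetical partition-and-cluster sketch for BigQuery query performance.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

client.query(
    """
    CREATE TABLE IF NOT EXISTS analytics.events_opt
    PARTITION BY DATE(event_ts)   -- date-filtered queries scan fewer partitions
    CLUSTER BY user_id            -- co-locates rows for selective filters
    AS SELECT * FROM analytics.events_raw
    """
).result()

# A query filtered on the partition column now prunes most of the table.
rows = client.query(
    "SELECT COUNT(*) FROM analytics.events_opt WHERE DATE(event_ts) = '2024-01-01'"
).result()
print(next(iter(rows))[0])
```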
The GCP Data Architect position is based in Dearborn, MI. This is a great opportunity to experience a corporate environment while growing your career.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Vasavi Konda at vasavi.konda@stgit.com and/or call 248-712-6725. In the subject line of the email, please include: First and Last Name: GCP Data Architect.
For more information about STG, please visit us at **************
Sincerely,
Vasavi Konda| Recruiting Specialist
“Opportunities don't happen, you create them.”
Systems Technology Group (STG)
3001 W. Big Beaver Road, Suite 500
Troy, Michigan 48084
Phone: 248-712-6725 (O)
Email: vasavi.konda@stgit.com
Software Engineer (Remote)
Chicago, IL
Remote (proximity to Chicago, Nashville or Manhattan would be a big plus)
Regular travel is not required, but the hire will need to travel to the corporate office twice a year.
Our client is looking to add a Software Developer who will be responsible for designing, developing, and maintaining high-quality software solutions that support the Firm's digital platforms. This role ensures the stability, scalability, and performance of all applications and services, while collaborating with cross-functional teams to drive continuous improvement in development practices and operational efficiency.
Responsibilities
Design and implement stable, scalable, and extensible software solutions.
Ensure adherence to secure software development lifecycle (SDLC) best practices and standards.
Drive the design and development of services and applications to meet defined service level agreements (SLAs).
Work closely with end users and stakeholders to gather requirements and iterate on solutions that deliver business value.
Proactively identify and resolve any obstacles affecting operational efficiency and service continuity.
Provide ongoing support for developed applications and services, ensuring timely issue resolution.
Participate in the Firm's change and incident management processes, adhering to established protocols.
Software Development & Architecture
Develop and maintain features for web-enabled applications using C# .NET Core.
Write clean, scalable code with a focus on maintainability and performance.
Implement robust, efficient SQL-based solutions, preferably using MS SQL.
Develop and maintain user interfaces using modern frameworks, preferably Angular or Blazor.
Ensure solutions are designed with an emphasis on security, efficiency, and optimization.
Contribute to continuous integration and continuous delivery (CI/CD) pipelines, automating processes where possible.
Collaboration & Optimization
Collaborate closely with business analysts, quality assurance, and other developers to ensure solutions meet both functional and non-functional requirements.
Foster a culture of positive, open communication across diverse teams, with a focus on collaboration and shared goals.
Engage in regular reviews and feedback sessions to drive continuous improvement in development processes and practices.
Provide mentorship and guidance to junior developers where appropriate, supporting their professional growth.
Professional Conduct
Demonstrates commitment to the firm's core values, including Accountability, Integrity, Excellence, Grit, and Love.
Ensures all activities align with business objectives and project timelines.
Communicates effectively, openly exchanging ideas and listening with consideration.
Maintains a proactive, solution-oriented mindset when addressing challenges.
Takes ownership of responsibilities and holds others accountable for their contributions.
Continuously seeks opportunities to optimize processes, improve performance, and drive innovation.
Qualifications
1-3+ years of experience in C# .NET Core development
Competence in SQL, preferably MS SQL
Competence in UI work, preferably Angular and/or Blazor
Strong structured problem-solving skills, with a history of using systematic and fact-based processes to improve mission-critical services.
A focus on optimization and efficiency in processes.
Experience working in a financial services firm would be a big plus
Demonstrated expertise in fostering a culture of positive collaboration among cross-functional teams with diverse personalities, skill sets, and levels of experience.
Highly developed communication skills
A sense of urgency and a bias for action.
For all non-bonus, non-commission direct hire positions: The anticipated salary range for this position is ($95,000 - $120,000). Actual salary will be based on a variety of factors including relevant experience, knowledge, skills and other factors permitted by law. A range of medical, dental, vision, retirement, paid time off, and/or other benefits are available.
AI Engineer
Milwaukee, WI
AI Engineer - New Resources Consulting
Summary: New Resources Consulting is looking for a hands-on AI Engineer to join our growing AI Practice. In this role, you'll work closely with the AI Practice Director, business teams, and technical teams to help build and deploy AI solutions. You'll be involved in projects from start to finish, using cloud platforms like Microsoft Azure to move ideas from development into production.
We're looking for someone with strong software development skills and a good understanding of machine learning. This role is a great fit for someone who enjoys problem-solving, collaborating with others, and staying hands-on in both building and deploying AI-driven systems.
This role is primarily remote but will require occasional on-site visits in the Greater Milwaukee area, about 1-2 times per quarter.
Qualifications
Bachelor's degree or higher in Computer Science, Data Science, or related field.
Experience helping develop and implement AI solutions, including moving models into production.
Comfort working directly with clients and team members to understand business needs and translate them into technical solutions.
Hands-on software development experience, building applications and integrating AI into production systems.
Strong programming skills in Python and/or C#.
Understanding of machine learning concepts, algorithms, and deployment practices (a minimal training-and-scoring sketch follows this list).
Experience working with Azure AI services, such as:
Azure Cognitive Search
Azure Machine Learning
Azure Functions and Logic Apps
Familiarity with Natural Language Processing (NLP) and Large Language Models (LLMs).
Experience with MLOps, version control tools like Azure DevOps and GitHub, and Agile development practices.
Knowledge of ML frameworks and tools such as LangChain, scikit-learn, and Weights & Biases.
Strong communication skills to explain technical concepts clearly to both technical and non-technical audiences.
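For a sense of the hands-on work this list implies, here is a deliberately small train-persist-score sketch using scikit-learn; the data is simulated and the file name is a placeholder, not a New Resources or client artifact.

```python
# Minimal, hypothetical train-then-deploy loop: fit a model, persist it,
# and score new records. Data and names are placeholders.
from joblib import dump, load
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in data; a real engagement would read curated features instead.
X, y = make_classification(n_samples=1_000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")

# Persist the fitted model so a serving layer (for example, an Azure
# Function) can load it and score incoming requests.
dump(model, "model.joblib")
scores = load("model.joblib").predict_proba(X_test[:5])[:, 1]
print(scores)
```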
Benefits of Working at New Resources Consulting
Join the largest locally owned consulting firm in Wisconsin and work on impactful, cutting-edge projects.
Enjoy a flexible work environment with remote options and in-person collaboration alongside a talented, growing AI team.
Be part of a company recognized as a Best Places to Work, Top Workplace, and Best and Brightest award winner.
DevOps Engineer w Virtual Machine
Dearborn, MI
Title: DevOps Engineer w Virtual Machine
Description: STG is a fast-growing Digital Transformation services company providing Fortune 500 companies with Digital Transformation, Mobility, Analytics and Cloud Integration services in both information technology and engineering product lines. STG has a 98% repeat business rate from existing clients and has achieved industry awards and recognition for our services.
Skills Required:
Scripting, Automation
Root Cause Analysis
Troubleshooting (Problem Solving)
Cloud Architecture
IT Solutions
GitHub
Cloud Infrastructure
Change Management
Technical Analysis / Developer
Tekton
Utilization Management
Kubernetes
Experience Required:
Conduct capacity planning and forecasting for the OpenShift Virtualization platform, including compute, memory, storage, and network resources, to ensure scalability and prevent resource exhaustion.
Analyze resource utilization trends and make recommendations for infrastructure scaling, consolidation, or optimization.
Collaborate with application teams and stakeholders to understand future demand and project capacity needs.
Develop and maintain capacity models and reports to support strategic planning.
Develop automation solutions (scripts, playbooks) for repetitive OSV tasks, including configuration changes, VM management, auditing, remediation, and integration with ticketing systems (a small monitoring sketch follows this list).
Leverage automation to deliver operator updates and changes efficiently at scale.
Implement Site Reliability Engineering (SRE) principles and practices to improve overall platform stability, performance, and operational efficiency.
Role Based Access Control deployment and auditing.
Namespace and Resource Quota management.
Implement and maintain comprehensive end-to-end observability solutions (monitoring, logging, tracing) for the OSV environment, including integration with tools like Dynatrace and Prometheus/Grafana.
Explore and implement Event-Driven Architecture (EDA) for enhanced real-time monitoring and response.
Develop capabilities to flag and report abnormalities and identify "blind spots" in observability.
Perform deep dive Root Cause Analysis (RCA), potentially utilizing available tooling, to quickly identify and resolve issues across the global compute environment.
Find the needle in a haystack/unhealthy bit in the compute universe (Globally) for faster time to resolution.
Monitor VM health, resource usage, and performance metrics proactively.
Monitor for unusual activity that might indicate a compromise or misconfiguration.
Solution Design & Consulting
Knowledge Management
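As a small, hypothetical example of the automation and observability duties above, the sketch below polls a Prometheus endpoint for average VM CPU usage and flags hot instances; the endpoint URL, metric name, label, and threshold are all assumptions, not the client's configuration.

```python
# Hypothetical monitoring helper: query Prometheus and flag hot VMs.
import requests

PROM_URL = "http://prometheus.example.internal:9090"        # hypothetical endpoint
QUERY = "avg by (vm) (rate(cpu_usage_seconds_total[5m]))"   # placeholder metric

resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()

for series in resp.json()["data"]["result"]:
    vm = series["metric"].get("vm", "unknown")
    cpu = float(series["value"][1])
    if cpu > 0.8:  # flag VMs averaging more than 80% CPU
        print(f"HOT: {vm} cpu={cpu:.2f}")
```

A production version would feed results into the ticketing and remediation playbooks described above rather than printing them.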
The DevOps Engineer w Virtual Machine position is based at our corporate office located in Dearborn, Michigan. This is a great opportunity to experience a corporate environment while growing your career.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Ms. Shweta Huria at ********************** and/or contact at ************. In the subject line of the email, please include: First and Last Name (DevOps Engineer w Virtual Machine).
For more information about STG, please visit us at **************
BPM Configuration Engineer
Troy, MI
This Configuration Engineer will be responsible for the configuration build-out of mortgage-related business processes. This includes configuring workflows, creating automated decision points and tasks, and managing system users. They will work with our Business Process Management team to support loan servicing business partners, vendors, and data providers as directed by the Technology and Product Development leadership. The role requires strong consultative skills, root cause analysis, a strong understanding of data modeling, the ability to provide solutions and alternative methods to meet internal client expectations, and an in-depth understanding of process workflows and data integrations such as APIs.
ESSENTIAL DUTIES AND RESPONSIBILITIES include, but are not limited to:
Create and manage automated workflow solutions in a low-code environment including developing program modules.
Supports data reporting partners and data integration partners in testing and provides insight into the data structure in the Business Management Model.
Recommend and facilitate system enhancements to improve efficiencies throughout the servicing organization.
Supports internal and external partners in resolving defects by triaging issues, identifying root cause failures, and providing solutions to facilitate a fix.
Ability to learn and master low code platform from the perspective of both end-users and engineers.
Develops and maintains workflow automations and 3rd party integrations via API from data providers and internal data owners.
Coordinates with business partners and vendors to execute requirements.
Supports training teams in understanding the workflow automation.
Support internal customers to provide a positive technical experience.
Interface with other departments as necessary to ensure the smooth operation and growth of the organization.
Designs, documents, manages testing of, and delivers solutions for assigned program modules.
Other projects and assignments as needed.
QUALIFICATIONS AND EXPERIENCE
Bachelor's degree in science or equivalent experience
2-5 years of SaaS application deployment and/or similar experience.
Proficient in a programming/query language.
Able to read and understand API documentation and versed in API authentication methods including OAuth, Basic Auth, tokens, and SAML (a short request sketch follows this list)
Proficient in working with REST APIs and in a major programming language.
Understanding of Mortgage Servicing processes including default servicing
Ability to work in a fast-paced fluid environment.
Excellent communication skills both written and verbal.
Ability to work independently and as a member of various teams and committees.
Commitment to excellence and high standards.
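To illustrate the API expectations above in the most generic way possible, here is a sketch of an OAuth 2.0 client-credentials flow followed by an authenticated REST call; every URL and credential is a placeholder, not a real vendor API.

```python
# Generic, hypothetical OAuth 2.0 + REST sketch; all endpoints are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"   # hypothetical auth server
API_URL = "https://api.example.com/v1/loans/12345"   # hypothetical resource

# Client-credentials grant: exchange an app's id/secret for a bearer token.
token = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "my-client-id",
        "client_secret": "my-client-secret",
    },
    timeout=10,
).json()["access_token"]

# Call the protected resource with the bearer token.
resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=10)
resp.raise_for_status()
print(resp.json())
```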
Staff Data Scientist, AI Products
Remote
Role Description
How many times do you get the opportunity to be on the ground floor of a big and important mission? What if you could be one of the top contributors defining the mission, guiding our teams, and influencing the direction of Dropbox's AI-first journey? As a Staff Data Scientist for this new division, you will get to do exactly that. You will join a team of top-tier data scientists and become an integral part of the product organization, helping to scale this new business.
Joining on the ground floor of this startup team, you'll partner directly with the Head of Data Science and Product, Engineering and Design leadership to shape the product roadmap, foster a top-tier, data-informed culture, and drive real AI/ML impact and execution along the way!
Responsibilities
Partner with Product Engineers and Data Engineers to build the reliable, efficient, and scalable data foundations, tools, and processes to drive our AI/ML capabilities' long-term growth
Leverage data-driven insights to proactively identify the most impactful opportunities, and directly influence product roadmaps and strategies
Perform exploratory and deep-dive analysis to understand user workflows and engagement patterns on AI features, propose hypotheses, and design & execute experiments with great rigor and efficient data techniques
Translate complex data insights into implications and recommendations for the business via excellent communication skills, both verbal and written
Identify what matters most and prioritize ruthlessly for the area you will own
Contribute to a culture of strong technical ownership, coach junior data scientists on the team, and partner with the Head of Data Science to keep evolving the DS working model and elevate DS impact
Work with cross-functional teams (including Product, Engineering, Design, User Research, and senior executives) to rapidly execute and iterate
Requirements
Bachelor's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or a related field
9+ years of experience leveraging data-driven analysis to influence product roadmaps and business decisions, preferably in a tech company
Proven track record of working independently, driving measurable business impact, and proactively engaging with business stakeholders with minimal direction
Proficiency in SQL, Python, or other programming/scripting languages
Deep understanding of statistical analysis, experimentation design, and common analytical techniques like regression and decision trees (a small experiment-analysis sketch follows this list)
Ability to provide data insights and recommendations for 0→1 products even when sample sizes are small
Strong verbal and written communication skills
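As a compact illustration of the experimentation skills listed above, the sketch below analyzes a simulated A/B test of an AI feature with a Welch t-test; the data is synthetic, not Dropbox data.

```python
# Hypothetical experiment analysis: treatment vs. control on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)

# Simulated per-user engagement (e.g., weekly AI-feature actions).
control = rng.normal(loc=5.0, scale=2.0, size=4_000)
treatment = rng.normal(loc=5.2, scale=2.0, size=4_000)

# Welch's t-test avoids assuming equal variances between groups.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
lift = treatment.mean() / control.mean() - 1

print(f"lift: {lift:+.1%}, t={t_stat:.2f}, p={p_value:.4f}")
# A pre-registered alpha (say 0.05) decides whether to ship the variant.
```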
Preferred Qualifications
Experience in startups or building 0→1 products
Expertise in using data to inform AI/ML product development
Background in SaaS product and growth analytics
Compensation
US Zone 1: $219,300 - $296,700 USD
US Zone 2: $197,400 - $267,000 USD
US Zone 3: $175,400 - $237,400 USD
Staff Data Scientist
Remote
Role Description
We're looking for a Staff Data Scientist to partner with product, engineering, and design teams to answer key questions and drive impact in the Core Experience and Artificial Intelligence (AI) areas. This area focuses on improving key parts of the core product by re-envisioning the home experience, the cross-platform experience, and user onboarding, building new functionality, and launching high-impact initiatives. We solve challenging problems and boost business growth through a deep understanding of user behaviors with applied analytics techniques and business insights. An ideal candidate should have robust knowledge of consumer lifecycle, behavior analysis, and customer segmentation. We're looking for someone who can bring opinions and strong narrative framing to proactively influence the business.
Responsibilities
Perform analytical deep-dives to analyze problems and opportunities, identify the hypothesis and design & execute experiments
Inform future experimentation design and roadmaps by performing exploratory analysis to understand user engagement behavior and derive insights
Create personalized segmentation strategies leveraging propensity models to enable targeting of offers and experiences based on user attributes (a brief propensity-scoring sketch follows this list)
Identify key trends and build automated reporting & executive-facing dashboards to track the progress of acquisition, monetization, and engagement trends.
Identify opportunities, advocate for new solutions, and build momentum cross-functionally to move ideas forward that are grounded in data.
Monitor and analyze a high volume of experiments designed to optimize the product for user experience and revenue & promote best practices for multivariate experiments
Translate complex concepts into implications for the business via excellent communication skills, both verbal and written
Understand what matters most and prioritize ruthlessly
Work with cross-functional teams (including Data Science, Marketing, Product, Engineering, Design, User Research, and senior executives) to rapidly execute and iterate
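For concreteness, here is a small, hypothetical propensity-scoring sketch in the spirit of the segmentation work above; the features, labels, and segments are simulated placeholders rather than real product data.

```python
# Hypothetical propensity model: score engagement likelihood, then segment.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=3)
users = pd.DataFrame({
    "tenure_days": rng.integers(1, 1_000, size=5_000),
    "files_synced": rng.poisson(20, size=5_000),
    "past_upgrade": rng.integers(0, 2, size=5_000),
})
# Simulated label: whether the user engaged with a past offer.
engaged = (rng.random(5_000) < 0.2 + 0.3 * users["past_upgrade"]).astype(int)

model = LogisticRegression(max_iter=1_000).fit(users, engaged)
users["propensity"] = model.predict_proba(users)[:, 1]

# Decile-based segments: target the top bucket with the premium offer.
users["segment"] = pd.qcut(users["propensity"], q=10, labels=False, duplicates="drop")
print(users.groupby("segment")["propensity"].mean().round(3))
```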
Requirements
Bachelor's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or a related field
8+ years of experience using analytics to drive key business decisions; examples include business/product/marketing analytics, business intelligence, and strategy consulting
Proven track record of being able to work independently and proactively engage with business stakeholders with minimal direction
Significant experience with SQL
Deep understanding of statistical analysis, experimentation design, and common analytical techniques like regression, decision trees
Solid background in running multivariate experiments to optimize a product or revenue flow
Strong verbal and written communication skills
Strong leadership and influence skills
Proficiency in programming/scripting and knowledge of statistical packages like R or Python is a plus
Preferred Qualifications
Product analytics experience in a SaaS company
Master's degree or above in a quantitative discipline: Statistics, Applied Mathematics, Economics, Computer Science, Engineering, or a related field
Compensation
US Zone 1
This role is not available in Zone 1
US Zone 2: $197,400 - $267,000 USD
US Zone 3: $175,400 - $237,400 USD
Data Engineer, Senior - Vital CDM (Vitalware)
Remote
Join one of the nation's leading and most impactful health care performance improvement companies. Over the years, Health Catalyst has achieved and documented clinical, operational, and financial improvements for many of the nation's leading healthcare organizations. We are also increasingly serving international markets. Our mission is to be the catalyst for massive, measurable, data-informed healthcare improvement through:
Data: integrate data in a flexible, open & scalable platform to power healthcare's digital transformation
Analytics: deliver analytic applications & services that generate insight on how to measurably improve
Expertise: provide clinical, financial & operational experts who enable & accelerate improvement
Engagement: attract, develop and retain world-class team members by being a best place to work
Data Engineer, Senior - Vital CDM (Vitalware)
Department: Product Development
Reports To: Manager, Data Engineering
Employment Type: Full-time
Location: Remote, US
Position Overview
The Senior Data Engineer supports the Product Development department and is responsible for working with a team of web application and data engineers to implement database solutions for long-term scalability, reliability, and performance in a multi-platform, SaaS environment, leveraging both RDBMS and NoSQL solutions. The position requires cross-team communication, attention to detail, and the ability to develop innovative technologies and approaches for building highly available data persistence systems. The Senior Data Engineer takes direction from the Manager, Data Engineering.
This position includes helping scale and refactor an existing public facing website database and related services, moving resources to Azure, creating and releasing new features for the product, and managing related backend data services. This is a small dynamic team and a great opportunity to shape the future of a meaningful product. You will be expected to help support and expand the current product while thinking about scale and the future with Azure PaaS.
What you'll own in this role:
Implement features in collaboration with Product Managers and developers within Agile / Scrum methodology.
Build solutions that are automated, scalable, and sustainable while minimizing defects and technical debt.
Evaluate and analyze the current system architecture. Create scalable solutions to improve uptime and responsiveness while providing direction for future state.
Drive development effort End-to-End for on-time delivery of high-quality solutions that conform to requirements and comply with all applicable standards.
Research, identify, analyze, and correct any technical issues in receiving claim transactions and/or provider data.
Resolve complex data issues and perform quality data checks
Receive and understand business requirements and create data mapping specifications
Integrate client's data into our product suite
Maintain and optimize several complex databases.
Investigate and troubleshoot complicated database applications and stability issues.
Ensure MSSQL databases are operational and provide valid and relevant data.
Guide our efforts in all areas of database design, performance, and reliability.
Participate in code reviews that include database changes and effectively communicate issues and risks.
Integrate new products and software packages and ensure data produced is accurate.
Optimize code for maximum scalability and maintainability.
Incorporate unit testing and regression testing to ensure defect-free builds and releases.
What you'll bring to this role:
BS or MS in Computer Science or equivalent professional experience.
6+ years MSSQL Server and/or RDBMS experience with current technology required.
6+ years SQL optimization experience required (Index optimization strategies, Data normalization/de-normalization strategies, Plan analysis, Recompilation, Caching and buffering, Optimization tools including SQL Server Extended Events or similar, Statistics and their role).
3+ years of experience with a high-transaction OLTP environment 4+ TB in size.
A solid understanding of data structures (e.g., XML/SGML/DTD/JSON).
A solid understanding of parsing and transforming JSON data in SQL Server (a short OPENJSON sketch follows this list).
Experience writing complex and efficient SQL stored procedures.
Deep SQL Server working knowledge including order of operations, transactions and concurrency, file tables and security, brokering technologies, transactional replication, indexing strategies and maintenance, backup and recovery models, multi-node clustering and high availability.
Familiar with Git and branching strategies.
Familiar with creating and/or consuming REST APIs using C#, NodeJS, Python, etc.
Familiar with NoSQL (MongoDB and/or Elasticsearch).
Azure knowledge highly desired.
Demonstrable experience implementing enterprise-scale, high volume, high availability systems.
Demonstrated ability to deliver major critical projects.
Experience with Agile and Scrum team development environments
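As a brief illustration of the JSON-in-SQL-Server item above, the sketch below shreds JSON documents into rows with T-SQL's OPENJSON, driven from Python via pyodbc; the DSN, table, and fields are hypothetical placeholders.

```python
# Hypothetical OPENJSON example via pyodbc; DSN and schema are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=example_mssql;UID=app;PWD=secret")  # hypothetical DSN
sql = """
SELECT j.charge_code, j.description, j.price
FROM dbo.raw_documents AS d            -- hypothetical staging table
CROSS APPLY OPENJSON(d.payload)        -- shred each JSON document into rows
WITH (
    charge_code  VARCHAR(32)   '$.code',
    description  NVARCHAR(200) '$.desc',
    price        DECIMAL(10,2) '$.price'
) AS j;
"""
for row in conn.cursor().execute(sql):
    print(row.charge_code, row.price)
```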
Skill and Ability Requirements
Must be well organized, accurate and detail oriented.
Excellent written and verbal communication with technical and non-technical staff.
Ability to work in complex code bases written by others.
Strong organizational, presentation, interpersonal and consultative skills a must.
Ability to manage multiple projects/tasks simultaneously.
Good judgment and decision-making skills.
Enthusiastic about sharing knowledge and experience.
Maintains a positive and results-oriented attitude.
NOTE: This job description is not intended to be all-inclusive. Applicants may perform other related duties as negotiated to meet the ongoing needs of the organization.
The above statements describe the general nature and level of work being performed in this job function. They are not intended to be an exhaustive list of all duties, and indeed additional responsibilities may be assigned by Health Catalyst.
Studies show that candidates from underrepresented groups are less likely to apply for roles if they don't have 100% of the qualifications shown in the job posting. While each of our roles have core requirements, please thoughtfully consider your skills and experience and decide if you are interested in the position. If you feel you may be a good fit for the role, even if you don't meet all of the qualifications, we hope you will apply. If you feel you are lacking the core requirements for this position, we encourage you to continue exploring our careers page for other roles for which you may be a better fit.
At Health Catalyst, we appreciate the opportunity to benefit from the diverse backgrounds and experiences of others. Because of our deep commitment to respect every individual, Health Catalyst is an equal opportunity employer.
Data Engineer at Spiceworks
Who you are:
We are looking to hire an experienced Data Engineer to join our dynamic engineering team. You will be a part of a global data engineering team, spread across Spiceworks Ziff Davis offices in the United States and India, and will play a key role in enabling our data-driven strategy.
As a Data Engineer, you'll work closely with Product Managers, Finance and Business Analysts, Data Scientists, and Software Engineers as you help define our data development roadmap. Eager to jump into a pivotal role that's essential in enabling data-informed decisions? It's time to apply!
What You'll Do:
Collaborate with product team and contribute to software design
Partner or lead in the development of self-monitoring, robust, scalable, batch and streaming ETL processes (a tiny batch-ETL sketch follows this list)
Develop and improve the continuous integration and testing processes
Perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
Contribute to the consolidation of our data and empower other teams to derive value from our warehouse ecosystem
Work closely with stakeholders on the data supply side (e.g. application engineers) and data demand side (e.g., BI analysts, data scientists, product managers, and other engineering teams)
Ensure data privacy and compliance (GDPR), as well as maintaining data dictionaries for governance purposes
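For flavor, here is a deliberately tiny batch-ETL sketch of the kind this role builds out; the paths and columns are hypothetical, and a production job would add the self-monitoring, retries, and CI hooks described above.

```python
# Minimal batch ETL sketch: extract a CSV drop, transform, load Parquet.
# Writing Parquet assumes pyarrow (or fastparquet) is installed.
import pandas as pd

def run_batch(src: str = "landing/events.csv",
              dst: str = "warehouse/events.parquet") -> int:
    # Extract: read the raw landing file.
    df = pd.read_csv(src, parse_dates=["event_ts"])

    # Transform: drop malformed rows and derive a reporting column.
    df = df.dropna(subset=["user_id", "event_ts"])
    df["event_date"] = df["event_ts"].dt.date

    # Load: write an analytics-friendly columnar file.
    df.to_parquet(dst, index=False)
    return len(df)

if __name__ == "__main__":
    print(f"loaded {run_batch()} rows")
```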
What You Need:
2-4 years of experience working on data pipelines and managing the resulting data stores (experience with Snowflake Data Warehouse or Redshift a plus)
Understanding of data modeling, SQL, and query optimization
Hands-on experience in Python
Experience working with structured and unstructured datasets
Skilled problem solver who can identify performance and data quality issues
Academic, certification, or work experience applying analytic skill sets
Academic, certification, or work exposure to managed cloud services (AWS preferred)
Prior work in agile/scrum development environments a plus
Exposure to one or more orchestration technologies a plus (Matillion, Snowflake, PySpark, Redshift, Athena, Airflow, Talend, Kinesis, Hive, Firehose, Kafka, etc.)
Bachelor's or Master's degree in engineering, computer science, or a related field preferred.
Perks and Benefits of working at SWZD
Unlimited flexible time off
Volunteer paid time off
Generous holidays
Recognition Programs - yes there are multiple, AND prizes!
Remote-first work experience and a work from anywhere culture
Weekly coffee chats with leaders
Wellness and mindfulness programs
Career advancement opportunities
Sick days
Generous Paid Parental Leave
Bi-weekly Townhalls
Regional holiday celebrations
Community give-back celebrations
Transportation Assistance for night-shift employees
Unwavering commitment to diversity and inclusion
Employee stock purchase program
***In terms of logistics, all interviews will take place over phone or video due to our Work From Anywhere policy***
Who we are:
Spiceworks Ziff Davis (SWZD) is a trusted global marketplace that connects technology buyers and sellers with the most actionable and precise intent data. We are uniquely positioned to offer tech brands unmatched visibility into accounts that are truly in-market, by leveraging our scale, quality and diversity of intent data. With unparalleled access to the world's most influential technology buyers through a combination of first-party (Community, Tools, Editorial) and third-party intent data, SWZD is a leader in intent-backed, intelligent, omnichannel marketing.
Senior Data Engineer
Remote
Push the boundaries of tech. In your sweatpants.
We're looking for an experienced Senior Data Engineer to help us change how the world works. Here, you'll be part of our Data Engineering & Analytics team, supporting cross-functional groups around the world. The right candidate will develop, review, and maintain data infrastructure and various data flows. You will also develop means to ensure continuous data validation and telemetry for all data processes.
The top creative and technical minds could work anywhere. So why are so many of them choosing Corel? Here are three reasons:
This is the moment. It's an exciting time at Corel, with new leadership, a refreshed brand, and a whole new approach to changing the way the world works. We're at the forefront of a movement, and we want you to ride this wave with us.
We want you to be you. Too often, companies tell you about their culture and then expect you to fit it. Our culture is built from the people who work here. We want you to feel safe to be who you are, take risks, and show us what you've got.
It's your world. We know you have a life. We want to be part of it, but not all of it. At Corel, we're serious about empowering people to work when, how, and where they want. Couch? Sweatpants? Cool with us. We believe that happy employees mean happy customers. That's why we hire amazing people and get out of their way.
Sound good so far? Awesome. Let's talk more about the Senior Data Engineer role and see if we're destined to be together.
As a Senior Data Engineer, you will:
Design, develop and implement large scale, high-volume, high-performance data infrastructure and pipelines for Data Lake and Data Warehouse
Build and implement ETL frameworks to improve code quality and reliability
Build and enforce common design patterns to increase code maintainability
Ensure accuracy and consistency of data processing, results, and reporting
Design cloud-native data pipelines, automation routines, and database schemas that can be leveraged to do predictive and prescriptive machine learning
Communicate ideas clearly, both verbally and through concise documentation, to various business sponsors, business analysts and technical resources
Guide and mentor other Data Engineers as a technical owner of parts of the data platform
What YOU bring to the team:
Expert knowledge of Python
Expert knowledge of SQL
Experience with DevOps mode of work
7+ years of professional experience
5+ years of experience working in data engineering, business intelligence, or a similar role
5+ years of experience in ETL orchestration and workflow management tools like Airflow, Flink, etc. on AWS/GCP
3+ years of experience with distributed data processing tools like Spark, Presto, etc. and streaming technologies such as Kafka/Flink (a short Spark sketch follows this list)
3+ years of experience with Snowflake (preferred) or another big data database platform
3+ years of experience with cloud service providers: Amazon AWS (preferred), or one of the other major clouds
Expertise with container orchestration engines (Kubernetes)
MS in Computer Science, Software Engineering, or relevant field preferred, BS in one of same fields acceptable
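To make the Spark expectation concrete, here is a minimal, hypothetical PySpark rollup job; the lake paths and column names are placeholders, not Corel infrastructure.

```python
# Minimal PySpark rollup sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

events = spark.read.parquet("s3a://example-lake/raw/events/")  # hypothetical path

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "product")
    .agg(F.countDistinct("user_id").alias("dau"), F.count("*").alias("events"))
)

# Partitioned writes keep downstream warehouse loads incremental.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-lake/curated/daily_event_rollup/"
)
spark.stop()
```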
Our Team:
Corel features award-winning solutions that have helped millions of users for 40 years (can you believe it?!).
We offer a fully remote workspace, and we mean it. There is no pressure to work in an office whatsoever.
Hours are flexible! You've worked hard to build your life, and we don't want you to give it up for work.
Our team is growing fast, and there's a ton of energy and a lot of really smart, motivated, fun people ready to welcome you in.
What are you waiting for? Apply now! We can't wait to meet you.
(FYI, we're lucky to have a lot of interest and we so appreciate your application, though please note that we'll only contact you if you've been selected for an interview.)
About Corel
Corel is a beloved and trusted industry titan fueled by make-everything-easier flexibility. With a 40-year legacy of innovation, we understand where you've been, and we're uniquely equipped to get you where you want to be. Our comprehensive collection of creative, collaborative, and productivity solutions propel your teams on their journey. From meeting your deadlines to realizing your dreams, Corel empowers all you do.
Our products enable millions of connected knowledge workers around the world to do great work faster. Our success is driven by an unwavering commitment to deliver a broad portfolio of innovative applications - including CorelDRAW, MindManager, Parallels, and WinZip - to inspire users and help them achieve their goals.
It is our policy and practice to offer equal employment opportunities to all qualified applicants and employees without regard to race, color, age, religion, national origin, sex, political affiliation, sexual orientation, marital status, disability, veteran status, genetics, or any other protected characteristic.
Corel is committed to an inclusive, barrier-free recruitment and selection process and work environment. If you are contacted for a job opportunity, please advise us of any accommodations that are required.
Appropriate accommodation will be provided upon request as required by Federal and Provincial regulations and Company Policy. Any information received relating to accommodation will be treated as confidential.
ETL Architect
Southfield, MI
360 IT Professionals is a Software Development Company based in Fremont, California that offers complete technology services in Mobile development, Web development, Cloud computing and IT staffing. Merging Information Technology skills in all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works along with its clients to deliver high-performance results, based exclusively on each client's one-of-a-kind requirements.
Our services are vast and we produce software and web products. We specialize in Mobile development, i.e. iPhone and Android apps. We use the Objective-C and Swift programming languages to create native applications for iPhone, and we use Android code to develop native applications for Android devices. To create applications that work on cross-platforms, we use a number of frameworks such as Titanium, PhoneGap and JQuery mobile.
Furthermore, we build web products and offer services such as web designing, layouts, responsive designing, graphic designing, web application development using frameworks based on model view controller architecture and content management system. Our services also extend to the domain of Cloud Computing, where we provide Salesforce CRM to effectively manage one's business and ease out all the operations by giving an easy platform. Apart from this, we also provide IT Staffing services that can help your organization to a great extent as you can hire highly skilled personnel through us.
We make sure that we deliver performance driven products that are optimally developed as per your organization's needs. Take a shot at us for your IT requirements and experience a radical change.
Job Description
Position: ETL Architect
Location: Southfield, MI
Duration: Contract to hire
Need candidates on W2 only
15-17 yrs. Experience
· This person will lead teams and work with management and executives
· Must have excellent communication skills
· This person is not hands-on but must be able to speak to and understand how things work (Healthcare)
· Must have 3-4 yrs. as an architect and be able to show their career progression
· Cognos and Business Objects are nice to have
The Informatica ETL Architect has overall responsibility for assessing requirements and defining the strategy, technical architecture, implementation plan, and delivery of data warehouse projects. The Informatica ETL Architect must have prior experience completing successful data warehousing implementations, as well as a broad background in IT application development. This individual is responsible for establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased data warehouse effort, and should have strong professional consulting skills and the ability to communicate well at all levels of the organization.
Top Skill Set:
· Lead ETL architecture and design as well as data flow diagramming
· Define and implement ETL development standards and procedures
· Ensure ETL quality through code reviews, inspections, and knowledge sharing
· At least 12 years of experience with Informatica in a Developer/Tech Lead role
· Knowledge of health care insurance payer data warehousing preferred
· Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
Required Skills/Experience:
· 12 to 16 years of Informatica ETL development experience
· At least 4 years of experience as an Informatica ETL Architect
· At least 8-10 years of experience with Informatica in a Developer/Tech Lead role
· Mastery of data warehousing concepts; the candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects
· MUST HAVE strong SQL skills in an Oracle partitioned environment
· Experience with business intelligence reporting tools like Cognos and Business Objects preferred
· Experience in Oracle database programming using partitioning, materialized views, and OLAP
· Experience in tuning Oracle queries/processes and with performance management tools
Tools & Technologies: Informatica 8.x and above (9.1 preferred), PowerCenter, PowerExchange, Data Quality, Oracle 10g and above, Unix shell scripting (AIX, Linux), scheduling tools (any one of Tivoli, Autosys, and Ctrl-M), SQL
Additional Information
Regards,
Vishal Rana
Talent & Client Acquisition Specialist
Phone: 510 254 3300 Ext 178
ETL Architect
E*Pro Consulting's service offerings include contingent staff augmentation of IT professionals, permanent recruiting, and temp-to-hire. In addition, our industry expertise within the financial services, insurance, telecom, manufacturing, technology, media and entertainment, pharmaceutical, health care, and service industries ensures our services are customized to meet specific needs. For more details please visit our website *****************
We have been retained to provide direct-hire recruiting assistance by one of the world's leading information technology consulting, services, and business process outsourcing organizations, which envisioned and pioneered the adoption of the flexible global business practices that today enable companies to operate more efficiently and produce more value.
Job Title : ETL Lead Designer and Developer
Location : Grand Rapids, MI
Job Type : Full Time
Job Description:
Roles and Responsibilities:
5 to 7 years of hands-on experience in design, development, and implementation of end-to-end ETL solutions for Data Warehouse and Data Marts using Informatica version 9.0 or above.
Provides technical leadership to ETL developers during design, development, testing, and certification.
Well versed in data modeling (conceptual, logical, and physical design), covering both traditional third normal form and dimensional modeling (star, snowflake).
Translates business requirements into functional and technical specifications to enable and shape the design and development process.
Responsible for end-to-end ETL design, development, testing, and certification constructs.
Provides detailed estimates for all deliverables related to data profiling, ETL, testing, and data certification tasks.
Works with the BI Architect/Project Manager to ensure on-time, within-budget delivery of work products.
Seeks opportunities to reuse, share, consolidate, and leverage existing data architectures.
Assumes technical leadership as well as hands-on contribution to design, development, testing, validation, performance tuning, and quality assurance activities related to data sourcing, staging, loading, cleansing, transformation, and integration.
Ensures high data quality and appropriate data integrity throughout the project life cycle.
Qualifications
Technical/Functional Skills:
Insurance industry experience, especially in Property and Casualty domain, is heavily preferred.
Insurance data model experience, such as IAA and ACORD.
Strong knowledge and extensive practical experience with the Informatica ETL tool, DB2 (mainframe) and DB2 UDB database management systems, PL/SQL, and handling flat files (mainframe) with respect to ETL processes.
Additional Information
If you are interested, please send your resume to [email protected], or you can reach me @ ************ X 227
Job Description:
1. 3-5 years in data platform engineering
2. Experience with CI/CD, IaC (Terraform), and containerization with Docker/Kubernetes
3. Hands-on experience building backend applications such as APIs and services
4. Proven track record of building scalable data engineering pipelines using Python, SQL, and dbt Core/Cloud
5. Experience working with MWAA (Airflow) or a similar cloud-based data engineering orchestration tool (see the sketch below)
6. Experience working with cloud ecosystems such as AWS, Azure, or GCP and modern data tools such as Snowflake and Databricks
7. Strong problem-solving skills and the ability to move in a fast-paced environment are a plus
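By way of illustration of items 4 and 5 (Airflow orchestrating a dbt build), here is a minimal sketch for Airflow 2.x. The DAG id, schedule, and dbt project path are hypothetical, not taken from the posting.

# Minimal Airflow DAG sketch: one task driving a dbt build.
# The dag_id, schedule, and dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run all dbt models; assumes a dbt project at /opt/dbt/project.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/project",
    )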
JD:
• Experience with big data processing and distributed computing systems like Spark (a minimal sketch follows this list).
• Implement ETL pipelines and data transformation processes.
• Ensure data quality and integrity in all data processing workflows.
• Troubleshoot and resolve issues related to PySpark applications and workflows.
• Understand sources, dependencies, and data flow from converted PySpark code.
• Strong programming skills in Python and SQL.
• Experience with big data technologies like Hadoop, Hive, and Kafka.
• Understanding of data warehousing concepts, relational databases, and SQL.
• Demonstrate and document code lineage.
• Integrate PySpark code with frameworks such as Ingestion Framework, DataLens, etc.
• Ensure compliance with data security, privacy regulations, and organizational standards.
• Knowledge of CI/CD pipelines and DevOps practices.
• Strong problem-solving and analytical skills.
• Excellent communication and leadership abilities.
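As a rough, hypothetical illustration of the PySpark ETL work the bullets describe (extract, transform with basic data-quality filtering, load), the input path, column names, and output location below are invented, not from the posting.

# Minimal PySpark ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw events (hypothetical path and schema).
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: basic data-quality filtering and a daily aggregate.
clean = raw.dropDuplicates(["event_id"]).filter(F.col("event_ts").isNotNull())
daily = clean.groupBy(F.to_date("event_ts").alias("event_date")).count()

# Load: write the aggregate back out, partitioned by date.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)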
Analytics, Data Science and IoT Engineer
Remote
Role: Analytics, Data Science and IoT Engineer
Responsibilities
* Understanding the requirement and the ability to relate it to statistical algorithms
* Knowing the acceptance criteria and ways to achieve the same
* Complete understanding of business processes and data
* Performing EDA (exploratory data analysis), cleansing, data preprocessing, and data munging, and creating training data sets (see the sketch below)
* Using the right statistical models and other statistical methods
* Deploying the statistical model using the technology of the customer's preference
* Building data pipelines and machine learning pipelines, with monitoring activities set up for continuous integration, continuous development, and continuous testing
* Investigating the statistical model and providing resolution when there are data drift or performance issues
The Role offers
* The opportunity to join a global team to do meaningful work that contributes to global strategy and individual development
* The chance to re-imagine, redesign, and apply technology to add value to the business and operations
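As a loose sketch of the EDA and training-set preparation step named above, using pandas; the dataset and column names ("sensor_readings.csv", "device_id", "temp_c") are invented for illustration.

# Minimal EDA / training-set prep sketch; file and column names are
# hypothetical.
import pandas as pd

df = pd.read_csv("sensor_readings.csv")

# Exploratory data analysis: shape, types, and summary statistics.
print(df.shape)
print(df.dtypes)
print(df.describe())

# Cleansing / preprocessing: drop duplicates, require a device id,
# and impute numeric gaps with the column median.
df = df.drop_duplicates().dropna(subset=["device_id"])
df["temp_c"] = df["temp_c"].fillna(df["temp_c"].median())

# Create training data sets (simple random split; time-based splits
# are often more appropriate for sensor data).
train = df.sample(frac=0.8, random_state=42)
holdout = df.drop(train.index)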