Mobile Engineer
Dallas, TX
Mobile Engineer/Developer (Flutter or React Native)
TransPerfect is an established company with a start-up culture seeking creative, entrepreneurial people like you to join our team. We're hiring a Mobile Engineer to join our global team and take responsibility for crafting top-tier user experiences on the iOS and Android platforms. This full-time hybrid role demands both independent and collaborative work within a high-impact team focused on delivering innovative products. The ideal applicant will be an expert in building iOS and Android apps with React Native or Flutter and in creating new components from the ground up.
About TransPerfect:
TransPerfect helps organizations navigate the global marketplace. It remains founder-led and has striven to maintain the ethos and drive that have led it to grow organically year-on-year for the past 30 years. Founded in an NYU dorm room in the early 1990s, TransPerfect is now recognized as the largest translation and language services company in the world. From offices in more than 90 cities on six continents, TransPerfect offers a full range of services in 170+ languages to clients worldwide.
You will be part of a technology department made up of a successful, multidisciplinary group of professionals located across the globe - Barcelona, New York, London, Manchester, Bucharest, and India. If you're ready to join a growing company and make an immediate impact, we want to hear from you!
About the team you will be joining at TransPerfect - TechOps:
The TransPerfect TechOps team has been a vital part of the company's success since its formation 10 years ago - delivering technology and services that have drastically simplified the lives of our clients and colleagues - from workflow improvements for colleagues to core services that form the basis of the company's GlobalLink platform, to creating scalable client interfaces that allow novice users to navigate complex ecosystems.
Your responsibilities and tasks will include:
Be the subject matter expert on mobile app development - helping define the team's mobile development practices, design thinking, and deployment processes
Collaborate with product managers, engineers, product designers and QA to define and build compelling user experiences on iOS and Android
Own and lead delivery of major mobile app features, components, foundations, and entire application solutions in partnership with technical leads and senior engineers
Operate in an agile environment, communicate and manage internal and external implementation requirements and expectations
Provide constructive feedback to the technical staff during all phases of the software lifecycle to keep development priorities aligned with business needs
Write clean, efficient, and maintainable code following best practices and coding standards
Stay up-to-date with latest trends, technologies, and best practices in mobile development
The minimum we're looking for (show us what you've got if you aren't a perfect fit):
4+ years of experience in mobile application development and proof of published applications
Proficiency in Flutter, Git, and Azure; solid knowledge of Dart
Deep understanding of mobile app UX/UI best practices
Expertise in CI/CD (continuous integration and deployment), testing, and agile methodologies
Experience deploying applications to Apple App Store and Google Play
Experience with Agile methodologies such as Scrum and Kanban
Excellent communication and collaboration skills with a proven ability to work effectively in a team environment
Why should you join us? Let's see:
Competitive salary in line with your experience and knowledge.
You will work in a dynamic, well-functioning software development division with the world's largest provider of language services and technology solutions.
Significant room for growth.
Why TransPerfect:
For more than 25 years, we have honed a culture where all kinds of ideas are shared and new ventures are not only welcomed, but also encouraged. In this fast-paced environment, employees are intellectually stimulated so they can grow alongside the organization. From Intern to President, we believe that every single employee should have a voice and contribute to the amazing services we offer our clients.
We also offer a comprehensive benefits package including medical, dental, and vision insurance, 401k matching, membership to child-care providers, and other TransPerks. You even get your birthday off because let's face it, we're stoked that you were born.
TransPerfect provides equal employment opportunity to all individuals regardless of their race, color, creed, religion, gender, age, sexual orientation, national origin, disability, veteran status, or any other characteristic protected by state, federal, or local law. TransPerfect is committed to keeping its recruitment processes and workplace free from harassment, sexual harassment, and discrimination.
For more information on the TransPerfect Family of Companies, please visit our website at *********************
Roadway Engineer
Columbus, OH
Korda/Nemeth Engineering, Inc. is seeking a motivated Civil Engineer for an immediate opening in our Public Works group with a focus on Transportation / Roadway projects. If you are interested in joining a team that provides varied and complex engineering services resulting in value to our clients, this opportunity is for you. Our projects and work environment foster individual creativity, collaborative interaction, and teamwork. Our people make for a great place to work!
Founded in 1964, Korda/Nemeth Engineering, Inc. is a multi-disciplinary, nationally recognized consulting engineering firm providing civil, surveying, transportation, electrical, mechanical, structural, and technology engineering services. Our integrated engineering approach and experience make Korda/Nemeth Engineering the consulting engineering firm of choice for a variety of project types including Healthcare, Sports and Recreation, Science and Technology, Higher Education, High-Rise Buildings, Public Agencies involving Transportation, Bridge, Public Works, and more. (Please visit ***************)
Our Public Works group provides design services to state, county, city, and other public agency clients as well as public and private institutions, campuses, private developers, and non-profit organizations. Transportation / Roadway projects within the Public Works group include bridges (new alignment, replacement, rehabilitation), complex urban and rural roadways, streetscapes, intersection improvements and roundabouts, safety improvements, beautification projects, trails/shared use paths, and more. This position involves design, calculation, and plan preparation from concept design through bid drawings with opportunities for construction administration.
To be successful in this position, it is necessary to possess:
Excellent time management and communication skills
Ability to work on multiple projects in a collaborative environment
Good social skills, as you represent our company at client meetings
Willingness to learn new skills and to work towards continuous improvement
Previous experience with public infrastructure projects is preferred but not required
Required Qualifications:
Bachelor's Degree in Civil Engineering
Experience level: 0-5 years
Knowledge of MicroStation/GEOPAK/OpenRoads and/or AutoCAD/Civil 3D is desirable
Must be able to provide proof of eligibility to work in the United States and be authorized to work on a full-time basis. No visa sponsorship is available at this time.
Position includes the following benefits for Regular Full-Time Employees:
Medical and Dental Insurance (employee, spouse, and family)
Life Insurance and Short-Term and Long-Term Disability Insurance
Supplemental Life and Vision Insurance
Paid vacation, sick leave, and eight paid holidays
Salary is commensurate with experience and position. WHAT IS IT LIKE TO WORK AT KORDA? See what our employees and partners have to say: ******************************
IMPORTANT: Please provide a resume and cover letter explaining how your experience and background are a match for our needs.
If you require accommodation to complete an online application, please contact Human Resources at ************.
We are an equal opportunity employer. Applicants and employees are free from discrimination on the basis of race, color, religion, sex (including pregnancy, gender identity, and sexual orientation), parental status, national origin, age, disability, genetic information (including family medical history), political affiliation, military service, pregnancy accommodations, reprisal, or other non-merit based factors.
Observability Engineer
Greenwood Village, CO
Our client, an industry leader in telecommunications, has an excellent contract opportunity for an Observability Engineer. Work will follow a hybrid on-site/remote schedule in Englewood, CO. The Observability Engineer will contribute significantly to planning, implementing, and maintaining system monitoring and observability artifacts for a complex enterprise network, and will collaborate closely with developers to integrate observability, encompassing APM, NPM, SNMP monitoring, log aggregation, JVM monitoring, and network device monitoring.
Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, 401k with company matching, and life insurance.
Rate: $50 - $58/hr. W2
Responsibilities:
This role blends technical proficiency with collaboration, delivering impactful contributions to our systems and infrastructure.
Contribute to revision control using Git, collaborate on network performance monitoring, automate processes through scripting in Bash and Python, and bring expertise in WiFi monitoring and analysis.
Design observability dashboards and alerting for AWS cloud services
Implement WiFi network monitoring serving millions of users
Develop APM, NPM, and SNMP monitoring solutions
Create automated log aggregation and JVM performance monitoring
Collaborate with development teams on observability integration
Automate monitoring workflows using Python and Bash scripting
Maintain monitoring infrastructure using Git version control
Requirements:
Bachelor's degree in Computer Science or equivalent
Experience building dashboards and monitoring solutions with Grafana, Splunk, or Datadog
Basic knowledge in using ticketing and software tools to support current operations.
Basic knowledge of network devices and network appliances.
AWS Cloud Experience
Site Reliability concepts and culture
Please be advised: if anyone reaches out to you about an open position connected with Eliassen Group, please confirm that they have an Eliassen.com email address, and never provide personal or financial information to anyone who is not clearly associated with Eliassen Group. If you have any indication of fraudulent activity, please contact ********************.
Skills, experience, and other compensable factors will be considered when determining pay rate. The pay range provided in this posting reflects a W2 hourly rate; other employment options may be available that may result in pay outside of the provided range.
W2 employees of Eliassen Group who are regularly scheduled to work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), dental, vision, pre-tax accounts, other voluntary benefits including life and disability insurance, 401(k) with match, and sick time if required by law in the worked-in state/locality.
JOB ID: JN -112025-104374
Cloud Engineer
Raleigh, NC
About the Role
We are seeking a skilled Azure Cloud Engineer to join our consulting team. This role is ideal for a proactive professional who thrives in dynamic environments and is passionate about designing, implementing, and maintaining robust cloud solutions. You will play a key role in ensuring operational excellence, scalability, and security across our Azure-based infrastructure.
Responsibilities
Monitor and maintain the health, standards, and stability of IaaS, PaaS, and SaaS environments.
Manage incident response and escalations, ensuring timely resolution.
Implement changes following established change management and vulnerability remediation processes.
Participate in disaster recovery planning and testing.
Research and recommend new cloud technologies to address technical challenges.
Build foundational cloud components including networking, security, identity, operations, and governance.
Transition solutions from proof-of-concept to production using automation for repeatability and compliance.
Design cloud architectures that meet scalability, reliability, and performance requirements.
Conduct capacity planning, profiling, load testing, and monitoring of cloud applications with security best practices.
Oversee day-to-day cloud resource management, including allocation, optimization, and configuration.
Collaborate effectively within a team to achieve project goals.
Qualifications
Bachelor's degree in Computer Science, Systems Engineering, or related field.
4+ years of relevant experience in cloud engineering.
Microsoft Azure certifications (AZ-900, AZ-104, AZ-500, AZ-305, AZ-700).
Strong knowledge of Azure IaaS, PaaS, SaaS services and foundational components (compute, storage, networking, identity, security).
Experience with Azure App Services architecture and deployment.
Hands-on experience with Azure services such as API Management, Database Management, Security, Front Door, Application Gateway, and WAF.
Proficiency with Azure DevOps for CI/CD and SDLC processes.
Familiarity with containerization technologies and configurations.
Expertise in high-availability, geo-distributed architectures, and disaster recovery strategies.
Ability to automate deployments using Terraform, PowerShell, CLI, or Bicep, and manage multiple environments via IaC.
Strong communication skills for technical and business discussions.
Experience with other cloud platforms (AWS, GCP) is a plus.
SAP Basis Engineer
Denver, CO
Our client is seeking a SAP Basis Engineer to join their team! This position is located in Denver, Colorado.
Administer and maintain SAP systems, including S/4HANA, SAP Business Objects, SAP Cloud Platform Integration (CPI), and HANA databases
Perform system upgrades, patching, and kernel updates to ensure system stability and security
Monitor and optimize system performance using SAP HANA Cockpit and other diagnostic tools
Design and implement disaster recovery strategies, including backup, restore, and high-availability solutions
Manage SAP system landscapes, including development, quality assurance, and production environments
Troubleshoot and resolve complex technical issues related to SAP Basis, database, and integration layers
Collaborate with cross-functional teams to support SAP integrations with non-SAP systems via CPI or other middleware
Ensure compliance with security standards, including user access management and role administration
Provide technical expertise for SAP migrations, cloud deployments, and system refreshes
Document system configurations, processes, and procedures for knowledge sharing and audit compliance
Stay updated on SAP technologies and recommend improvements to enhance system performance and scalability
Desired Skills/Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field
7+ years of hands-on SAP Basis administration experience in enterprise environments
3+ years of experience with SAP Business Objects administration
2+ years working with SAP Cloud Platform Integration (CPI) for system integrations
Proven expertise with S/4HANA system administration and HANA database management
Extensive experience with SAP HANA Cockpit for monitoring and administration
Strong proficiency in SQL for database management and performance tuning
Expertise in SAP system patching, upgrades, and kernel maintenance
Knowledge of disaster recovery planning, backup strategies, and high-availability configurations
Familiarity with SAP Solution Manager for system monitoring and change management
Experience with SAP Fiori administration and configuration (preferred)
Understanding of cloud platforms such as AWS, Azure, and Google Cloud for SAP deployments
Knowledge of operating systems such as Linux/UNIX and Windows Server for supporting SAP environments
Familiarity with SAP security concepts, including user management and authorization
Proactive attitude toward learning and adopting new SAP technologies
Experience with SAP system migrations to cloud or hybrid environments
Knowledge of ITIL processes for change, incident, and problem management
Ability to work on-call or outside regular hours for system maintenance and emergency support
Familiarity with SAP NetWeaver and ABAP stack administration (preferred)
Understanding of data archiving and storage optimization in SAP environments
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position is $115,000 - $125,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
IAM Engineer
Chanhassen, MN
Identity & Access Management (IAM) Engineer
Our health and fitness client is seeking a strategic and technically proficient IAM Engineer to lead the design, implementation, and evolution of enterprise-wide identity and access management solutions. This role is ideal for a seasoned professional who combines deep technical expertise with architectural vision and thought leadership. You will play a critical role in shaping the organization's IAM strategy, driving innovation, and ensuring robust security practices as the company scales.
Key Responsibilities
Design and implement secure authentication and authorization frameworks using Azure Active Directory, including policies, conditional access, and MFA.
Lead the configuration and optimization of SailPoint IdentityNow/IdentityIQ for identity lifecycle management, access governance, and provisioning.
Oversee the deployment and integration of Ping Identity solutions (PingFederate, PingAccess, PingID) to support SSO, federation, and adaptive access controls.
Provide architectural guidance and hands-on support for SharePoint access controls and IAM integrations.
Collaborate cross-functionally with IT, security, and business stakeholders to define IAM requirements and translate them into scalable, compliant solutions.
Conduct regular audits, monitor IAM systems, and drive remediation efforts to maintain a strong security posture.
Serve as a subject matter expert and internal consultant on IAM technologies, protocols (SAML, OIDC, OAuth), and best practices.
Stay ahead of industry trends and emerging technologies to continuously enhance IAM capabilities and influence strategic direction.
Mentor junior engineers and contribute to the development of IAM standards, documentation, and governance models.
Qualifications
3+ years of progressive experience in Identity & Access Management engineering, architecture, or administration.
Proven expertise in Azure Active Directory, including advanced policy configuration and conditional access.
Experience utilizing CyberArk
Deep hands-on experience with SailPoint (IdentityNow or IdentityIQ) and Ping Identity platforms.
Strong understanding of authentication protocols (SAML, OIDC, OAuth) and federated identity models.
Experience designing and implementing IAM solutions in cloud-first or hybrid environments.
Familiarity with SharePoint access management and integration with IAM tools.
Demonstrated ability to lead complex IAM projects and influence cross-functional teams.
Excellent communication and stakeholder engagement skills, with the ability to translate technical concepts into business value.
Strategic mindset with a proactive, collaborative, and solution-oriented approach.
Additional Qualifications
Exposure to cloud security architectures and broader cybersecurity domains.
Familiarity with cybersecurity frameworks such as NIST, ISO, or CIS.
Experience contributing to IAM governance, policy development, and risk management initiatives.
AirWatch Engineer
Saint Paul, MN
Description of Project:
Seeking one full-time AirWatch MDM Engineer resource to manage and support the existing enterprise mobile device management (MDM) environment. The ideal resource will have deep expertise in AirWatch, a strong understanding of mobile security and compliance, and a passion for modernizing endpoint management through cloud-native solutions.
At a high level, the resource will maintain the current VMware Workspace ONE (AirWatch) infrastructure while contributing to the strategic planning and execution of a future migration to Microsoft Intune alongside the Endpoint Engineering Team.
The projected hourly range for this position is $80 to $100.
On-Demand Group (ODG) provides employee benefits which includes healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
Mainframe Engineer
Charlotte, NC
• The group is TMS - Transaction Management Services.
• They support a real-time Java application responsible for:
• Money transfers
• ACH processing
• Account balance updates
• This is a critical core banking platform.
• If the application goes down, a Bridge is opened immediately.
Technology Stack
• Built in Java.
• Runs on:
• Mainframe (z/OS)
• zLinux
• Linux
• Application stack includes:
• IBM WebSphere Application Server
• Tomcat libraries
• GitHub (manual processes, not yet cloud-based)
• Messaging & infrastructure:
• IBM MQ Series
• Kafka
• Kubernetes
Cloud & Modernization Efforts
• The platform is not yet on the cloud; several processes remain manual.
• Long-term plan is to migrate to the cloud, and they want talent who can grow with the team through that migration.
• Preference for candidates with:
• Non-mainframe experience
• Modern CI/CD pipeline exposure
• Enthusiasm for AI adoption and leveraging AI to improve workflows.
GCCS Engineer
Colorado Springs, CO
BlueWater Federal is looking for a GCCS Engineer to support the configuration and implementation of the SEWS GCCS-J software and systems, GCCS-J courseware development, and SEWS-specific GCCS-J Operator training for US and FMS personnel, as well as system support to the local Colorado Springs, NATO, and various supported FMS locations.
Responsibilities
Configure and implement the SEWS GCCS-J software and systems, develop GCCS-J courseware, provide SEWS-specific GCCS-J Operator training to US and FMS personnel, and provide system support to the local Colorado Springs, NATO, and various supported FMS locations.
Perform as a SEWS3 GCCS-J technical expert.
Collaborate with SEWS contractor and government personnel to plan OM&S trips and stay apprised of Theater issues.
Assist with the design and analysis of user needs and associated hardware and software recommendations.
Travel up to 30% in a year to Foreign Partner and CCMD locations.
Perform on-site sustainment including but not limited to system operational checkout, system updates, equipment firmware updates, and documentation updates.
Perform system support for remote users to identify and resolve hardware, software, and communication issues, document solutions, and develop recommendations to reduce the frequency of repairs.
Respond to system outages to ensure issues are resolved per contract requirements.
Provide maintenance support for system / equipment issues.
Support Emergency On-Site Sustainment (EOSS) travel to customer locations as required.
Plan, develop and conduct GCCS-J Familiarization Training for CCMD and FMS customers.
Provide refresher training for existing staff as well as training for new operators.
Utilize engineering skills to research, develop, test, and document solutions for Research & Test (R&T) activities.
Qualifications
5+ years of experience in systems administration, Tactical Combat Operations, and GCCS.
Active TS clearance with SCI eligibility
The GCCS-J Engineer will be required to travel locally and internationally, with four or more trips per year
DoD 8570 IAT Level II certification (Security+, CCNA Security, CySA+, GICSP, GSEC, CND, SSCP).
Strong verbal and written communications skills for interaction with senior military and civilian counterparts.
Active US Passport or ability to obtain an Active US Passport
Knowledge of virtualization concepts and products (VMware, Hyper-V).
Knowledge of Microsoft Active Directory (AD) for users and groups.
Knowledge of current Microsoft Operating Systems (Server & Workstation).
Familiarity with Oracle/Sybase/Postgres database maintenance.
Familiarity with Java application servers (Tomcat, JBoss).
Familiarity with Linux/UNIX applications and services (NFS, SSH, NTP, LDAP, HTTP, Ansible).
Partner and Allied nation exercise experience is preferred.
RHEL 8 experience is recommended.
BlueWater Federal is proud to be an Equal Opportunity Employer. All qualified candidates will be considered without regard to race, color, religion, national origin, age, disability, sexual orientation, gender identity, status as a protected veteran, or any other characteristic protected by law. BlueWater Federal is a VEVRAA federal contractor and we request priority referral of veterans.
We offer a competitive health and wellness benefits package, including medical, dental, and vision coverage. Our competitive compensation package includes generous 401k matching, an employee stock purchase program, life insurance options, and paid time off. Salary range: $110-120K
Theater Engineer
Colorado Springs, CO
BlueWater Federal is looking for a Theater Engineer to support the analysis of user needs and develop the design and associated hardware and software recommendations to support the SEWS program.
Responsibilities
• Support the analysis of user needs and develop the design and associated hardware and software recommendations to support those needs.
• Collaborate with SEWS contractor and government personnel to plan routine and emergency trips.
• Provide rotating 24/7 on-call Tier 2 system support for remote users, to identify and resolve hardware, software, and communication issues, document solutions, and develop recommendations to reduce the frequency of repairs.
• Respond to system outages to ensure issues are resolved per contract requirements.
• Support foreign partner system and network installation, maintenance, and sustainment.
• Support Emergency On-Site Sustainment (EOSS) travel to customer locations as required.
• Respond to system component failures or change requests and plan system change or restoral implementation.
• Plan, develop and conduct user training for existing staff as well as new CCMD and FMS users.
• Travel up to 50% in a year to Foreign Partner locations.
• Perform planning and execution for a single or multi-team sustainment and training trip.
• Update the Technical Data Package as required to document the system.
• Perform on-site sustainment including but not limited to system operational checkout, inventory, system updates, equipment firmware updates, and documentation updates.
Qualifications
3+ years of experience in systems administration, Tactical Combat Operations, and GCCS
• Must have an active Top Secret clearance with SCI Eligibility
• Knowledge of virtualization concepts and products (VMware); Microsoft Active Directory (AD) for users and groups; Microsoft Operating Systems (Server & Workstation)
• Familiarity with Oracle/Sybase/Postgres database maintenance; Java application servers (Tomcat, JBoss)
• Familiarity with Linux/UNIX applications and services (NFS, SSH, NTP, LDAP, HTTP, Ansible)
• DoD 8570 IAT Level II certification (Security+, CCNA Security, CySA+, GICSP, GSEC, CND, SSCP)
• Partner and Allied nation exercise experience is desired
BlueWater Federal Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
We offer a competitive health and wellness benefits package, including medical, dental, and vision coverage. Our competitive compensation package includes generous 401k matching, an employee stock purchase program, life insurance options, and paid time off. Salary range: $135-145K
Senior Data Engineer
Nashville, TN
Concert is a software and managed services company that promotes health by providing the digital infrastructure for reliable and efficient management of laboratory testing and precision medicine. We are wholeheartedly dedicated to enhancing the transparency and efficiency of health care. Our customers include health plans, provider systems, laboratories, and other important stakeholders. We are a growing organization driven by smart, creative people to help advance precision medicine and health care. Learn more about us at ***************
YOUR ROLE
Concert is seeking a skilled Senior Data Engineer to join our team. Your role will be pivotal in designing, developing, and maintaining our data infrastructure and pipelines, ensuring robust, scalable, and efficient data solutions. You will work closely with data scientists, analysts, and other engineers to support our mission of automating the application of clinical policy and payment through data-driven insights.
You will be joining an innovative, energetic, passionate team who will help you grow and build skills at the intersection of diagnostics, information technology and evidence-based clinical care.
As a Senior Data Engineer you will:
Design, develop, and maintain scalable and efficient data pipelines using AWS services such as Redshift, S3, Lambda, ECS, Step Functions, and Kinesis Data Streams.
Implement and manage data warehousing solutions, primarily with Redshift, and optimize existing data models for performance and scalability.
Utilize DBT (data build tool) for data transformation and modeling, ensuring data quality and consistency.
Develop and maintain ETL/ELT processes to ingest, process, and store large datasets from various sources.
Work with SageMaker for machine learning data preparation and integration.
Ensure data security, privacy, and compliance with industry regulations.
Collaborate with data scientists and analysts to understand data requirements and deliver solutions that meet their needs.
Monitor and troubleshoot data pipelines, identifying and resolving issues promptly.
Implement best practices for data engineering, including code reviews, testing, and automation.
Mentor junior data engineers and share knowledge on data engineering best practices.
Stay up-to-date with the latest advancements in data engineering, AWS services, and related technologies.
After 3 months on the job you will have:
Developed a strong understanding of Concert's data engineering infrastructure
Learned the business domain and how it maps to the information architecture
Made material contributions towards existing key results
After 6 months you will have:
Led a major initiative
Become the first point of contact when issues related to the data warehouse are identified
After 12 months you will have:
Taken responsibility for the long term direction of the data engineering infrastructure
Proposed and executed key results with an understanding of the business strategy
Communicated the business value of major technical initiatives to key non-technical business stakeholders
WHAT LEADS TO SUCCESS
Self-Motivated: A team player with a positive attitude and a proactive approach to problem-solving.
Executes Well: You are biased toward action and get things done. You acknowledge unknowns and recover from setbacks well.
Comfort with Ambiguity: You aren't afraid of uncertainty and blazing new trails; you care about building toward a future that is different from today.
Technical Bravery: You are comfortable with new technologies and eager to dive in to understand data in the raw and in its processed states.
Mission-Focused: You are personally motivated to drive more affordable, equitable, and effective integration of genomic technologies into clinical care.
Effective Communication: You build rapport and great working relationships with senior leaders and peers, and use the relationships you've built to drive the company forward.
RELEVANT SKILLS & EXPERIENCE
Minimum of 4 years of experience working as a data engineer
Bachelor's degree in software or data engineering or comparable technical certification / experience
Ability to effectively communicate complex technical concepts to both technical and non-technical audiences.
Proven experience in designing and implementing data solutions on AWS, including Redshift, S3, Lambda, ECS, and Step Functions
Strong understanding of data warehousing principles and best practices
Experience with DBT for data transformation and modeling.
Proficiency in SQL and at least one programming language (e.g., Python, Scala)
Familiarity or experience with the following tools/concepts is a plus: BI tools such as Metabase; healthcare claims data, security requirements, and HIPAA compliance; Kimball's dimensional modeling techniques; zero-ETL and Kinesis Data Streams
COMPENSATION
Concert is seeking top talent and offers competitive compensation based on skills and experience. Compensation will be commensurate with experience. This position will report to the VP of Engineering.
LOCATION
Concert is based in Nashville, Tennessee and supports a remote work environment.
For further questions, please contact: ******************.
Senior Data Engineer
Charlotte, NC
**NO 3rd Party vendor candidates or sponsorship**
Role Title: Senior Data Engineer
Client: Global construction and development company
Employment Type: Contract
Duration: 1 year
Preferred Location: Remote based in ET or CT time zones
Role Description:
The Senior Data Engineer will play a pivotal role in designing, architecting, and optimizing cloud-native data integration and Lakehouse solutions on Azure, with a strong emphasis on Microsoft Fabric adoption, PySpark/Spark-based transformations, and orchestrated pipelines. This role will lead end-to-end data engineering, from ingestion through APIs and Azure services to curated Lakehouse/warehouse layers, while ensuring scalable, secure, well-governed, and well-documented data products. The ideal candidate is hands-on in delivery and brings data architecture knowledge to help shape patterns, standards, and solution designs.
Key Responsibilities
Design and implement end-to-end data pipelines and ELT/ETL workflows using Azure Data Factory (ADF), Synapse, and Microsoft Fabric.
Build and optimize PySpark/Spark transformations for large-scale processing, applying best practices for performance tuning (partitioning, joins, file sizing, incremental loads).
Develop and maintain API-heavy ingestion patterns, including REST/SOAP integrations, authentication/authorization handling, throttling, retries, and robust error handling.
Architect scalable ingestion, transformation, and serving solutions using Azure Data Lake / OneLake, Lakehouse patterns (Bronze/Silver/Gold), and data warehouse modeling practices.
Implement monitoring, logging, alerting, and operational runbooks for production pipelines; support incident triage and root-cause analysis.
Apply governance and security practices across the lifecycle, including access controls, data quality checks, lineage, and compliance requirements.
Write complex SQL, develop data models, and enable downstream consumption through analytics tools and curated datasets.
Drive engineering standards: reusable patterns, code reviews, documentation, source control, and CI/CD practices.
Requirements:
Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering with strong focus on Azure Cloud.
Strong experience with Azure Data Factory pipelines, orchestration patterns, parameterization, and production support.
Strong hands-on experience with Synapse (pipelines, SQL pools and/or Spark), and modern cloud data platform patterns.
Advanced PySpark/Spark experience for complex transformations and performance optimization.
Heavy experience with API-based integrations (building ingestion frameworks, handling auth, pagination, retries, rate limits, and resiliency).
Strong knowledge of SQL and data warehousing concepts (dimensional modeling, incremental processing, data quality validation).
Strong understanding of cloud data architectures including Data Lake, Lakehouse, and Data Warehouse patterns.
Preferred Skills
Experience with Microsoft Fabric (Lakehouse/Warehouse/OneLake, Pipelines, Dataflows Gen2, notebooks).
Architecture experience (formal or informal), such as contributing to solution designs, reference architectures, integration standards, and platform governance.
Experience with DevOps/CI-CD for data engineering using Azure DevOps or GitHub (deployment patterns, code promotion, testing).
Experience with Power BI and semantic model considerations for Lakehouse/warehouse-backed reporting.
Familiarity with data catalog/governance tooling (e.g., Microsoft Purview).
Data Engineer
Denver, CO
Data Engineer
Compensation: $80 - $90/hour, depending on experience
Inceed has partnered with a great energy company to help find a skilled Data Engineer to join their team!
Join a dynamic team where you'll be at the forefront of data-driven operations. This role offers the autonomy to design and implement groundbreaking data architectures, working primarily remotely. This position is open due to exciting new projects. You'll be collaborating with data scientists and engineers, making impactful contributions to the company's success.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from various sources ensuring consistency and reliability
Develop automation workflows and BI solutions
Mentor others and advise on data process best practices
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of data engineering experience
Experience with PI
Experience with SCADA
Experience with Palantir
Experience with large oil and gas datasets
Proficiency in Python and SQL
Hands-on experience in cloud environments (Azure, AWS, GCP)
Nice to Have Skills & Experience:
Familiarity with Apache Kafka or Flink
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Data Engineer
Denver, CO
Data Engineer
Compensation: $80 - $90/hour, depending on experience
Inceed has partnered with a great company to help find a skilled Data Engineer to join their team!
Join a dynamic team as a contract Data Engineer, where you'll be the backbone of data-driven operations. This role offers the opportunity to work with a modern tech stack in a hybrid on-prem and cloud environment. You'll design and implement innovative solutions to complex challenges, collaborating with data scientists, location intelligence experts, and ML engineers. This exciting opportunity has opened due to a new project initiative and you'll be making a tangible impact.
Key Responsibilities & Duties:
Design and deploy scalable data pipelines and architectures
Collaborate with stakeholders to deliver high-impact data solutions
Integrate data from multiple sources ensuring quality and reliability
Develop automation workflows and BI solutions
Mentor others and contribute to the knowledge base
Explore and implement emerging technologies
Required Qualifications & Experience:
8+ years of experience in data engineering
Experience with large oil and gas datasets
Proficiency in SQL and Python
Hands-on experience in cloud environments (Azure, AWS, or GCP)
Familiarity with Apache Kafka, Apache Flink, or Azure Event Hubs
Nice to Have Skills & Experience:
Experience with Palantir Foundry
Knowledge of query federation platforms
Experience with modern data stack tools like dbt or Airflow
Perks & Benefits:
3 different medical health insurance plans, dental, and vision insurance
Voluntary and Long-term disability insurance
Paid time off, 401k, and holiday pay
Weekly direct deposit or pay card deposit
If you are interested in learning more about the Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
IT Data Engineer
Lakewood, CO
IT Data Engineer
Compensation: $125k-$155k (DOE)
Inceed has partnered with a great company to help find a skilled IT Data Engineer to join their team!
Join a dynamic team where innovation meets opportunity. This role is pivotal in advancing AI and data modernization initiatives, bridging traditional database administration with cutting-edge AI data infrastructure. The team thrives on collaboration and offers a hybrid work schedule.
Key Responsibilities & Duties:
Design and maintain scalable data pipelines.
Develop RAG workflows for AI information access.
Build secure connectors and APIs for data retrieval.
Monitor and optimize data flows for consistency.
Lead database administration and performance tuning.
Manage database upgrades and storage optimization.
Implement database security controls and standards.
Support application integrations and data migrations.
Define and maintain data models and metadata.
Collaborate with teams to ensure compliance requirements are met.
Required Qualifications & Experience:
Bachelor's degree in Computer Science or related field.
7+ years in database administration or data engineering.
Advanced SQL and data modeling skills.
Experience with AI and analytics data pipelines.
Familiarity with cloud-based data ecosystems.
Hands-on experience with RAG and vectorization.
Proficiency in scripting languages like Python.
Experience leading vendor-to-internal transitions.
Nice to Have Skills & Experience:
Experience integrating enterprise systems into data platforms.
Knowledge of data governance frameworks.
Understanding of semantic data modeling.
Experience with cloud migration of database workloads.
Perks & Benefits:
This opportunity includes a comprehensive and competitive benefits package-details will be shared during later stages of the hiring process.
Other Information:
Hybrid work schedule
This position requires a background check and drug test
If you are interested in learning more about the IT Data Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time.
We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them.
Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Senior Data Engineer
Austin, TX
We are looking for a seasoned Azure Data Engineer to design, build, and optimize secure, scalable, and high-performance data solutions within the Microsoft Azure ecosystem. This will be a multi-year contract worked FULLY ONSITE in Austin, TX.
The ideal candidate brings deep technical expertise in data architecture, ETL/ELT engineering, data integration, and governance, along with hands-on experience in MDM, API Management, Lakehouse architectures, and data mesh or data hub frameworks. This position combines strategic architectural planning with practical, hands-on implementation, empowering cross-functional teams to leverage data as a key organizational asset.
Key Responsibilities
1. Data Architecture & Strategy
Design and deploy end-to-end Azure data platforms using Azure Data Lake, Azure Synapse Analytics, Azure Databricks, and Azure SQL Database.
Build and implement Lakehouse and medallion (Bronze/Silver/Gold) architectures for scalable and modular data processing.
Define and support data mesh and data hub patterns to promote domain-driven design and federated governance.
Establish standards for conceptual, logical, and physical data modeling across data warehouse and data lake environments.
2. Data Integration & Pipeline Development
Develop and maintain ETL/ELT pipelines using Azure Data Factory, Synapse Pipelines, and Databricks for both batch and streaming workloads.
Integrate diverse data sources (on-prem, cloud, SaaS, APIs) into a unified Azure data environment.
Optimize pipelines for cost-effectiveness, performance, and scalability.
3. Master Data Management (MDM) & Data Governance
Implement MDM solutions using Azure-native or third-party platforms (e.g., Profisee, Informatica, Semarchy).
Define and manage data governance, metadata, and data quality frameworks.
Partner with business teams to align data standards and maintain data integrity across domains.
4. API Management & Integration
Build and manage APIs for data access, transformation, and system integration using Azure API Management and Logic Apps.
Design secure, reliable data services for internal and external consumers.
Automate workflows and system integrations using Azure Functions, Logic Apps, and Power Automate.
5. Database & Platform Administration
Perform core DBA tasks, including performance tuning, query optimization, indexing, and backup/recovery for Azure SQL and Synapse.
Monitor and optimize cost, performance, and scalability across Azure data services.
Implement CI/CD and Infrastructure-as-Code (IaC) solutions using Azure DevOps, Terraform, or Bicep.
6. Collaboration & Leadership
Work closely with data scientists, analysts, business stakeholders, and application teams to deliver high-value data solutions.
Mentor junior engineers and define best practices for coding, data modeling, and solution design.
Contribute to enterprise-wide data strategy and roadmap development.
Required Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related fields.
5+ years of hands-on experience in Azure-based data engineering and architecture.
Strong proficiency with the following:
Azure Data Factory, Azure Synapse, Azure Databricks, Azure Data Lake Storage Gen2
SQL, Python, PySpark, PowerShell
Azure API Management and Logic Apps
Solid understanding of data modeling approaches (3NF, dimensional modeling, Data Vault, star/snowflake schemas).
Proven experience with Lakehouse/medallion architectures and data mesh/data hub designs.
Familiarity with MDM concepts, data governance frameworks, and metadata management.
Experience with automation, data-focused CI/CD, and IaC.
Thorough understanding of Azure security, RBAC, Key Vault, and core networking principles.
What We Offer
Competitive compensation and benefits package
Luna Data Solutions, Inc. (LDS) provides equal employment opportunities to all employees. All applicants will be considered for employment. LDS prohibits discrimination and harassment of any type regarding age, race, color, religion, sexual orientation, gender identity, sex, national origin, genetics, protected veteran status, and/or disability status.
Senior Data Analytics Engineer (Customer Data)
Irving, TX
Our client is seeking a Senior Data Analytics Engineer (Customer Data) to join their team! This position is fully remote.
Build, optimize, and maintain customer data pipelines in PySpark/Databricks to support CDP-driven use cases across AWS/Azure/GCP
Transform raw and integrated customer data into analytics-ready datasets used for dashboards, reporting, segmentation, personalization, and downstream AI/ML applications
Develop and enrich customer behavior metrics, campaign analytics, and performance insights such as ad engagement, lifecycle metrics, and retention
Partner with Marketing, Sales, Product, and Data Science teams to translate business goals into metrics, features, and analytical data models
Build datasets consumed by Power BI/Tableau dashboards (hands-on dashboard creation not required)
Ensure high cluster performance and pipeline optimization in Databricks, including troubleshooting skewed joins, sorting, partitioning, and real-time processing needs
Work across multiple cloud and vendor ecosystems, such as AWS/Azure/GCP and Hightouch or comparable CDP vendors
Participate in the data ingestion and digestion phases, shaping integrated data into analytical layers for MarTech and BI
Contribute to and enforce data engineering standards, documentation, governance, and best practices across the organization
Desired Skills/Experience:
6+ years of experience in Data Engineering, Analytics Engineering, or related fields, including data modeling experience
Strong Data Engineering fundamentals with the ability to design pipelines, optimize performance, and deliver real-time or near-real-time datasets
Ability to deeply understand data, identifying gaps, designing meaningful transformations, and creating metrics with clear business context
Understanding of how customer data moves through Customer Data Platforms (CDPs) and how to design pipelines that integrate with them
Experience supporting Marketing, Customer Data, MarTech, CDP, segmentation, or personalization teams strongly preferred
Hands-on experience required with Databricks, PySpark, Python, and SQL, including building analytics datasets for dashboards/reporting and customer behavior analytics or campaign performance insights
Experience designing and implementing features that feed downstream AI or customer-facing applications
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position starts at $150,000-$160,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
Data Engineer
Austin, TX
About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
8+ years designing and delivering scalable data pipelines in modern data platforms
Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
Ability to lead cross-functional initiatives in matrixed teams
Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
Use Apache Airflow and similar tools for workflow automation and orchestration
Work with financial or regulated datasets while ensuring strong compliance and governance
Drive best practices in data quality, lineage, cataloging, and metadata management
Primary Technical Skills
Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
Design efficient Delta Lake models for reliability and performance
Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
Automate ingestion and workflows using Python and REST APIs
Support downstream analytics for BI, data science, and application workloads
Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
Automate DevOps workflows, testing pipelines, and workspace configurations
Additional Skills
Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
CI/CD: Azure DevOps
Orchestration: Apache Airflow (plus)
Streaming: Delta Live Tables
MDM: Profisee (nice-to-have)
Databases: SQL Server, Cosmos DB
Soft Skills
Strong analytical and problem-solving mindset
Excellent communication and cross-team collaboration
Detail-oriented with a high sense of ownership and accountability
Azure Data Engineer
Irving, TX
Our client is seeking an Azure Data Engineer to join their team! This position is located in Irving, Texas. THIS ROLE REQUIRES AN ONSITE INTERVIEW IN IRVING. Please only apply if you are local and available to interview onsite.
Duties:
Lead the design, architecture, and implementation of key data initiatives and platform capabilities
Optimize existing data workflows and systems to improve performance and cost-efficiency, identifying solutions and guiding teams to implement them
Lead and mentor a team of 2-5 data engineers, providing guidance on technical best practices, career development, and initiative execution
Contribute to the development of data engineering standards, processes, and documentation, promoting consistency and maintainability across teams while enabling business stakeholders
Desired Skills/Experience:
Bachelor's degree or equivalent in Computer Science, Mathematics, Software Engineering, Management Information Systems, etc.
5+ years of relevant work experience in data engineering
Strong technical skills in SQL, PySpark/Python, Azure, and Databricks
Deep understanding of data engineering fundamentals, including database architecture and design, ETL, etc.
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position starts at $140,000-$145,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
Ruby on Rails Staff Engineer
Tampa, FL
About Us:
We are working with a mission-driven SaaS company dedicated to keeping people safe. They're passionate about public safety and strive to create innovative solutions that keep their customers happy.
Job Description:
As a Senior Staff Engineer, you will play a crucial role in building and maintaining our cutting-edge web applications. You will work closely with our engineering team to design, develop, and deploy robust, scalable, and user-friendly features.
Onsite in Tampa, FL Area
Up to $220k base salary
Responsibilities:
Design, develop, and maintain backend applications using Ruby on Rails.
Build user-friendly and responsive frontend interfaces using React.
Collaborate with cross-functional teams to define and implement new features.
Write clean, well-tested, and efficient code.
Qualifications:
Strong proficiency in Ruby on Rails, React, and React Native.
Experience with relational databases (e.g., PostgreSQL).
Solid understanding of JavaScript and web development fundamentals.
Familiarity with RESTful APIs and microservices architecture.