Sr. Software Development Engineer, Annapurna Labs
Data engineer job in Austin, TX
In this role you will be responsible for leading a technical team that is critical in providing compute sanitization for the Neuron ML accelerator fleet. You will work closely with the hardware and software teams to ensure the right tools are available for identifying defects or faulty hardware states before customers encounter an issue. The Neuron Compute Sanitizer Tools team develops and maintains a pre-check and functional-correctness checking suite and provides fleet-level visibility into hardware/software sanitization trends.
Key job responsibilities
* Provide technical leadership to the Compute Sanitization team
* Work closely with the hardware and firmware design teams.
* Collect requirements from various other teams including training, inference and runtime.
* Collaborate with the runtime team to ensure timely release of the pre-check tools.
* Anticipate future needs based on the product roadmap and develop necessary tools to sanitize compute.
About the team
Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we're building an environment that celebrates knowledge-sharing and mentorship. Our senior members enjoy one-on-one mentoring and thorough, but kind, code reviews. We care about your career growth and strive to assign projects that help you develop your engineering expertise so you feel empowered to take on more complex tasks in the future.
Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon conferences, inspire us to never stop embracing our uniqueness.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of your life at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
About Amazon Annapurna Labs:
The Amazon Annapurna Labs team (our organization within AWS UC) is responsible for building innovation in silicon and software for our AWS customers. We are at the forefront of innovation, combining cloud scale with the world's most talented engineers. Our team covers multiple disciplines including silicon engineering, hardware design, software, and operations. Because of our team's breadth of talent, we have been able to improve AWS cloud infrastructure in high-performance machine learning with AWS Neuron, Inferentia, and Trainium ML chips; in networking and security with products such as AWS Nitro, the Elastic Network Adapter (ENA), and the Elastic Fabric Adapter (EFA); and in computing with AWS Graviton and F1 EC2 instances.
About AWS Utility Computing (UC):
AWS Utility Computing (UC) provides product innovations that continue to set AWS's services and features apart in the industry. As a member of the UC organization, you'll support the development and management of Compute, Database, Storage, Platform, and Productivity Apps services in AWS, including support for customers who require specialized security solutions for their cloud services. Additionally, this role may involve exposure to and experience with Amazon's growing suite of generative AI services and other cloud computing offerings across the AWS portfolio.
About AWS
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
About AWS Neuron:
AWS Neuron is the software development kit (SDK) for Trainium and Inferentia, the AWS machine learning chips. Inferentia delivers best-in-class ML inference performance at the lowest cost in the cloud to our AWS customers. Trainium is designed to deliver best-in-class ML training performance at the lowest training cost in the cloud, and it is all enabled by AWS Neuron. Neuron includes an ML compiler and native integrations into popular ML frameworks. Our products are used at scale by external customers such as Anthropic and Databricks, as well as internal customers such as Alexa, Amazon Bedrock, Amazon Robotics, Amazon Ads, Amazon Rekognition, and many more.
BASIC QUALIFICATIONS
- 10+ years of engineering experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- Experience as a tech lead of a large group of engineers
PREFERRED QUALIFICATIONS
- Experience designing and developing large-scale, high-traffic applications
- Experience with ML hardware/software
Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit ********************************************************* for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Our compensation reflects the cost of labor across several US geographic markets. The base pay for this position ranges from $151,300/year in our lowest geographic market up to $261,500/year in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Amazon is a total compensation company. Dependent on the position offered, equity, sign-on payments, and other forms of compensation may be provided as part of a total compensation package, in addition to a full range of medical, financial, and/or other benefits. For more information, please visit ******************************************************** This position will remain posted until filled. Applicants should apply via our internal or external career site.
Mobile Engineering
JLL empowers you to shape a brighter way.
Our people at JLL and JLL Technologies are shaping the future of real estate for a better world by combining world class services, advisory and technology for our clients. We are committed to hiring the best, most talented people and empowering them to thrive, grow meaningful careers and to find a place where they belong. Whether you've got deep experience in commercial real estate, skilled trades or technology, or you're looking to apply your relevant experience to a new industry, join our team as we help shape a brighter way forward.
Mobile Engineering - JLL
What this job involves: This position focuses on the hands-on performance of ongoing preventive maintenance and repair work orders across multiple facility locations. You will maintain, operate, and repair building systems including HVAC, electrical, plumbing, and other critical infrastructure components. This mobile role requires you to travel between assigned buildings, conduct facility inspections, respond to emergencies, and ensure all systems operate efficiently to support client occupancy and satisfaction across JLL's building portfolio.
What your day-to-day will look like:
• Perform ongoing preventive maintenance and repair work orders on facility mechanical, electrical and other installed systems, equipment, and components.
• Maintain, operate, and repair all HVAC systems and associated equipment, electrical distribution equipment, plumbing systems, building interior/exterior repair, and related grounds.
• Conduct assigned facility inspections and due diligence efforts, reporting conditions that impact client occupancy and operations.
• Respond effectively to all emergencies and after-hours building activities as required.
• Prepare and submit summary reports to management listing conditions found during assigned work and recommend corrective actions.
• Study and maintain familiarity with building automation systems, fire/life safety systems, and other building-related equipment.
• Maintain compliance with all safety procedures, recognize hazards, and propose elimination methods while adhering to State, County, or City Ordinances, Codes, and Laws.
Required Qualifications:
• Valid state driver's license and Universal CFC Certification.
• Minimum four years of technical experience in all aspects of building engineering with strong background in packaged and split HVAC units, plumbing, and electrical systems.
• Physical ability to lift up to 80 lbs and climb ladders up to 30 ft.
• Ability to read schematics and technical drawings.
• Availability for on-call duties and overtime as required.
• Must pass background, drug/alcohol, and MVR screening process.
Preferred Qualifications:
• Experience with building automation systems and fire/life safety systems.
• Knowledge of CMMS systems such as Corrigo for work order management.
• Strong troubleshooting and problem-solving abilities across multiple building systems.
• Experience working in commercial building environments.
• Commitment to ongoing safety training and professional development.
Location: Mobile position covering Austin, TX and surrounding area.
Work Shift: Standard business hours with on-call availability
#HVACjobs
This position does not provide visa sponsorship. Candidates must be authorized to work in the United States without employer sponsorship.
Location:
On-site - Austin, TX
If this job description resonates with you, we encourage you to apply, even if you don't meet all the requirements. We're interested in getting to know you and what you bring to the table!
Personalized benefits that support personal well-being and growth:
JLL recognizes the impact that the workplace can have on your wellness, so we offer a supportive culture and comprehensive benefits package that prioritizes mental, physical and emotional health. Some of these benefits may include:
401(k) plan with matching company contributions
Comprehensive Medical, Dental & Vision Care
Paid parental leave at 100% of salary
Paid Time Off and Company Holidays
Early access to earned wages through Daily Pay
JLL Privacy Notice
Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely.
For more information about how JLL processes your personal data, please view our Candidate Privacy Statement.
For additional details please see our career site pages for each country.
For candidates in the United States, please see a full copy of our Equal Employment Opportunity policy here.
Jones Lang LaSalle (“JLL”) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process - including the online application and/or overall selection process - you may email us at ******************. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page > I want to work for JLL.
Accepting applications on an ongoing basis until candidate identified.
Sr Data Engineer, Test Automation, AI/ML Systems
Primary Responsibilities:
Design and implement automated test suites for AI/ML workflows that enable Optum to derive information from patient data in a scalable, reliable, and cost-effective manner using AI
Analyze clinical data and determine the best designs for verifying that this data is processed correctly
Perform functional, regression, integration, and system testing to validate software quality
Identify, create, and track reproducible defects using bug-tracking tools, collaborating with developers for resolution
Conduct performance and load testing to identify bottlenecks and measure throughput
Integrate automated tests into our AWS and GCP accounts to achieve continuous integration (CI/CD)
As an advocate for product quality, ensure that team members adopt agile Scrum methodologies and perform unit testing and code reviews
Review and contribute to engineering specifications to ensure that each feature/user story can be tested in an automated fashion
Write integration test plans with end-to-end test automation as a goal
Mentor junior engineers who may write test cases
Design, develop, and deploy AI-powered solutions using no-code, low-code, and advanced platforms, translating business needs into scalable applications that enhance products, workflows and decision-making
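The test-automation responsibilities above can be sketched in miniature. The example below is a hypothetical illustration in plain Python: all names (`score_patient_record`, the record schema, the 4.0 threshold) are invented for this sketch and are not Optum's actual API; a real suite would run such checks under pytest against the deployed workflow.

```python
# Hypothetical sketch: an automated regression check for an ML scoring
# step, in the spirit of the responsibilities above. All names and the
# threshold are illustrative, not a real Optum API.

def score_patient_record(record):
    """Toy stand-in for an AI/ML inference step: flags records
    whose lab value exceeds a threshold."""
    if "lab_value" not in record:
        raise ValueError("missing lab_value")
    return {"id": record["id"], "flagged": record["lab_value"] > 4.0}

def run_regression_suite(records):
    """Validate outputs against invariants rather than exact values,
    which keeps the suite stable as the model evolves."""
    results = [score_patient_record(r) for r in records]
    assert len(results) == len(records)                        # no dropped records
    assert all(set(r) == {"id", "flagged"} for r in results)   # schema check
    assert all(isinstance(r["flagged"], bool) for r in results)
    return results

sample = [{"id": 1, "lab_value": 5.2}, {"id": 2, "lab_value": 3.1}]
out = run_regression_suite(sample)
print(out)  # [{'id': 1, 'flagged': True}, {'id': 2, 'flagged': False}]
```

Asserting on invariants (record count, schema, types) rather than exact model scores is a common way to keep regression suites from breaking on every retrain.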
You'll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role as well as provide development for other roles you may be interested in.
Required Qualifications:
Master's degree in Computer Science, Engineering, or a related technical field
4+ years of work experience in AI/ML engineering with solid proficiency in Python programming
2+ years of experience as an SDET using test automation tools like Selenium, Cypress or similar tools
Experience in at least one project using API testing tools like Selenium, Pytest, Postman or similar
Delivered at least one product hosted on a cloud platform (AWS, GCP, or Azure)
Preferred Qualifications:
AWS experience such as RDS, ALB, Redis Cache, Secrets Manager, EC2, IAM
Experience in deploying generative AI systems (LLMs, Prompts, Agentic systems)
Experience with Rally
Experience integrating with healthcare data systems and working with clinical data formats
Solid communication and collaboration skills, with the ability to thrive in fast-paced and ambiguous environments
Solid analytical skills and ability to debug issues in complex systems
Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform, CloudFormation, GitHub Actions, Azure DevOps)
*All employees working remotely will be required to adhere to UnitedHealth Group's Telecommuter Policy.
Pay is based on several factors including but not limited to local labor markets, education, work experience, certifications, etc. In addition to your salary, we offer benefits such as a comprehensive benefits package, incentive and recognition programs, equity stock purchase, and 401k contribution (all benefits are subject to eligibility requirements). No matter where or when you begin a career with us, you'll find a far-reaching choice of benefits and incentives. The salary for this role will range from $89,900 to $160,600 annually based on full-time employment. We comply with all minimum wage laws as applicable.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
UnitedHealth Group is an Equal Employment Opportunity employer under applicable law and qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations.
UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
Applied Data Scientist/ Data Science Engineer
Role: Applied Data Scientist/ Data Science Engineer
Years of experience: 8+
Job type: Full-time
Job Responsibilities:
You will be part of a team that innovates and collaborates with internal stakeholders to deliver world-class solutions with a customer first mentality. This group is passionate about the data science field and is motivated to find opportunity in, and develop solutions for, evolving challenges.
You will:
Solve business and customer issues utilizing AI/ML - Mandatory
Build prototypes and scalable AI/ML solutions that will be integrated into software products
Collaborate with software engineers, business stakeholders and product owners in an Agile environment
Have complete ownership of model outcomes and drive continuous improvement
Essential Requirements:
Strong coding skills in Python and SQL - Mandatory
Machine Learning knowledge (Deep Learning, Information Retrieval (RAG), GenAI, Classification, Forecasting, Regression, etc. on large datasets) with experience in ML model deployment
Ability to work with internal stakeholders to translate business questions into quantitative problem statements
Ability to effectively communicate data science progress to non-technical internal stakeholders
Ability to lead a team of data scientists is a plus
Experience with Big Data technologies and/or software development is a plus
Senior Data Engineer
We are looking for a seasoned Azure Data Engineer to design, build, and optimize secure, scalable, and high-performance data solutions within the Microsoft Azure ecosystem. This will be a multi-year contract worked FULLY ONSITE in Austin, TX.
The ideal candidate brings deep technical expertise in data architecture, ETL/ELT engineering, data integration, and governance, along with hands-on experience in MDM, API Management, Lakehouse architectures, and data mesh or data hub frameworks. This position combines strategic architectural planning with practical, hands-on implementation, empowering cross-functional teams to leverage data as a key organizational asset.
Key Responsibilities
1. Data Architecture & Strategy
Design and deploy end-to-end Azure data platforms using Azure Data Lake, Azure Synapse Analytics, Azure Databricks, and Azure SQL Database.
Build and implement Lakehouse and medallion (Bronze/Silver/Gold) architectures for scalable and modular data processing.
Define and support data mesh and data hub patterns to promote domain-driven design and federated governance.
Establish standards for conceptual, logical, and physical data modeling across data warehouse and data lake environments.
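As a rough illustration of the medallion (Bronze/Silver/Gold) pattern named above, here is a minimal, tool-agnostic sketch in plain Python. In practice each layer would be a Delta Lake table processed in Databricks or Synapse; the sample schema and values here are invented for the illustration.

```python
# Tool-agnostic sketch of the medallion (Bronze/Silver/Gold) pattern.
# Layer names match the architecture above; the order data is made up.

bronze = [  # raw, as-ingested records (may contain duplicates / bad rows)
    {"order_id": 1, "region": "TX", "amount": "100.0"},
    {"order_id": 1, "region": "TX", "amount": "100.0"},   # duplicate
    {"order_id": 2, "region": "CA", "amount": None},      # unusable record
    {"order_id": 3, "region": "TX", "amount": "50.5"},
]

# Silver: deduplicated, typed, validated records
seen, silver = set(), []
for row in bronze:
    if row["order_id"] in seen or row["amount"] is None:
        continue
    seen.add(row["order_id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: business-level aggregate (revenue per region)
gold = {}
for row in silver:
    gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]

print(gold)  # {'TX': 150.5}
```

The point of the layering is that each stage only ever reads from the one before it, so raw data is preserved (Bronze) while downstream consumers get cleaned (Silver) and aggregated (Gold) views.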
2. Data Integration & Pipeline Development
Develop and maintain ETL/ELT pipelines using Azure Data Factory, Synapse Pipelines, and Databricks for both batch and streaming workloads.
Integrate diverse data sources (on-prem, cloud, SaaS, APIs) into a unified Azure data environment.
Optimize pipelines for cost-effectiveness, performance, and scalability.
3. Master Data Management (MDM) & Data Governance
Implement MDM solutions using Azure-native or third-party platforms (e.g., Profisee, Informatica, Semarchy).
Define and manage data governance, metadata, and data quality frameworks.
Partner with business teams to align data standards and maintain data integrity across domains.
4. API Management & Integration
Build and manage APIs for data access, transformation, and system integration using Azure API Management and Logic Apps.
Design secure, reliable data services for internal and external consumers.
Automate workflows and system integrations using Azure Functions, Logic Apps, and Power Automate.
5. Database & Platform Administration
Perform core DBA tasks, including performance tuning, query optimization, indexing, and backup/recovery for Azure SQL and Synapse.
Monitor and optimize cost, performance, and scalability across Azure data services.
Implement CI/CD and Infrastructure-as-Code (IaC) solutions using Azure DevOps, Terraform, or Bicep.
6. Collaboration & Leadership
Work closely with data scientists, analysts, business stakeholders, and application teams to deliver high-value data solutions.
Mentor junior engineers and define best practices for coding, data modeling, and solution design.
Contribute to enterprise-wide data strategy and roadmap development.
Required Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related fields.
5+ years of hands-on experience in Azure-based data engineering and architecture.
Strong proficiency with the following:
Azure Data Factory, Azure Synapse, Azure Databricks, Azure Data Lake Storage Gen2
SQL, Python, PySpark, PowerShell
Azure API Management and Logic Apps
Solid understanding of data modeling approaches (3NF, dimensional modeling, Data Vault, star/snowflake schemas).
Proven experience with Lakehouse/medallion architectures and data mesh/data hub designs.
Familiarity with MDM concepts, data governance frameworks, and metadata management.
Experience with automation, data-focused CI/CD, and IaC.
Thorough understanding of Azure security, RBAC, Key Vault, and core networking principles.
What We Offer
Competitive compensation and benefits package
Luna Data Solutions, Inc. (LDS) provides equal employment opportunities to all employees. All applicants will be considered for employment. LDS prohibits discrimination and harassment of any type regarding age, race, color, religion, sexual orientation, gender identity, sex, national origin, genetics, protected veteran status, and/or disability status.
Data Engineer III
Data Engineer III
Duration: Contract
We are seeking a highly skilled and experienced Data Engineer III to join our team in Austin, Texas. The ideal candidate will be responsible for designing, developing, and maintaining data pipelines and systems to support our organization's data needs. This role requires a deep understanding of data engineering principles, strong problem-solving skills, and the ability to work collaboratively in a fast-paced environment.
Responsibilities:
Design, develop, and maintain scalable data pipelines and systems.
Collaborate with cross-functional teams to understand data requirements and deliver solutions.
Optimize and improve data workflows for efficiency and reliability.
Ensure data quality and integrity through robust testing and validation processes.
Monitor and troubleshoot data systems to ensure smooth operations.
Stay updated with the latest trends and technologies in data engineering.
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Data Engineer or in a similar role.
Strong proficiency in programming languages such as Python, Java, or Scala.
Experience with big data technologies like Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems.
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
Excellent problem-solving and analytical skills.
Strong communication and teamwork abilities.
About PTR Global: PTR Global is a leading provider of information technology and workforce solutions. PTR Global has become one of the largest providers in its industry, with over 5000 professionals providing services across the U.S. and Canada. For more information visit *****************
At PTR Global, we understand the importance of your privacy and security. We NEVER ASK job applicants to:
Pay any fee to be considered for, submitted to, or selected for any opportunity.
Purchase any product, service, or gift cards from us or for us as part of an application, interview, or selection process.
Provide sensitive financial information such as credit card numbers or banking information. Successfully placed or hired candidates would only be asked for banking details after accepting an offer from us during our official onboarding processes as part of payroll setup.
Pay Range: $70 - $75
The specific compensation for this position will be determined by a number of factors, including the scope, complexity and location of the role as well as the cost of labor in the market; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits including medical, dental, vision and 401K contributions, as well as any other PTO, sick leave, and other benefits mandated by applicable state or local laws where you reside or work.
If you receive a suspicious message, email, or phone call claiming to be from PTR Global do not respond or click on any links. Instead, contact us directly at ***************. To report any concerns, please email us at *******************
Junior Data Reporting Engineer
Title: Junior Data Reporting Engineer
MUST
Business Intelligence
SQL
Dashboard Development (Tableau, BO, OBI, Power BI)
Python
Job Description:
Implement and maintain data analysis scripts using SQL and Python.
Develop and support reports and dashboards using Plx Data Studio and Looker.
Improve existing dashboards and develop new ones to support business growth.
Monitor performance and implement the necessary infrastructure optimization.
Demonstrate the ability and willingness to learn quickly and complete large volumes of work with high quality.
Demonstrate excellent collaboration, interpersonal communication, and written skills with the ability to work in a team environment.
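As a minimal sketch of what a "data analysis script using SQL and Python" from the description above can look like, the example below uses the standard-library sqlite3 module in place of BigQuery or Looker; the table and column names are made up for illustration.

```python
# Hedged sketch of a SQL + Python reporting script. sqlite3 stands in
# for BigQuery here; the page_views table is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page_views (day TEXT, dashboard TEXT, views INTEGER);
    INSERT INTO page_views VALUES
        ('2024-01-01', 'sales', 120),
        ('2024-01-01', 'ops',    45),
        ('2024-01-02', 'sales',  98);
""")

# A typical reporting query: total views per dashboard, highest first.
rows = conn.execute("""
    SELECT dashboard, SUM(views) AS total_views
    FROM page_views
    GROUP BY dashboard
    ORDER BY total_views DESC
""").fetchall()

for dashboard, total in rows:
    print(f"{dashboard}: {total}")
# sales: 218
# ops: 45
```

In a real pipeline the query result would feed a Looker or Data Studio dashboard rather than a print loop, but the SQL-in-Python shape is the same.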
Minimum Qualifications:
2+ years of solid hands-on experience with complex SQL scripting and Dashboard development.
Hands-on experience with design, development, and support of data analysis.
Experience with data platform and visualization technologies such as dashboards, Data Studio, Looker, SQL, and BigQuery.
Strong design and development skills with meticulous attention to detail.
Familiarity with Agile Software Development practices and working in an agile environment.
Strong analytical, troubleshooting, and organizational skills.
Ability to analyze and troubleshoot complex issues, and proficiency in multitasking.
BS degree in Computer Science, Math, Statistics, or equivalent academic credentials.
Data Engineer
About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
8+ years designing and delivering scalable data pipelines in modern data platforms
Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
Ability to lead cross-functional initiatives in matrixed teams
Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
Use Apache Airflow and similar tools for workflow automation and orchestration
Work with financial or regulated datasets while ensuring strong compliance and governance
Drive best practices in data quality, lineage, cataloging, and metadata management
Primary Technical Skills
Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
Design efficient Delta Lake models for reliability and performance
Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
Automate ingestion and workflows using Python and REST APIs
Support downstream analytics for BI, data science, and application workloads
Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
Automate DevOps workflows, testing pipelines, and workspace configurations
Additional Skills
Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
CI/CD: Azure DevOps
Orchestration: Apache Airflow (plus)
Streaming: Delta Live Tables
MDM: Profisee (nice-to-have)
Databases: SQL Server, Cosmos DB
Soft Skills
Strong analytical and problem-solving mindset
Excellent communication and cross-team collaboration
Detail-oriented with a high sense of ownership and accountability
SAP Data Architect
This position is open for both full-time and contract and requires Day 1 onsite presence. Candidates must have 13+ years of IT experience.
Expectations / Deliverables for the Role
Builds the SAP data foundation by defining how SAP systems store, share, and manage trusted enterprise data.
Produces reference data architectures by leveraging expert input from application, analytics, integration, platform, and security teams. These architectures form the basis for new solutions and enterprise data initiatives.
Enables analytics and AI use cases by ensuring data is consistent, governed, and discoverable.
Leverages SAP Business Data Cloud, Datasphere, MDG and related capabilities to unify data and eliminate duplicate data copies.
Defines and maintains common data model catalogs to create a shared understanding of core business data.
Evolves data governance, ownership, metadata, and lineage standards across the enterprise.
Protects core transactional systems by preventing excessive replication and extraction loads.
Technical Proficiency
Strong knowledge of SAP master and transactional data domains.
Hands-on experience with SAP MDG, Business Data Cloud, BW, Datasphere, or similar platforms.
Expertise in data modeling, metadata management, data quality, and data governance practices.
Understanding of data architectures that support analytics, AI, and regulatory requirements.
Experience integrating SAP data with non-SAP analytics and reporting platforms.
Soft Skills
Ability to align data and engineering teams around a shared data vision and drive consensus on data standards and decisions.
Strong facilitation skills to resolve data ownership and definition conflicts.
Clear communicator who can explain architecture choices, trade-offs, and cost impacts to stakeholders.
Pragmatic mindset focused on value, reuse, and simplification.
Comfortable challenging designs constructively in ARB reviews.
Senior Software Engineer, Server Control Firmware
Data engineer job in Austin, TX
Annapurna Labs designs silicon and software that accelerates innovation. Customers choose us to create cloud solutions that solve challenges that were unimaginable a short time ago, even yesterday. Our custom chips, accelerators, and software stacks enable us to take on technical challenges that have never been seen before, and deliver results that help our customers change the world.
At Annapurna Labs we are at the forefront of hardware/software co-design, not just within Amazon Web Services (AWS) but across the industry. Our Chassis Software team is looking for candidates interested in diving deep into the hardware technologies that power our Machine Learning servers, and in developing the software and firmware to drive, support, and sustain those technologies as they evolve through concept and manufacturing and finally take their place in our rapidly expanding fleet of the cutting-edge Machine Learning products our customers demand.
Key job responsibilities
- Provide Baseboard Management Controller (BMC) and Satellite Management Controller (SMC) software and firmware for Machine Learning Accelerator (MLA) servers.
- Continuously collaborate with other server and board software teams responsible for accelerator management firmware and other programmable logic devices.
- Work within the larger MLA Systems Software group to support development of mission-mode firmware, exercisers for manufacturing and vetting, and automation for qualification and deployment.
- Engage in new product development by participating in early concept design reviews, schematic approvals, offsite board bringup and laboratory-based testing.
A day in the life
The MLA Chassis Software team was formed to focus on board firmware, primarily for mission-mode control of sensors and other board-level hardware. This includes debug, testing, qualification, and manufacturing. We touch technologies from device drivers to the I2C infrastructure pervasive in the server, and everything in between. We are not working on machine learning algorithms; rather, we work on the physical systems (hardware) that execute and accelerate those algorithms. Data paths, I2C, and device control are our bread and butter. Some of us know what a tensor is, but that's not really what we do.
About the team
Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we're building an environment that celebrates knowledge-sharing and mentorship. Our senior members enjoy one-on-one mentoring and thorough, but kind, code reviews. We care about your career growth and strive to assign projects that help you develop your engineering expertise, so you feel empowered to take on more complex tasks in the future.
Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
About AWS
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
BASIC QUALIFICATIONS
- 5+ years of non-internship professional software development experience
- 5+ years of programming with at least one software programming language experience
- 5+ years of leading design or architecture (design patterns, reliability and scaling) of new and existing systems experience
- 5+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience
- Experience as a mentor, tech lead or leading an engineering team
PREFERRED QUALIFICATIONS
- Bachelor's degree in computer science or equivalent
- Experience writing software for DDR/HBM controllers and PHYs
Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit ********************************************************* for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Our compensation reflects the cost of labor across several US geographic markets. The base pay for this position ranges from $151,300/year in our lowest geographic market up to $261,500/year in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Amazon is a total compensation company. Dependent on the position offered, equity, sign-on payments, and other forms of compensation may be provided as part of a total compensation package, in addition to a full range of medical, financial, and/or other benefits. For more information, please visit ******************************************************** This position will remain posted until filled. Applicants should apply via our internal or external career site.
Data Architect with Snowflake
Data engineer job in Austin, TX
We are seeking a Solutions Cloud Architect to join a team responsible for the technical deployment of Snowflake solutions for customers. The architect will work directly with customers and system integrators to design, deploy, and optimize Snowflake environments. This role requires a self-starter who thrives in a fast-paced, energetic environment, is comfortable with ambiguity, and can bridge customer business challenges with Snowflake-based solutions.
The ideal candidate brings a broad technical skill set, spanning data architecture, ETL, security, performance optimization, and analytics, along with strong customer-facing and leadership capabilities.
Key Responsibilities:
Lead and support Snowflake deployments for customers in collaboration with system integrators.
Translate customer business requirements into scalable and secure Snowflake architectures.
Design and execute proof-of-concepts (POCs) and customer demos.
Provide hands-on guidance for architecture design, implementation, and optimization.
Perform performance analysis, tuning, and cost optimization within Snowflake.
Ensure solutions align with best practices for security, governance, and scalability.
Participate across all phases of the project lifecycle, from conception through implementation.
Contribute to technical documentation and knowledge sharing.
Work effectively in Agile development environments.
Required Skills & Qualifications:
Strong experience as a Software / Cloud Architect, with deep hands-on expertise in Snowflake.
Thorough understanding of the Software Development Life Cycle (SDLC).
Strong knowledge of database programming and SQL, including:
Stored Procedures
Functions
Macros
Triggers
Experience building automated and high-performance data transformations.
Solid understanding of ETL/ELT concepts, data modeling, and analytics workloads.
Excellent problem-solving, communication, and documentation skills.
Comfortable working in Agile methodologies.
Demonstrated leadership and ownership mindset with a strong “can-do” attitude.
Results-oriented with the ability to work independently and collaboratively.
Healthcare Data Architect & Analytics Lead
Data engineer job in Austin, TX
Role: Senior Data Architect & Analytics Lead - Healthcare
Minimum Requirements:
10 years Required: Experience in data architecture, data modeling, and data warehousing (on-prem and cloud technologies)
10 years Required: Experience with business intelligence and big data solutions
10 years Required: Experience with healthcare data standards (FHIR, CCDA, HL7)
10 years Required: Proven experience in leading and managing advanced analytics projects
10 years Required: Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform
10 years Required: Experience in developing, deploying, and managing applications and data storage on cloud platforms
10 years Required: Strong knowledge of data architecture principles and best practices
10 years Required: Proficiency in SQL and other database technologies
10 years Required: Knowledge of advanced analytics tools and strategies
10 years Required: Strong communication skills, with the ability to explain complex technical concepts to non-technical audiences
10 years Required: Ability to lead teams and manage complex projects
10 years Required: Proficiency in cloud infrastructure management, including cloud-based data storage, cloud security, and best practices for deploying applications in the cloud
10 years Required: Develop and maintain enterprise data models, data warehouses, and data lakes to support analytics and reporting
10 years Required: Evaluate and integrate emerging technologies such as AI, machine learning, and big data analytics in healthcare
10 years Required: Work closely with data engineers, developers, and analysts to ensure seamless data accessibility and usability
10 years Required: Experience with ETL tools, data integration platforms, and APIs
10 years Required: Experience with EHR integration, APIs, and real-time streaming
10 years Required: Experience in architecting and designing ETL pipelines using Informatica tools
10 years Required: Design and model data for cloud-based data architecture solutions using Snowflake as the primary data platform
10 years Preferred: Experience in implementing and deploying big data applications
8 years Preferred: Hands-on expertise in Python and/or Java/Scala
8 years Preferred: Experience with structured Enterprise Architecture practices, hybrid cloud deployments, and on-premise-to-cloud migration deployments and roadmaps
8 years Preferred: Healthcare industry experience
8 years Preferred: Fundamental understanding of Information Management principles, IT processes, SDLC, architecture, and organizational technologies
8 years Preferred: Consulting and facilitation skills
8 years Preferred: Customer-focused ability to communicate across all organizational levels
8 years Preferred: Proactive leadership style; self-starter with strong attention to detail
3 years Preferred: The Open Group Architecture Framework (TOGAF) or Cloud Solution Architecture certification on Azure, AWS, and/or GCP
3 years Preferred: Experience in Texas or other state eligibility systems
Senior Data Architect
Data engineer job in Austin, TX
Direct Client: Texas Health and Human Services Commission
Job Title: Senior Data Architect
Duration: 8+ Months
Contract
Hours Per Week: 40 Hr
Interview Type: Webcam or In-Person
Ceipal ID: STX_DATA574_MA
Requirement ID: 529601574
Texas Health and Human Services Commission requires the services of 1 Database Architect 3, hereafter referred to as Candidate(s), who meets the general qualifications of Database Architect 3, Data/Database Administration and the specifications outlined in this document for the Texas Health and Human Services Commission.
Designs and builds relational databases. Develops strategies for data acquisition, archive recovery, and database implementation. Cleans and maintains the database by removing and deleting old data. Must be able to design, develop, and manipulate database management systems, data warehouses, and multidimensional databases. Requires a depth and breadth of database knowledge that supports the formal design of relational databases and provides insight into strategic data manipulation. Responsible for ensuring that an organization's strategic goals are optimized through the use of enterprise data standards; this frequently involves creating and maintaining a centralized registry of metadata. Has working knowledge of EIR Accessibility standards and assistive technologies. Ensures that all user interfaces for database administration and for inputting, viewing, and outputting data (via reports) are compliant with accessibility standards.
The Senior Data Architect & Analytics Lead will play a pivotal role in advancing the data infrastructure and analytics initiatives for the HHSC CDA team. This position is designed for a seasoned professional with in-depth expertise in data design and advanced analytics. Responsibilities include designing and implementing robust data structures to support analytics projects, developing comprehensive database and table designs for optimized data storage and retrieval, and utilizing advanced analytics tools and techniques to extract insights that facilitate decision-making. The successful candidate will collaborate with cross-functional teams to ensure alignment of data strategies with organizational goals and provide thought leadership on data modeling and analytics best practices. The ideal applicant will have proven experience as a Data Architect or similar role, with a strong understanding of data modeling, database design, and data governance principles, alongside familiarity with analytics tools and frameworks. Excellent communication and collaboration skills are essential for effectively working across teams. Join us to be part of a dynamic team dedicated to transforming healthcare through innovative data solutions, where your expertise will significantly contribute to the success of our initiatives at HHSC.
Skills:
10 Required Experience in data architecture, data modeling, and data warehousing on-prem and cloud technologies
10 Required Experience with business intelligence and big data solutions.
10 Required Experience with healthcare data standards such as FHIR, CCDA, and HL7.
10 Required Proven experience in leading and managing advanced analytics projects.
10 Required Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform is required.
10 Required Experience in developing, deploying, and managing applications and data storage on these platforms.
10 Required Strong knowledge of data architecture principles and best practices.
10 Required Proficiency in SQL and other database technologies.
10 Required Knowledge of advanced analytics tools and strategies.
10 Required Strong communication skills, with the ability to explain complex technical concepts to a non-technical audience.
10 Required Ability to lead teams and manage complex projects.
10 Required Proficiency in cloud infrastructure management, including understanding of cloud-based data storage options, cloud security considerations, and best practices for deploying applications in the cloud.
10 Required Develop and maintain enterprise data models, data warehouses, and data lakes to support analytics and reporting.
10 Required Evaluate and integrate emerging technologies such as AI, machine learning, and big data analytics in healthcare.
10 Required Work closely with data engineers, developers, and analysts to ensure seamless data accessibility and usability.
10 Required Experience with ETL tools, data integration platforms, and APIs.
10 Required Experience with EHR integration, APIs, and real-time streaming.
10 Required Experience in architecting and designing ETL pipelines using Informatica tools.
10 Required Design and model data for cloud-based data architecture solutions using Snowflake as the primary data platform.
10 Preferred Experience in implementing and deploying big data applications
8 Preferred Hands-on expertise in Python and/or Java/Scala
8 Preferred Experience with structured Enterprise Architecture practices, hybrid cloud deployments, and on-premise-to-cloud migration deployments and roadmaps
8 Preferred Healthcare industry experience
8 Preferred Fundamental understanding of Information Management principles, IT processes, SDLC, architecture, and the technologies adopted by an organization.
8 Preferred Consulting and Facilitation Skills.
8 Preferred Customer-focused ability to communicate across all levels of the organization.
8 Preferred Proactive Leadership style; self-starter and strong attention to detail.
3 Preferred The Open Group Architecture Framework (TOGAF), Cloud Solution Architecture on Azure, Amazon Web Services (AWS) and/or Google Cloud Platform (GCP) Certification.
3 Preferred Experience in Texas or other state eligibility system
________________________________________________________________________________________________________________________________________________
V Group Inc. is a NJ-based IT services and products company with its business strategically categorized into various business units, including Public Sector, Enterprise Solutions, Professional Services, Ecommerce, Projects, and Products. Within the Public Sector business unit, we provide IT professional services to federal, state, and local government. We hold multiple awards/contracts with 30+ states, including but not limited to NY, CA, FL, GA, MD, MI, NC, OH, OR, CO, CT, TN, PA, TX, VA, NM, VT, and WA.
If you are considering applying for a position with V Group, or partnering with us on a position, please feel free to contact me with any questions you may have regarding our services and the advantages we can offer you as a consultant.
Please share my contact information with others working in Information Technology.
Website: **************************************
LinkedIn: *****************************************
Facebook: *********************************
Twitter: *********************************
Senior Data Architect
Data engineer job in Austin, TX
Experience in data architecture, data modelling, and data warehousing on-prem and cloud technologies
Experience with business intelligence and big data solutions.
Experience with healthcare data standards like FHIR, CCDA, and HL-7 would be a plus.
Proven experience in leading and managing advanced analytics projects.
Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform is required.
Experience in developing, deploying, and managing applications and data storage on these platforms.
Strong knowledge of data architecture principles and best practices.
Proficiency in SQL and other database technologies.
Knowledge of advanced analytics tools and strategies.
Strong communication skills, with the ability to explain complex technical concepts to a non-technical audience.
Ability to lead teams and manage complex projects.
Proficiency in cloud infrastructure management, including understanding of cloud-based data storage options, cloud security considerations, and best practices for deploying applications in the cloud.
Develop and maintain enterprise data models, data warehouses, and data lakes to support analytics and reporting.
Evaluate and integrate emerging technologies such as AI, machine learning, and big data analytics in healthcare.
Work closely with data engineers, developers, and analysts to ensure seamless data accessibility and usability.
Experience with ETL tools, data integration platforms, and APIs.
Experience with EHR integration, APIs, and real-time streaming.
Experience in architecting and designing ETL pipelines using Informatica tools.
Design and model data for cloud-based data architecture solutions using Snowflake as the primary data platform
Head of Data Science & AI
Data engineer job in Austin, TX
Duration: 6 month contract-to-hire
Compensation: $150K-160K
Work schedule: Monday-Friday (8 AM-5 PM CST) - onsite 3x per week
Benefits: This position is eligible for medical, dental, vision and 401(k)
The Head of Data Science & AI leads the organization's data science strategy and team, driving advanced analytics and AI initiatives to deliver business value and innovation. This role sets the strategic direction for data science, ensures alignment with organizational goals, and promotes a data-driven culture. It involves close collaboration with business and technology teams to identify opportunities for leveraging machine learning and AI to improve operations and customer experiences.
Key Responsibilities
Develop and execute a data science strategy and roadmap aligned with business objectives.
Build and lead the data science team, providing mentorship and fostering growth.
Partner with business leaders to identify challenges and deliver actionable insights.
Oversee design and deployment of predictive models, algorithms, and analytical frameworks.
Ensure data integrity, governance, and security in collaboration with engineering teams.
Communicate complex insights to non-technical stakeholders.
Manage infrastructure, tools, and budget for data science initiatives.
Drive experimentation with emerging AI technologies and ensure ethical AI practices.
Oversee full AI model lifecycle: development, deployment, monitoring, and compliance.
Qualifications
8+ years in data science/analytics with leadership experience.
Expertise in Python, R, SQL, and ML frameworks (TensorFlow, PyTorch, Scikit-Learn).
Experience deploying ML models and monitoring performance.
Familiarity with visualization tools (Tableau, Power BI).
Strong knowledge of data governance, advanced statistical methods, and AI trends.
Skills in project management tools (MS Project, JIRA) and software development best practices (CI/CD, Git, Agile).
Please apply directly to be considered.
Senior Software Engineer
Data engineer job in Austin, TX
Sr Software Engineer (Fintech Startup)
Direct Hire W2 (no 3rd parties) - MUST be US Citizen or Green Card Holder
Hybrid - Austin 78701
Required:
5+ years of professional software engineering experience
3+ years in Fintech or Payments
Backend expertise in Python, Node, or Go (no Java)
Strong API development experience
Proven experience designing and scaling cloud-native systems (AWS)
Experience with secure payment processing, reconciliation, and data integrity
Settlement and ledger accuracy experience
PCI DSS/NACHA/SOC 2 implementation experience
Kafka experience
Familiarity with AI/ML model deployment and MLOps best practices
Perks:
100% Company paid benefits (Medical, Dental, Vision)
Competitive base salary + Equity ($150-200k DOE)
Flexible PTO & Hybrid work environment
Annual professional development budget
Senior/Lead Software Engineer (Full-Stack, Rust, DeFi)
Data engineer job in Austin, TX
Senior/Lead Software Engineer (Full-Stack, Rust, DeFi) - Crypto Trading Company
Compensation: $150,000 - $200,000 Base, Total Comp up to $400,000 (with Equity)
Working for a high-growth start-up in the DeFi trading space, building a next-generation on-chain perpetuals trading platform. The focus is on speed, security, and deep liquidity. The company is based in Austin, TX, with plans to expand the team: a small, tight-knit, 10-person group with ex-Coinbase leadership. The token and community launched just two months ago and have already generated $1.3 million in profits.
What You'll Do:
Architect and implement performant on-chain systems, including smart contracts for trading, liquidity, and settlement.
Design secure, gas-efficient, and upgradeable contract systems.
Build backend services for critical functions like order flow, pricing, position data, and funding.
Integrate index price feeds, oracles, and risk-management logic.
Lead technical decisions, code reviews, and help establish best practices for the team.
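The funding mechanics mentioned above can be illustrated with a simplified sketch. This is a generic, textbook-style premium calculation, not this platform's actual formula; the function name and parameters are illustrative assumptions:

```python
def funding_payment(position_size: float, mark_price: float, index_price: float) -> float:
    """Simplified perpetuals funding payment (illustrative only).

    A long position (positive size) pays funding when the mark price
    trades above the index; shorts receive it. Real venues add
    interest-rate terms, clamping, and time-weighted averaging.
    """
    premium_rate = (mark_price - index_price) / index_price
    return position_size * premium_rate

# A 10-unit long with the mark 1% above the index pays 0.1 units of funding.
print(funding_payment(10.0, 101_000.0, 100_000.0))
```

The sign convention keeps the mark price anchored to the index: whichever side benefits from the divergence pays the other.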
What We're Looking For (Must-Haves)
Experience: Ideally six years of experience for senior roles, but we are open to exceptional candidates with less experience (down to two or three years) who demonstrate a strong growth potential and track record.
Technical Stack: Proficiency in Rust is important. Experience working full-stack with a strong focus on the backend is required. Familiarity with Solidity is also required.
AI Tool Mastery (Critical Requirement): Professional use of modern AI tools (e.g., Claude Code, Anti-Gravity) is critical. We expect candidates to be highly skilled at leveraging these tools.
Sector Knowledge: Crypto knowledge is important. Candidates must be familiar with foundational crypto concepts and, ideally, with perpetuals, AMMs, derivatives, or on-chain trading mechanics.
Why Join This Team?
Massive Impact: You will be a lead IC, working directly with the founder to architect and build the entire trading system from the ground up.
Be a Pioneer: Join a profitable company that has already hit key milestones early in its lifecycle.
Competitive Compensation: Earn a highly competitive package, with total compensation reaching up to $400,000, including equity.
Principal DevOps Engineer
Data engineer job in Austin, TX
As a Principal DevOps Engineer, you will lead and drive the mission to make our Product delivery velocity a key competitive advantage in the market. You will be a tech lead and SME for the Enterprise DevSecOps Engineering team to drive the technical roadmap and deliverables as well as partner closely with our product and engineering organizations to innovate at the speed of business with quality & security.
Responsibilities:
Drive engineering solutions aligned with our mission to make our developer experience a key competitive advantage in the market
Drive operational excellence for the connected services that deliver a 24x7x365 operation. Drive improvements and efficiencies in operational practices, tools, and processes.
Collect technical requirements from stakeholders across the company and help build a complete CI/CD strategy and objectives.
Provide product and technical leadership, set goals that produce value for the business, and uphold high technical standards for your team.
Work with the engineering lead to define, communicate, enable, and institutionalize best DevOps practices for product engineering teams through an enterprise-wide engineering system.
Partner with business and technology stakeholders to construct a blueprint that articulates a 3-year, business- and technology-aligned target state and roadmap for the products.
Develop standards and practices for deployments and create reusable interfaces for consumers.
Follow market trends and DevOps movements and apply models of continuous improvement to deployment and tooling.
Understand user needs and prioritize by making data-driven decisions.
Immerse yourself in the company's business vision and strategy and dive into a mission-critical product area to develop a deep understanding of the business and technology domain, including internal and external actors as well as the collection of end-to-end dependencies.
Drive innovation and automation to simplify challenges and enhance developer productivity & experiences across the enterprise.
Participate in strategic planning to achieve technical and business goals.
Work closely with engineers, tech leads, and the Architecture, Security, Platform Engineering, Quality Engineering, Cloud, and Change & Release Management teams to architect and deliver the best technical designs, and partner with cross-functional teams to develop a prioritized roadmap.
Requirements:
12+ years of professional SDLC experience across a diverse set of technology disciplines, including a minimum of 5 years of hands-on experience designing and implementing enterprise CI/CD solutions. Prior software development experience preferred.
5+ years of hands-on experience automating cloud-based applications in AWS.
Minimum 2 years of experience with an orchestration platform (EKS, OpenShift)
Must have worked across the breadth and depth of the DevOps cycle: orchestration and configuration management, CI/CD, monitoring, and security
Hands-on experience managing enterprise DevOps platforms such as GitHub, GitHub Actions, Artifactory, SonarQube, and Octopus Deploy, along with AWS resources and testing and security products
Core Competencies:
Knowledge of basic .NET, Angular, and Python development and deployment constructs
Ability to learn and quickly absorb new material; strong troubleshooting skills
Strong organizational skills and adaptive capacity for rapidly changing priorities and workloads
Ability to work well independently and maintain focus in a highly dynamic work environment.
Have a deep understanding of API design, the difference between platform design and application design, and the tooling to create modern APIs.
Experience managing Dynamic Global configuration for App and environment variables
Experience with many DevOps automation capabilities and practices, such as branching strategy, CI/CD, IaC, everything-as-code, Kubernetes, GitOps, and blue/green and canary deployments
Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams
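One of the practices listed in the competencies above, canary deployment, amounts to routing a small, stable slice of traffic to a new version. A minimal sketch of deterministic weighted routing follows; the hashing scheme, function name, and parameters are generic assumptions, not this company's tooling:

```python
import hashlib

def route_to_canary(request_id: str, canary_weight: float) -> bool:
    """Stable canary routing: hash the request ID into [0, 1) and
    compare against the canary traffic weight. The same ID always
    lands in the same bucket, so a given user consistently sees
    either the canary or the stable version."""
    digest = hashlib.sha256(request_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return bucket < canary_weight

# With weight 0.0 no traffic hits the canary; with 1.0 all of it does.
print(route_to_canary("user-42", 0.0), route_to_canary("user-42", 1.0))
```

In practice this logic lives in a load balancer or service mesh rather than application code, and the weight is ramped up gradually as monitoring confirms the canary is healthy.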
Senior Software Engineer
Data engineer job in Austin, TX
Develop software solutions by studying information needs, conferring with users, and analyzing system flows, data usage, and work processes. Investigate problem areas and prepare and install solutions by determining and designing system specifications, standards, and programming.
The Software Engineer 3 (contractor) position is part of the IT Administrative Applications Robotic Process Automation (RPA) team and will play a crucial role in designing, developing, implementing, and maintaining RPA solutions. The Software Engineer 3 will perform development duties that include solution design, process automation, infrastructure management, collaboration, and continuous improvement.
This contractor role is for a senior robotics developer experienced in technologies such as Blue Prism, Microsoft Power Automate, and Azure. The role will be responsible for developing, testing, and troubleshooting bots that automate various business processes. Hands-on experience with Blue Prism, Microsoft Power Automate, and Azure is required, along with the ability to troubleshoot and validate HTTP responses. Knowledge of the DOM, HTTP protocol, JavaScript, and HTML is also required. This is a fast-paced project working in an Agile (Kanban/Scrum) environment.
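Validating HTTP responses, as the role requires, comes down to checking the status line, headers, and body of each message. A minimal stdlib-only sketch under that assumption (real bot code would use an HTTP client library; the function name is illustrative):

```python
def validate_http_response(raw: str) -> dict:
    """Parse a raw HTTP/1.1 response string into its parts and
    raise if the status code is not a 2xx success."""
    # The head (status line + headers) is separated from the body by a blank line.
    head, _, body = raw.partition("\r\n\r\n")
    status_line, *header_lines = head.split("\r\n")
    _, code, reason = status_line.split(" ", 2)
    if not code.startswith("2"):
        raise ValueError(f"unexpected status {code} {reason}")
    headers = dict(line.split(": ", 1) for line in header_lines)
    return {"status": int(code), "headers": headers, "body": body}

raw = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html></html>"
print(validate_http_response(raw)["status"])  # → 200
```

A bot that asserts on status code and content type like this can fail fast and log a precise reason when an automated step hits an error page instead of the expected markup.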
Required Skills and Qualifications
8 years of hands-on RPA development experience, including at least 4 years specifically using Blue Prism
8 years of experience in web development, including HTML, JavaScript, DOM, and HTTP protocol
8 years of experience working with SQL/NoSQL databases
8 years of experience with RESTful web services.
Preferred Skills
4 years of experience with API development
4 years of experience in process discovery, requirements gathering, and solution architecture
4 years of proficiency in programming languages such as C#, Java, or Python
3 years of experience with Azure infrastructure, including troubleshooting
3 years of experience with software development using Agile methodologies
1 year of experience with Power Automate development
1 year of experience supporting Health and Human Services or OIG
Blue Prism Certification
Microsoft Power Platform Developer Certification.
Senior Java Software Engineer
Data engineer job in Austin, TX
URGENT REQUIREMENT
KINDLY DO NOT SHARE C2C Profiles
Role : Java Developer
Duration : 12+ Months W2 contract
Required Skills:
Strong Java and Spring Boot experience
Experience with REST APIs and Kafka-based event-driven services
Proficient with Hibernate and PostgreSQL
Familiarity with microservices, Git, and unit testing frameworks (JUnit/Mockito)
Problem-solving skills and ability to work in a collaborative environment
Agile/Scrum experience