Lead DevOps Engineer (Jenkins)
Lead DevOps Engineer job at PRI Technology
Employment Type: Full-Time, Direct Hire
This is not a contract role and is not available for third-party agencies or contractors.
About the Role
We are seeking a hands-on Lead DevOps Engineer to drive the development of a next-generation enterprise pipeline infrastructure. This role is ideal for a technical leader with deep experience building scalable Jenkins environments, defining CI/CD strategy, and promoting DevOps best practices across large organizations. If you thrive in fast-paced, highly collaborative environments and enjoy solving complex engineering challenges, this is an excellent opportunity.
What You'll Do
Lead the design and implementation of a unified enterprise pipeline framework using Jenkins, Octopus Deploy, and related CI/CD tools.
Build, optimize, and maintain a highly scalable Jenkins platform supporting multiple concurrent teams and workloads.
Evaluate emerging CI/CD technologies and lead enterprise-wide adoption initiatives.
Manage and mentor a team of developers and DevOps engineers; foster a culture of operational excellence.
Collaborate with cross-functional stakeholders to gather requirements, align strategy, and advance DevOps maturity.
Enforce Infrastructure-as-Code practices with proper governance, compliance, and audit controls.
Implement monitoring, alerting, and automation to ensure strong operational performance.
Lead incident response efforts; drive root-cause analysis and long-term remediation.
Identify bottlenecks and drive end-to-end automation to improve deployment speed and reliability (see the illustrative sketch after this list).
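To make the automation responsibilities above concrete, here is a minimal, illustrative Python sketch of the kind of glue scripting a Jenkins platform team writes; the controller URL, job name, and credentials are hypothetical placeholders, and it relies only on Jenkins's standard crumb issuer and buildWithParameters endpoints.

# Minimal sketch: trigger a parameterized Jenkins job over the REST API.
# JENKINS_URL, JOB_NAME, and AUTH are placeholders, not real values.
import requests

JENKINS_URL = "https://jenkins.example.com"   # hypothetical controller URL
JOB_NAME = "enterprise-pipeline"              # hypothetical job name
AUTH = ("svc-account", "api-token")           # user + API token

def trigger_build(params: dict) -> int:
    session = requests.Session()
    session.auth = AUTH
    # Fetch a CSRF crumb so the POST is accepted by a hardened controller.
    crumb = session.get(f"{JENKINS_URL}/crumbIssuer/api/json", timeout=10).json()
    headers = {crumb["crumbRequestField"]: crumb["crumb"]}
    resp = session.post(
        f"{JENKINS_URL}/job/{JOB_NAME}/buildWithParameters",
        headers=headers,
        params=params,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.status_code  # 201 means the build was queued

if __name__ == "__main__":
    print(trigger_build({"ENVIRONMENT": "staging"}))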
What You Bring
Strong expertise with Jenkins, Octopus Deploy, and modern CI/CD ecosystems.
Hands-on experience with AWS or Azure, Docker, Kubernetes, Terraform, and IaC principles.
Strong programming skills (Python, Node.js) and solid Git fundamentals.
3+ years of experience leading technical teams and delivering complex solutions.
Experience with Software-Defined Networks, VPCs, cloud networking, and infrastructure automation.
Familiarity with DevOps methodologies and ITIL best practices.
Proactive, collaborative, and driven by innovation.
Software Engineer (Top Secret Clearance Required)
King of Prussia, PA jobs
Description: Who We Are: Want to work with cutting-edge technologies in a fun, fast-paced, and dynamic environment? Are you a problem solver that takes on challenges that others think are impossible? If you answered yes to these questions, then we invite you to join Lockheed Martin Rotary and Mission Systems (RMS) as a Software Engineer!
Lockheed Martin is a global leader in aerospace, defense, and technology solutions. You will be joining Lockheed Martin Rotary and Mission Systems.
About Rotary and Mission Systems
Our team in Warfighter Solutions consists of a variety of engineering disciplines that work together to execute design, development, integration and test of systems and have expertise in Software Defined Radios (SDR), digital signal processing (DSP), GNSS constellation simulation, and signal transceivers.
Our team members build strong relationships with one another, founded on trust, open communication, and a sense of camaraderie. This leads to a positive and productive work culture, where team members feel valued, supported, and motivated to contribute to the team's success.
We embody the Lockheed Martin core values.
Learn more about our Culture here
Who You Are:
You thrive in a collaborative, multidisciplinary engineering environment and are committed to delivering best-in-class products and solutions. You have an innovative mindset and are capable of finding solutions to complex engineering challenges.
You have a Bachelor's degree from an accredited college in a related discipline with 2 years of professional experience, a Master's degree, or equivalent experience/combined education.
You currently hold a Top Secret clearance.
The Work:
• As a Software Engineer for Warfighter Solutions, you can help us take on the world's most important and complex challenges by providing solutions to a variety of technical problems. We provide tactical solutions specializing in software-defined radio, digital signal processing, constellation simulation, and command-and-control software to support our customers' dynamic needs in supporting the warfighter.
• Integrate open and modular hardware and software components in the field of Radio Frequency (RF) systems
• Develop software required to support integration and control of waveform processing applications
• Support Integration & Test (I&T) efforts to troubleshoot and resolve software defects, performance issues, and integration challenges.
• Work closely with system engineers to ensure that software development aligns with overall system architecture and requirements, and that testing and validation processes are comprehensive and effective.
• Support the current software architecture, which includes C++, Linux scripting, and Qt GUI development, using Git for configuration management (CM).
• Engage in daily scrum and other weekly agile meetings
• Work closely with the Product Owner, Software Architect, and Chief Engineer to support Customer design reviews and various other milestone events.
Why Work For Us:
Your Health, Your Wealth, Your Life
Our flexible schedules, competitive pay and comprehensive benefits enable you to live a healthy, fulfilling life at and outside of work.
Learn more about Lockheed Martin's competitive and comprehensive benefits package.
Basic Qualifications:
• Security Clearance required prior to start: Top Secret
• Experience developing full stack software using Object Oriented Programming
• Knowledge of modern DevSecOps tools and practices, including Gitlab CI/CD
• Proficiency in developing automated unit, integration, and end-to-end tests
• Experience working with Linux OS, including basic system administration (emphasis on RHEL)
• Desire and ability to quickly learn unfamiliar technologies and domains in a rapid paced, Agile development environment
Desired Skills:
• Experience with C++ Software development
• Experience with Linux operating system
• Experience with software configuration management (e.g. GIT, Jira, Confluence, etc.)
• Experience in Full Stack Development
• Experience developing mock (stub) interfaces for system testing
• Experience in Hardware test and integration
• Experience decomposing requirements / features
• Effective oral and written communication skills
• Methodical approach to problem solving
• Working knowledge of Digital Signal Processing (DSP) concepts, software-defined radio (SDR) frameworks, and real-time systems that include a mixture of firmware, embedded, and traditional software applications
• Experience in the Radio Frequency (RF) domain
• Experience working with distributed development teams in an agile development environment
• Ability/willingness to travel for test events
Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration.
Clearance Level: Top Secret
Other Important Information You Should Know
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Ability to Work Remotely: Onsite Full-time: The work associated with this position will be performed onsite at a designated Lockheed Martin facility.
Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five day work week while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time off benefits.
Schedule for this Position: 4x10 hour day, 3 days off per week
Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics.
The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work.
With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility.
If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs, and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: RMS
Relocation Available: Possible
Career Area: Software Engineering
Type: Full-Time
Shift: First
Software Engineer Senior
Orlando, FL jobs
What We're Doing
Rotary and Mission Systems' Training, Logistics and Simulation (TLS) business is Lockheed Martin's center of excellence for training and logistics products and services, serving the U.S. military and more than 65 international customers around the world. Based in Orlando, TLS develops programs that teach service men and women skills to accomplish their most challenging missions - flying the world's most advanced fighter aircraft, navigating ships and driving armored vehicles.
TLS is the corporation's hub for simulation, X reality, live-virtual-constructive capabilities, advanced training devices and full-service training programs. TLS also provides sustainment services such as supply chain and logistics IT solutions, spares and repairs, as well as automated test and support equipment.
THE WORK
This is a position for a Software Engineer Senior on our F-35 Pilot Training Devices (PTD) Team.
As a key member of our Software Engineering team, you will:
• Design, develop, and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, and Azure DevOps (see the sketch after this list)
• Containerize applications using Docker and Podman
• Develop and maintain scripts using languages such as Python, Bash, and PowerShell
• Collaborate with development teams to ensure smooth integration of code changes into the CI/CD pipeline
• Troubleshoot and resolve issues with the CI/CD pipeline, including debugging and optimizing pipeline performance
• Ensure compliance with security and regulatory requirements, including implementing security scanning and vulnerability management tools
• Develop and maintain documentation for CI/CD pipelines, including pipeline architecture, configuration, and troubleshooting guides.
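As a concrete illustration of the pipeline work above, here is a minimal sketch that starts and watches a GitLab CI/CD pipeline from Python; it assumes the python-gitlab client, and the GitLab URL, project path, and token are hypothetical placeholders.

# Minimal sketch: kick off a GitLab CI/CD pipeline and poll it until it finishes.
import time
import gitlab

gl = gitlab.Gitlab("https://gitlab.example.com", private_token="REDACTED")
project = gl.projects.get("training/ptd-devices")     # hypothetical project path

pipeline = project.pipelines.create({"ref": "main"})  # run the pipeline on main
while pipeline.status in ("created", "pending", "running"):
    time.sleep(30)
    pipeline.refresh()                                # re-pull status from the API
print(f"pipeline {pipeline.id} finished with status: {pipeline.status}")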
This position will require the selected candidate to have or obtain an Interim Secret level U.S. government security clearance before starting with Lockheed Martin. U.S. citizenship is a requirement for consideration.
Why Join Us
Join us if you are passionate about saving lives through mission readiness. Be a part of a team that values speed, agility, affordability, and disruptive technologies.
If you are excited about transforming sustainment and training solutions and working with a talented team to reimagine the future, we invite you to contribute your skills and technical expertise to our mission.
Basic Qualifications:
• Bachelor's degree
• 5 or more years of experience in software development, with a focus on full stack web development and DevOps
• Experience developing, debugging, and maintaining GitLab CI/CD pipelines
• Experience with containerization and using tools such as Docker or Podman
• Experience with scripting in languages such as Bash, PowerShell, and Python
• Experience with Infrastructure As Code (IaC) and writing Ansible playbooks
• Experience with container orchestration via Kubernetes or Openshift
• Strong experience with object-oriented programming languages (such as C++, C#, Python, Ruby, Objective-C)
Desired Skills:
• Master's degree
• Advanced Expertise in GitLab CI/CD, including advanced pipeline configuration, job artifacts, and dependency management
• Advanced Expertise with GitLab Runner, including installation, configuration, and management of runners
• Advanced Expertise with Python and bash scripting
Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration.
Clearance Level: Secret with Investigation or CV date within 5 years
Other Important Information You Should Know
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Ability to Work Remotely: Part-time Remote Telework: The employee selected for this position will work part of their work schedule remotely and part of their work schedule at a designated Lockheed Martin facility. The specific weekly schedule will be discussed during the hiring process.
Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five day work week while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time off benefits.
Schedule for this Position: 4x10 hour day, 3 days off per week
Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics.
The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work.
With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility.
If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs, and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: RMS
Relocation Available: Possible
Career Area: Software Engineering
Type: Full-Time
Shift: First
Software Engineer
King of Prussia, PA jobs
Description: Lockheed Martin Space is looking for Software Engineers for multiple programs. We are hiring levels 2-5. If you are interested in joining LM as a Software Engineer, please apply to this requisition.
By applying to this posting, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Basic Qualifications:
• Design, development and testing of software products utilized in the development and/or refurbishment of ground support/test equipment.
• Perform software engineering lifecycle following the program Software Development Plan (SDP) to include requirements, analysis, unit test, integration and support to formal test and delivery.
• Participate in technical reviews and audits of software products.
• Be part of a dynamic team, utilizing software development best practices and processes.
• Strong communication skills, a results-oriented mindset, and the ability to work collaboratively are essential for this role.
• Must have a current Top Secret clearance, thus US citizenship is required.
Desired Skills:
Experience working with Microsoft SQL server databases or similar databases and a scripting language.
Understanding of industrial security regulations and procedures, including experience administering technical security provisions of the ICD, NISPOM, DOD 5205.07 Special Access Program (SAP) Security Manuals and UL related specifications.
Experience working with the DMP alarm system and the Lenel OnGuard access control/alarm system.
Experience working with a central station alarm monitoring system or physical security information management (PSIM) system such as the Bold Manitou PSIM.
Experience working with statements of work and quote evaluations.
Experience working with Microsoft SQL Server Reporting Services, T-SQL, and/or Tableau reporting services.
Experience working with Microsoft SQL Server Integration Services.
Experience working with the Microsoft Visual Studio development environment and Azure configuration management tools.
Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration.
Clearance Level: Top Secret
Other Important Information You Should Know
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Ability to Work Remotely: Onsite Full-time: The work associated with this position will be performed onsite at a designated Lockheed Martin facility.
Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five day work week while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time off benefits.
Schedule for this Position: 9x80 every other Friday off
Pay Rate: The annual base salary range for this position in California, Massachusetts, and New York (excluding most major metropolitan areas), Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, Vermont, Washington or Washington DC is $93,200 - $164,450. For states not referenced above, the salary range for this position will reflect the candidate's final work location. Please note that the salary information is a general guideline only. Lockheed Martin considers factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience, education/ training, key skills as well as market and business considerations when extending an offer.
Benefits offered: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k) match, Flexible Spending Accounts, EAP, Education Assistance, Parental Leave, Paid time off, and Holidays.
(Washington state applicants only) Non-represented full-time employees: accrue at least 10 hours per month of Paid Time Off (PTO) to be used for incidental absences and other reasons; receive at least 90 hours for holidays. Represented full time employees accrue 6.67 hours of Vacation per month; accrue up to 52 hours of sick leave annually; receive at least 96 hours for holidays. PTO, Vacation, sick leave, and holiday hours are prorated based on start date during the calendar year.
This position is incentive plan eligible.
Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics.
The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work.
With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility.
If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs, and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: SPACE
Relocation Available: Possible
Career Area: Software Engineering
Type: Full-Time
Shift: First
Senior Software Engineer - Top Secret Clearance Required
King of Prussia, PA jobs
Description: Space is a critical domain, connecting our technologies, our security and our humanity. While others view space as a destination, we see it as a realm of possibilities, where we can do more - we can innovate, invest, inspire and integrate our capabilities to transform the future.
At Lockheed Martin Space, we aim to harness the full potential of space to cultivate innovation, reduce costs, and push the boundaries of what technology can achieve. We're creating future-ready solutions, focusing on resiliency and urgency through our 21st Century Security vision. We're erasing boundaries and forming partnerships across industries and around the world. We're advancing spacecraft and the workforce to fuel the next generation. And we're reimagining how space can connect us, ensuring security and prosperity.
Join us in shaping a new era in space and find a career that's built for you.
Within LM Space, the Black Canyon Program is looking for a highly motivated individual to join the program area as a Senior Software Engineer to support the development team. As part of an Agile team, you will have the opportunity to work on a variety of tasks in various areas across the mission and collaborate with the development teams, customers, and subject matter experts.
If you are looking for a challenging, collaborative, fast paced environment then this is the position for you.
Note: This position requires a government security clearance, US citizenship is a requirement for consideration.
In this position you will:
Serve as a software developer on an Agile team developing web applications in an open architecture infrastructure to serve in an exciting new mission space.
Leverage industry standard open-source software solutions such as GitLab, Kubernetes, Docker, or similar platforms/products.
Maintain Kubernetes clusters, create and automate deployment of containers using Helm, and support agile development teams by developing and maintaining tools, pipelines, scripts, and environments (an illustrative sketch follows this list).
Participate in daily scrums, software sprint/release planning, demos, and retrospectives.
Develop & demonstrate software capabilities to both internal and external partners.
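The following is a minimal, illustrative sketch of the Helm and Kubernetes tooling described above; the release, chart, and namespace names are hypothetical, Helm is driven through its CLI, and the official Kubernetes Python client is assumed for the post-deploy health check.

# Minimal sketch: upgrade a Helm release, then confirm the pods are healthy.
import subprocess
from kubernetes import client, config

def deploy(release: str, chart: str, namespace: str) -> None:
    # Helm has no official Python SDK, so shell out to the CLI.
    subprocess.run(
        ["helm", "upgrade", "--install", release, chart, "-n", namespace, "--wait"],
        check=True,
    )

def unhealthy_pods(namespace: str) -> list:
    config.load_kube_config()                 # uses the current kubeconfig context
    v1 = client.CoreV1Api()
    pods = v1.list_namespaced_pod(namespace).items
    return [p.metadata.name for p in pods if p.status.phase != "Running"]

if __name__ == "__main__":
    deploy("mission-web", "./charts/mission-web", "dev")   # placeholder names
    print("not running:", unhealthy_pods("dev"))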
Basic Qualifications:
Experienced in software development using at least one of the following programming or scripting languages: Java, JavaScript, C, C#, Python
Experience working in an Agile development environment, including tools such as Jira and Confluence
Must have a current, active Top Secret clearance
Desired Skills:
• Full development lifecycle experience
• Experience with the frameworks .NET Core and Angular
• Experience working as a member of an Agile Scrum or Kanban team
• Experience with databases (PostgreSQL & Redis)
• Experience with GitLab CI/CD - Deployment pipelines, automated build, and/or configuration tools
• Experience with cloud-native development (Docker, Kubernetes, Helm)
• Experience with containerization and container management tools (e.g. Docker, Kubernetes)
• Experience with DevSecOps
• Experience with Git, JIRA, Confluence
• Ability to communicate effectively and work in a collaborative environment
• Familiarity with the Space Domain
• Experience producing quality documentation
Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration.
Clearance Level: TS/SCI
Other Important Information You Should Know
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Ability to Work Remotely: Onsite Full-time: The work associated with this position will be performed onsite at a designated Lockheed Martin facility.
Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five day work week while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time off benefits.
Schedule for this Position: 9x80 every other Friday off
Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics.
The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work.
With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility.
If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs, and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: SPACE
Relocation Available: Possible
Career Area: Software Engineering
Type: Full-Time
Shift: First
Software Engineer, Level 2 - Clearance Required
King of Prussia, PA jobs
Description: Space is a critical domain, connecting our technologies, our security and our humanity. While others view space as a destination, we see it as a realm of possibilities, where we can do more - we can innovate, invest, inspire and integrate our capabilities to transform the future.
At Lockheed Martin Space, we aim to harness the full potential of space to cultivate innovation, reduce costs, and push the boundaries of what technology can achieve. We're creating future-ready solutions, focusing on resiliency and urgency through our 21st Century Security vision. We're erasing boundaries and forming partnerships across industries and around the world. We're advancing spacecraft and the workforce to fuel the next generation. And we're reimagining how space can connect us, ensuring security and prosperity.
Join us in shaping a new era in space and find a career that's built for you.
Within the Victor Portfolio of Space Security, we are seeking a highly skilled, motivated, and experienced Software Engineer to join our dynamic team. This exciting position will fill a critical need in the new and emerging area of protecting space assets. As part of an Agile team, you will have the opportunity to work on a variety of tasks in various areas across the mission and collaborate with the development teams, contractors, customers, and subject matter experts.
As a key member of the software development team, you will be part of the entire engineering life-cycle, and apply your Software Engineering background to develop web applications in an open architecture infrastructure. If you are looking for a new position in a fast-paced team environment, where every day brings a new challenge, and want to support a critical space security mission, then this is the place for you.
In this position you will:
• Serve as a software developer on an Agile team tasked with developing front and back end services for cutting edge web applications.
• Leverage industry standard open-source software solutions such as GitLab, Kubernetes, Docker, or similar platforms/products.
• Maintain Kubernetes clusters, create and automate deployment of containers using Helm, and support agile development teams by developing and maintaining tools, pipelines, scripts, and environments.
• Participate in daily scrums, software sprint/release planning, demos, and retrospectives.
• Develop & demonstrate software capabilities to both internal and external partners.
Successful applicants will demonstrate effective communication skills, desire challenges, and be willing to engage in frequent internal interactions with peers, teammates, and customers.
Selected applicants must meet eligibility requirements for access to classified information and must maintain the required clearance throughout the course of employment.
This is a level 2 role; successful applicants will typically possess a minimum of 2+ years of professional experience and a BS degree.
Basic Qualifications:
• Bachelor's degree in a related discipline, such as Computer Science, Software Engineering or other engineering disciplines or equivalent experience
• Experienced in software development using programming or scripting languages such as Java, JavaScript, C#, Python
• Experience working in an Agile development environment, including tools such as Jira and Confluence
• Prior to start, this position requires a Top Secret clearance with an investigation date within 5 years, with the ability to obtain SCI access
Desired Skills:
• Experience with cloud native development (e.g. Docker, Kubernetes, Helm)
• Experience with frameworks such as .NET Core, Angular, etc.
• Experience with GitLab CI/CD - Deployment pipelines, automated build, and/or configuration tools
• Experience with databases (PostgreSQL & Redis)
• Strong knowledge of modern technology frameworks, cloud platforms (AWS, Azure, GCP), microservices, APIs, DevOps, and data architecture.
• Active TS/SCI security clearance
Security Clearance Statement: This position requires a government security clearance; you must be a US Citizen for consideration.
Clearance Level: TS/SCI
Other Important Information You Should Know
Expression of Interest: By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified you may be contacted for this and future openings.
Ability to Work Remotely: Onsite Full-time: The work associated with this position will be performed onsite at a designated Lockheed Martin facility.
Work Schedules: Lockheed Martin supports a variety of alternate work schedules that provide additional flexibility to our employees. Schedules range from standard 40 hours over a five day work week while others may be condensed. These condensed schedules provide employees with additional time away from the office and are in addition to our Paid Time off benefits.
Schedule for this Position: 4x10 hour day, 3 days off per week
Lockheed Martin is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics.
The application window will close in 90 days; applicants are encouraged to apply within 5 - 30 days of the requisition posting date in order to receive optimal consideration.
At Lockheed Martin, we use our passion for purposeful innovation to help keep people safe and solve the world's most complex challenges. Our people are some of the greatest minds in the industry and truly make Lockheed Martin a great place to work.
With our employees as our priority, we provide diverse career opportunities designed to propel, develop, and boost agility. Our flexible schedules, competitive pay, and comprehensive benefits enable our employees to live a healthy, fulfilling life at and outside of work. We place an emphasis on empowering our employees by fostering an inclusive environment built upon integrity and corporate responsibility.
If this sounds like a culture you connect with, you're invited to apply for this role. Or, if you are unsure whether your experience aligns with the requirements of this position, we encourage you to search on Lockheed Martin Jobs, and apply for roles that align with your qualifications.
Experience Level: Experienced Professional
Business Unit: SPACE
Relocation Available: Possible
Career Area: Software Engineering
Type: Full-Time
Shift: First
Data Engineer
New York, NY jobs
DL Software produces Godel, a financial information and trading terminal.
Role Description
This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.
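As a rough illustration of the pipeline work described above, here is a minimal Python ETL sketch; the source file, table, and connection string are hypothetical placeholders.

# Minimal ETL sketch: pull raw instrument prices, normalize them, load a reporting table.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@warehouse.example.com/marketdata")  # placeholder

def run_etl(source_csv: str) -> None:
    raw = pd.read_csv(source_csv, parse_dates=["trade_date"])
    cleaned = (
        raw.dropna(subset=["symbol", "close"])
           .assign(symbol=lambda d: d["symbol"].str.upper())
    )
    # Keep the last observed close per symbol per day.
    daily = cleaned.groupby(["trade_date", "symbol"], as_index=False)["close"].last()
    daily.to_sql("daily_close", engine, if_exists="append", index=False)

run_etl("prices_2024_06_30.csv")   # hypothetical extract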
Qualifications
Strong proficiency in Data Engineering and Data Modeling
Mandatory: strong experience in global financial instruments including equities, fixed income, options and exotic asset classes
Strong Python background
Expertise in Extract, Transform, Load (ETL) processes and tools
Experience in designing, managing, and optimizing Data Warehousing solutions
Azure Data Engineer
Weehawken, NJ jobs
· Expert-level skills in writing and optimizing complex SQL
· Experience with complex data modelling, ETL design, and using large databases in a business environment
· Experience with building data pipelines and applications to stream and process datasets at low latencies
· Fluent with Big Data technologies like Spark, Kafka and Hive
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
· Designing and building data pipelines using API ingestion and streaming ingestion methods
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
· Experience in developing NoSQL solutions using Azure Cosmos DB is essential
· Thorough understanding of Azure and AWS Cloud Infrastructure offerings
· Working knowledge of Python is desirable
· Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
· Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
· Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
· Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
· Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
· Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making.
· Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
· Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
Data Engineer (Web Scraping technologies)
New York, NY jobs
Title: Data Engineer (Web Scraping technologies)
Duration: FTE/Perm
Salary: 125-190k plus bonus
Responsibilities:
Utilize AI models, code, libraries, or applications to enable a scalable web scraping capability
Manage web scraping requests, including intake, assessment, accessing sites to scrape, utilizing tools to scrape, storage of scraped data, validation, and entitlement to users
Field questions from users about the scrapes and websites
Coordinate with Compliance on approvals and TOU reviews
Build data pipelines in the AWS platform utilizing existing tools like cron, Glue, EventBridge, Python-based ETL, and AWS Redshift
Normalize and standardize vendor data and firm data for firm consumption
Implement data quality checks to ensure reliability and accuracy of scraped data
Coordinate with Internal teams on delivery, access, requests, support
Promote Data Engineering best practices
Required Skills and Qualifications:
Bachelor's degree in computer science, Engineering, Mathematics or related field
2-5 years of experience in a similar role
Prior buy side experience is strongly preferred (Multi-Strat/Hedge Funds)
Capital markets experience is necessary with good working knowledge of reference data across asset classes and experience with trading systems
AWS cloud experience with common services (S3, Lambda, cron, EventBridge, etc.)
Experience with web-scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright etc.)
Strong hands-on skills with NoSQL and SQL databases, programming in Python, data pipeline orchestration tools and analytics tools
Familiarity with time series data and common market data sources (Bloomberg, Refinitiv etc.)
Familiarity with modern Dev Ops practices and infrastructure-as-code tools (e.g. Terraform, CloudFormation)
Strong communication skills to work with stakeholders across technology, investment, and operations teams.
Data Engineer
New York, NY jobs
Our client is seeking a Data Engineer with hands-on experience in Web Scraping technologies to help build and scale a new scraping capability within their Data Engineering team. This role will work directly with Technology, Operations, and Compliance to source, structure, and deliver alternative data from websites, APIs, files, and internal systems. This is a unique opportunity to shape a new service offering and grow into a senior engineering role as the platform evolves.
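As an illustration of the scraping work described above, here is a minimal, hypothetical Python sketch using requests and BeautifulSoup; the URL and CSS selectors are placeholders, and a production scraper would also honor robots.txt and the compliance reviews noted below.

# Minimal sketch: fetch a page, parse a table, and emit normalized rows.
import requests
from bs4 import BeautifulSoup

def scrape_rates(url: str) -> list:
    resp = requests.get(url, headers={"User-Agent": "alt-data-bot/0.1"}, timeout=15)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    for tr in soup.select("table#rates tbody tr"):        # hypothetical selector
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if len(cells) >= 2:
            rows.append({"tenor": cells[0], "rate": float(cells[1].rstrip("%"))})
    return rows

print(scrape_rates("https://example.com/market/rates"))   # placeholder URL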
Responsibilities
Develop scalable Web Scraping solutions using AI-assisted tools, Python frameworks, and modern scraping libraries.
Manage the full lifecycle of scraping requests, including intake, feasibility assessment, site access evaluation, extraction approach, data storage, validation, entitlement, and ongoing monitoring.
Coordinate with Compliance to review Terms of Use, secure approvals, and ensure all scrapes adhere to regulatory and internal policy guidelines.
Build and support AWS-based data pipelines using tools such as Cron, Glue, EventBridge, Lambda, Python ETL, and Redshift.
Normalize and standardize raw, vendor, and internal datasets for consistent consumption across the firm.
Implement data quality checks and monitoring to ensure the reliability, historical continuity, and operational stability of scraped datasets.
Provide operational support, troubleshoot issues, respond to inquiries about scrape behavior or data anomalies, and maintain strong communication with users.
Promote data engineering best practices, including automation, documentation, repeatable workflows, and scalable design patterns.
Required Qualifications
Bachelor's degree in Computer Science, Engineering, Mathematics, or related field.
2-5 years of experience in a similar Data Engineering or Web Scraping role.
Capital markets knowledge with familiarity across asset classes and experience supporting trading systems.
Strong hands-on experience with AWS services (S3, Lambda, EventBridge, Cron, Glue, Redshift).
Proficiency with modern Web Scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright).
Strong Python programming skills and experience with SQL and NoSQL databases.
Familiarity with market data and time series datasets (Bloomberg, Refinitiv) is a plus.
Experience with DevOps/IaC tooling such as Terraform or CloudFormation is desirable.
Cloud Data Engineer
New York, NY jobs
Title: Enterprise Data Management - Data Cloud, Senior Developer I
Duration: FTE/Permanent
Salary: 130-165k
The Data Engineering team oversees the organization's central data infrastructure, which powers enterprise-wide data products and advanced analytics capabilities in the investment management sector. We are seeking a senior cloud data engineer to spearhead the architecture, development, and rollout of scalable, reusable data pipelines and products, emphasizing the creation of semantic data layers to support business users and AI-enhanced analytics. The ideal candidate will work hand-in-hand with business and technical groups to convert intricate data needs into efficient, cloud-native solutions using cutting-edge data engineering techniques and automation tools.
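As a rough sketch of the semantic-layer work described above, the snippet below publishes a reusable view in Snowflake from Python; the account, credentials, and table names are hypothetical, and in practice a model like this would more likely be managed in dbt.

# Minimal sketch: create a semantic-layer view over raw position data in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ETL_SVC", password="REDACTED",      # placeholders
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="SEMANTIC",
)

DDL = """
CREATE OR REPLACE VIEW SEMANTIC.POSITION_DAILY AS
SELECT p.as_of_date,
       p.portfolio_id,
       s.asset_class,
       SUM(p.market_value) AS market_value
FROM RAW.POSITIONS p
JOIN RAW.SECURITY_MASTER s ON s.security_id = p.security_id
GROUP BY 1, 2, 3
"""

cur = conn.cursor()
try:
    cur.execute(DDL)
finally:
    cur.close()
    conn.close()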
Responsibilities:
Collaborate with business and technical stakeholders to collect requirements, pinpoint data challenges, and develop reliable data pipeline and product architectures.
Design, build, and manage scalable data pipelines and semantic layers using platforms like Snowflake, dbt, and similar cloud tools, prioritizing modularity for broad analytics and AI applications.
Create semantic layers that facilitate self-service analytics, sophisticated reporting, and integration with AI-based data analysis tools.
Build and refine ETL/ELT processes with contemporary data technologies (e.g., dbt, Python, Snowflake) to achieve top-tier reliability, scalability, and efficiency.
Incorporate and automate AI analytics features atop semantic layers and data products to enable novel insights and process automation.
Refine data models (including relational, dimensional, and semantic types) to bolster complex analytics and AI applications.
Advance the data platform's architecture, incorporating data mesh concepts and automated centralized data access.
Champion data engineering standards, best practices, and governance across the enterprise.
Establish CI/CD workflows and protocols for data assets to enable seamless deployment, monitoring, and versioning.
Partner across Data Governance, Platform Engineering, and AI groups to produce transformative data solutions.
Qualifications:
Bachelor's or Master's in Computer Science, Information Systems, Engineering, or equivalent.
10+ years in data engineering, cloud platform development, or analytics engineering.
Extensive hands-on work designing and tuning data pipelines, semantic layers, and cloud-native data solutions, ideally with tools like Snowflake, dbt, or comparable technologies.
Expert-level SQL and Python skills, plus deep familiarity with data tools such as Spark, Airflow, and cloud services (e.g., Snowflake, major hyperscalers).
Preferred: Experience containerizing data workloads with Docker and Kubernetes.
Track record architecting semantic layers, ETL/ELT flows, and cloud integrations for AI/analytics scenarios.
Knowledge of semantic modeling, data structures (relational/dimensional/semantic), and enabling AI via data products.
Bonus: Background in data mesh designs and automated data access systems.
Skilled in dev tools like Azure DevOps equivalents, Git-based version control, and orchestration platforms like Airflow.
Strong organizational skills, precision, and adaptability in fast-paced settings with tight deadlines.
Proven self-starter who thrives independently and collaboratively, with a commitment to ongoing tech upskilling.
Bonus: Exposure to BI tools (e.g., Tableau, Power BI), though not central to the role.
Familiarity with investment operations systems (e.g., order management or portfolio accounting platforms).
Senior Data Engineer
Philadelphia, PA jobs
We are seeking a passionate and skilled Senior Data Engineer to join our dynamic team in Philadelphia, PA. In this role, you will lead the design and implementation of advanced data pipelines for Business Intelligence (BI) and reporting. Your expertise will transform complex data into actionable insights, driving significant business value for our clients.
Key Responsibilities:
Design and implement scalable and efficient data pipelines for BI and reporting.
Define and manage key business metrics, build automated dashboards, and develop analytic self-service capabilities.
Write comprehensive technical documentation to outline data solutions and architectures.
Lead requirements gathering, solution design, and implementation for data projects.
Develop and maintain ETL frameworks for large real-world data (RWD) assets.
Mentor and guide technical teams, fostering a culture of innovation.
Stay updated with new technologies and solve complex data problems.
Facilitate the deployment and integration of AI models, ensuring data quality and compatibility with existing analytics infrastructure.
Collaborate with cross-functional stakeholders to understand data needs and deliver impactful analytics and reports.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
4+ years of SQL experience.
Experience with data modeling, warehousing, and building ETL pipelines.
Proficiency in at least one modern scripting or programming language (e.g., Python, Java, Scala, NodeJS).
Experience working directly with business stakeholders to align data solutions with business needs.
Working knowledge of Snowflake as a data warehousing solution.
Experience with workflow orchestration tools like Apache Airflow.
Knowledge of data transformation tools and frameworks such as dbt (Data Build Tool), PySpark, or Snowpark.
Experience with open-source table formats (e.g., Apache Iceberg, Delta, Hudi).
Familiarity with container technologies like Docker and Kubernetes.
Experience with on-premises and cloud MDM deployments.
Preferred Qualifications:
Proficiency with data visualization tools (e.g., Tableau, Power BI, Quicksight).
Certifications in Snowflake or Azure Data Engineering
Experience with Agile methodologies and project management tools (e.g., Jira).
Experience deploying and managing data solutions within Azure AI, Azure ML, or similar environments.
Familiarity with DevOps practices, particularly CI/CD for data solutions.
Knowledge of emerging data architectures, including Data Mesh, Data Fabric, Multimodal Data Management, and AI/ML integration.
Familiarity with ETL tools like Informatica and Matillion.
Previous experience in professional services or consultancy environments.
Experience in technical pre-sales, solution demos, and proposal development.
Azure Data Engineer
Princeton, NJ jobs
We are seeking an experienced Azure Data Engineer with strong expertise in modern data platform technologies including Azure Synapse, Microsoft Fabric, SQL Server, Azure Storage, Azure Data Factory (ADF), Python, Power BI, and Azure OpenAI. The ideal candidate will design, build, and optimize scalable data pipelines and analytics solutions to support enterprise-wide reporting, AI, and data integration initiatives.
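As a small illustration of the ingestion work described above, here is a hypothetical Python sketch that lands a cleaned extract in Azure Storage for downstream ADF/Synapse/Fabric pipelines; the connection string, container, and file names are placeholders, and writing Parquet assumes pyarrow is installed.

# Minimal sketch: clean a CSV extract and upload it to Azure Blob/ADLS as Parquet.
import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("REDACTED-CONNECTION-STRING")
container = service.get_container_client("raw")            # hypothetical container

df = pd.read_csv("claims_extract.csv")                      # hypothetical extract
df["load_ts"] = pd.Timestamp.now(tz="UTC")                  # add an audit column

buffer = io.BytesIO()
df.to_parquet(buffer, index=False)
container.upload_blob(name="claims/2024/06/claims.parquet",
                      data=buffer.getvalue(), overwrite=True)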
Key Responsibilities:
Design, develop, and maintain Azure-based data pipelines using ADF, Synapse Pipelines, and Fabric Dataflows.
Build and optimize data warehouses, data lakes, and lakehouse architectures on Azure.
Develop complex SQL queries, stored procedures, and data transformations in SQL Server and Synapse SQL Pools.
Implement data ingestion, transformation, and orchestration solutions using Python and Azure services.
Manage and optimize Azure Storage solutions (ADLS Gen2, Blob Storage).
Leverage Power BI and Fabric for data modeling, dataset creation, and dashboard/report development.
Integrate and utilize Azure OpenAI for data enrichment, intelligent automation, and advanced analytics where applicable.
Ensure data quality, data governance, and security best practices across the data lifecycle.
Troubleshoot data pipeline issues, optimize performance, and support production workloads.
Collaborate with data architects, analysts, BI developers, and business stakeholders to deliver end-to-end data solutions.
Data Engineer
Jersey City, NJ jobs
ONLY LOCALS TO NJ/NY - NO RELOCATION CANDIDATES
Skillset: Data Engineer
Must Haves: Python, PySpark, AWS - ECS, Glue, Lambda, S3
Nice to Haves: Java, Spark, React Js
Interview Process: 2 rounds; the 2nd will be onsite
You're ready to gain the skills and experience needed to grow within your role and advance your career - and we have the perfect software engineering opportunity for you.
As a Data Engineer III - Python / Spark / Data Lake at JPMorgan Chase within the Consumer and Community Bank, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities will include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.
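As a rough illustration of that kind of work, here is a minimal PySpark sketch that curates raw data-lake events into a reporting dataset; the paths and column names are hypothetical placeholders.

# Minimal sketch: read raw events, aggregate daily spend, publish a curated dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-card-transactions").getOrCreate()

raw = spark.read.parquet("s3://lake/raw/card_transactions/")   # placeholder path
curated = (
    raw.filter(F.col("amount") > 0)
       .withColumn("txn_date", F.to_date("txn_ts"))
       .groupBy("account_id", "txn_date")
       .agg(F.sum("amount").alias("daily_spend"),
            F.count("*").alias("txn_count"))
)
curated.write.mode("overwrite").partitionBy("txn_date") \
       .parquet("s3://lake/curated/daily_spend/")              # placeholder path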
Job responsibilities:
• Supports review of controls to ensure sufficient protection of enterprise data.
• Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
• Updates logical or physical data models based on new use cases.
• Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
• Adds to team culture of diversity, opportunity, inclusion, and respect.
• Develop enterprise data models; design, develop, and maintain large-scale data processing pipelines and infrastructure; lead code reviews and provide mentoring through the process; drive data quality; ensure data accessibility to analysts and data scientists; ensure compliance with data governance requirements; and ensure business alignment (data engineering practices align with business goals).
Required qualifications, capabilities, and skills
• Formal training or certification on data engineering concepts and 2+ years applied experience
• Experience across the data lifecycle, advanced experience with SQL (e.g., joins and aggregations), and working understanding of NoSQL databases
• Experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis
• Extensive experience in AWS and in the design, implementation, and maintenance of data pipelines using Python and PySpark.
• Proficient in Python and PySpark, able to write and execute complex queries to perform curation and build views required by end users (single and multi-dimensional).
• Proven experience in performance and tuning to ensure jobs are running at optimal levels and no performance bottleneck.
• Advanced proficiency in leveraging Gen AI models from Anthropic (or OpenAI, or Google) using APIs/SDKs
• Advanced proficiency in cloud data lakehouse platform such as AWS data lake services, Databricks or Hadoop, relational data store such as Postgres, Oracle or similar, and at least one NOSQL data store such as Cassandra, Dynamo, MongoDB or similar
• Advanced proficiency in Cloud Data Warehouse Snowflake, AWS Redshift
• Advanced proficiency in at least one scheduling/orchestration tool such as Airflow, AWS Step Functions or similar
• Proficiency in Unix scripting; data structures; data serialization formats such as JSON, AVRO, or Protobuf; big-data storage formats such as Parquet or Iceberg; data processing methodologies such as batch, micro-batching, or streaming; one or more data modelling techniques such as Dimensional, Data Vault, Kimball, or Inmon; Agile methodology; TDD or BDD; and CI/CD tools.
Preferred qualifications, capabilities, and skills
• Knowledge of data governance and security best practices.
• Experience in carrying out data analysis to support business insights.
• Strong Python and Spark
Data Engineer
Austin, TX jobs
We are seeking a Data Engineer to join a dynamic Agile team and support the build and enhancement of a large-scale data integration hub. This role requires hands-on experience in data acquisition, ETL automation, SQL development, and performance analytics.
What You'll Do
✔ Lead technical work within Agile development teams
✔ Automate ETL processes using Informatica Power Center / IICS
✔ Develop complex Oracle/Snowflake SQL scripts & views
✔ Integrate data from multiple sources (Oracle, SQL Server, Excel, Access, PDF)
✔ Support CI/CD and deployment processes
✔ Produce technical documentation, diagrams & mockups
✔ Collaborate with architects, engineers & business stakeholders
✔ Participate in Sprint ceremonies & requirements sessions
✔ Ensure data quality, validation & accuracy
Must Have Experience
✅ 8+ years:
Informatica Power Center / IICS
ETL workflow development
SQL development (Oracle/Snowflake)
Data warehousing & analytics
Technical documentation (Visio/Erwin, MS Office, MS Project)
Data Engineer
Dallas, TX jobs
Must be local to TX
Data Engineer - SQL, Python, and PySpark Expert (Onsite - Dallas, TX)
We are seeking a Data Engineer with strong proficiency in SQL, Python, and PySpark to support high-performance data pipelines and analytics initiatives. This role will focus on scalable data processing, transformation, and integration efforts that enable business insights, regulatory compliance, and operational efficiency.
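As a small illustration of how those skills combine, here is a hypothetical PySpark sketch that registers a raw servicing extract and summarizes delinquency with a SQL CTE; the table, path, and column names are placeholders.

# Minimal sketch: SQL (CTE + window function) over a PySpark temp view.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("servicing-delinquency").getOrCreate()

spark.read.parquet("/data/raw/servicing/").createOrReplaceTempView("servicing")

summary = spark.sql("""
    WITH latest AS (
        SELECT loan_id, status, days_past_due,
               ROW_NUMBER() OVER (PARTITION BY loan_id ORDER BY as_of_date DESC) AS rn
        FROM servicing
    )
    SELECT status, COUNT(*) AS loans, AVG(days_past_due) AS avg_dpd
    FROM latest
    WHERE rn = 1
    GROUP BY status
""")
summary.write.mode("overwrite").saveAsTable("analytics.delinquency_summary")  # placeholder table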
Key Responsibilities
Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and PySpark for large-scale data environments
Implement scalable data processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments)
Partner with business stakeholders to understand and model mortgage lifecycle data (origination, underwriting, servicing, foreclosure, etc.)
Create and maintain data marts, views, and reusable data components to support downstream reporting and analytics
Ensure data quality, consistency, security, and lineage across all stages of data processing
Assist in data migration and modernization efforts to cloud-based data warehouses (e.g., Snowflake, Azure Synapse, GCP BigQuery)
Document data flows, logic, and transformation rules
Troubleshoot performance and quality issues in batch and real-time pipelines
Support compliance-related reporting (e.g., HMDA, CFPB)
Required Qualifications
6+ years of experience in data engineering or data development
Advanced expertise in SQL (joins, CTEs, optimization, partitioning, etc.)
Strong hands-on skills in Python for scripting, data wrangling, and automation
Proficient in PySpark for building distributed data pipelines and processing large volumes of structured/unstructured data
Experience working with mortgage banking data sets and domain knowledge is highly preferred
Strong understanding of data modeling (dimensional, normalized, star schema)
Experience with cloud-based platforms (e.g., Azure Databricks, AWS EMR, GCP Dataproc)
Familiarity with ETL tools, orchestration frameworks (e.g., Airflow, ADF, dbt)
Snowflake Data Engineer
Durham, NC jobs
Experience in development, proficiency in SQL, and knowledge of Snowflake cloud computing environments
Knowledge of data warehousing concepts and metadata management; experience with data modeling, data lakes, multi-dimensional models, and data dictionaries
Hands-on experience with Snowflake features like Time Travel and Zero-Copy Cloning; experience in query performance tuning and cost optimization in a cloud data platform
Knowledge of Snowflake warehousing, architecture, processing, and administration, as well as dbt and pipelines
Hands-on experience with PL/SQL and Snowflake
• Excellent personal communication, leadership, and organizational skills.
• Should be well versed in various design patterns
Knowledge of SQL databases is a plus
Hands-on Snowflake development experience is a must
Work with various cross-functional groups and tech leads from other tracks
Need to work closely with the team and guide them technically/functionally; must be a team player with a good attitude
Sr. Data Engineer (SQL+Python+AWS)
Saint Petersburg, FL jobs
We are looking for a Sr. Data Engineer (SQL+Python+AWS) for a 12+ month contract (potential extension, or may convert to full-time), hybrid at St. Petersburg, FL 33716, with a direct financial client; W2 only, for US Citizens or Green Card Holders.
Notes from the Hiring Manager:
• Setting up Python environments and data structures to support the Data Science/ML team.
• No prior Data Science or Machine Learning experience required.
• Role involves building new data pipelines and managing file-loading connections.
• Strong SQL skills are essential.
• Contract-to-hire position.
• Hybrid role based in St. Pete, FL (33716) only.
Duties:
This role involves building and maintaining data pipelines that connect Oracle-based source systems to AWS cloud environments to provide well-structured data for analysis and machine learning in AWS SageMaker.
It includes working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics.
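As a rough sketch of that Oracle-to-AWS flow, the snippet below uses the libraries named in the skills list (pandas, pyodbc, boto3); the DSN, query, bucket, and key are hypothetical placeholders.

# Minimal sketch: extract yesterday's changes from Oracle and land them in S3 as Parquet.
import io
import boto3
import pandas as pd
import pyodbc

def extract_to_s3(dsn: str, bucket: str, key: str) -> None:
    conn = pyodbc.connect(dsn)                    # e.g. an Oracle ODBC DSN
    df = pd.read_sql("SELECT * FROM accounts WHERE updated_dt >= SYSDATE - 1", conn)
    conn.close()

    buffer = io.BytesIO()
    df.to_parquet(buffer, index=False)            # columnar format for Glue/SageMaker
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())

extract_to_s3("DSN=OracleProd;UID=etl;PWD=REDACTED",            # placeholder DSN
              "ml-feature-store-raw", "accounts/accounts_2024_06_30.parquet")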
• Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.).
• Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads.
• Implement and manage data ingestion frameworks, including batch and streaming pipelines.
• Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow.
• Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency.
• Optimize data processes for performance and cost efficiency.
• Implement data quality checks, validation, and governance standards.
• Work with DevOps and security teams to comply with RJ standards.
Skills:
Required:
• Strong proficiency with SQL and hands-on experience working with Oracle databases.
• Experience designing and implementing ETL/ELT pipelines and data workflows.
• Hands-on experience with AWS data services, such as S3, Glue, Redshift, Lambda, and IAM.
• Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
• Solid understanding of data modeling, relational databases, and schema design.
• Familiarity with version control, CI/CD, and automation practices.
• Ability to collaborate with data scientists to align data structures with model and analytics requirements
Preferred:
• Experience integrating data for use in AWS SageMaker or other ML platforms.
• Exposure to MLOps or ML pipeline orchestration.
• Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
• Knowledge of data warehouse design patterns and best practices.
• Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
• Working knowledge of Java is a plus.
Education:
B.S. in Computer Science, MIS or related degree and a minimum of five (5) years of related experience or combination of education, training and experience.
Data Engineer
Bloomington, MN jobs
Are you an experienced Data Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Data Engineer to work at their company in Bloomington, MN.
Primary Responsibilities/Accountabilities:
Develop and maintain scalable ETL/ELT pipelines using Databricks and Airflow (see the sketch after this list).
Build and optimize Python-based data workflows and SQL queries for large datasets.
Ensure data quality, reliability, and high performance across pipelines.
Collaborate with cross-functional teams to support analytics and reporting requirements.
Monitor, troubleshoot, and improve production data workflows.
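Here is the orchestration sketch referenced above: a minimal, hypothetical Airflow DAG with a daily Python transform step. The DAG name and schedule are placeholders, the schedule keyword assumes Airflow 2.4+, and a Databricks job would typically be triggered with the Databricks provider's operator rather than a plain PythonOperator.

# Minimal sketch: a daily Airflow DAG with one Python transform task.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def transform_daily_sales(**context):
    # Placeholder transform; in practice this would call Databricks or run SQL.
    print("transforming partition", context["ds"])

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform", python_callable=transform_daily_sales)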
Qualifications:
Strong hands-on experience with Databricks, Python, SQL, and Apache Airflow.
6-10+ years of experience in Data Engineering.
Experience with cloud platforms (Azure/AWS/GCP) and big data ecosystems.
Solid understanding of data warehousing, data modelling, and distributed data processing.
Python Data Engineer- THADC5693417
Houston, TX jobs
Must Haves:
Strong proficiency in Python; 5+ years' experience.
Expertise in FastAPI and microservices architecture and coding
Linking Python-based apps with SQL and NoSQL databases
Deployments on Docker and Kubernetes, plus monitoring tools
Experience with Automated testing and test-driven development
Git source control, GitHub Actions, CI/CD, VS Code, and Copilot
Expertise in both on-prem SQL databases (Oracle, SQL Server, Postgres, DB2) and NoSQL databases
Working knowledge of data warehousing and ETL
Able to explain the business functionality of the projects/applications they have worked on
Ability to multitask and simultaneously work on multiple projects.
NO CLOUD - they are on-prem
Day to Day:
Insight Global is looking for a Python Data Engineer for one of our largest oil and gas clients in Downtown Houston, TX. This person will be responsible for building Python-based integrations between back-end SQL and NoSQL databases, architecting and coding FastAPI services and microservices, and performing testing on back-office applications. The ideal candidate will have experience developing applications using Python and microservices and implementing complex business functionality in Python.
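As a small, hypothetical illustration of the FastAPI-plus-database pattern this role describes, the sketch below exposes the latest reading for a well from an on-prem SQL database; the route, table, and connection string are placeholders.

# Minimal sketch: a FastAPI microservice backed by an on-prem SQL database.
from fastapi import FastAPI, HTTPException
from sqlalchemy import create_engine, text

app = FastAPI(title="well-readings")
engine = create_engine("postgresql://svc:REDACTED@dbhost/ops")   # placeholder connection

@app.get("/wells/{well_id}/latest")
def latest_reading(well_id: str) -> dict:
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT reading_ts, pressure FROM readings "
                 "WHERE well_id = :w ORDER BY reading_ts DESC LIMIT 1"),
            {"w": well_id},
        ).fetchone()
    if row is None:
        raise HTTPException(status_code=404, detail="no readings for this well")
    return {"well_id": well_id, "reading_ts": str(row[0]), "pressure": float(row[1])}

# Run with: uvicorn app:app --reload   (assuming this file is saved as app.py)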