
Data engineer jobs in Albany, GA - 5,804 jobs

  • Delivery Consultant - Data Architect, AWS Professional Services

    Amazon 4.7 company rating

    Data engineer job in Atlanta, GA

    The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements and business objectives. You'll be a key player in driving customer success through their cloud journey, providing technical expertise and best practices throughout the project lifecycle. You will lead customer-focused project teams as a technical leader and perform hands-on development of Data & Analytics solutions with exceptional quality. Possessing a deep understanding of AWS products and services, as a Delivery Consultant you will be proficient in architecting complex, scalable, and secure solutions tailored to meet the specific needs of each customer. You'll work closely with stakeholders to gather requirements, assess current infrastructure, and propose effective migration strategies to AWS. As trusted advisors to our customers, providing guidance on industry trends, emerging technologies, and innovative solutions, you will be responsible for leading the implementation process, ensuring adherence to best practices, optimizing performance, and managing risks throughout the project. The AWS Professional Services organization is a global team of experts that help customers realize their desired business outcomes when using the AWS Cloud. We work together with customer teams and the AWS Partner Network (APN) to execute enterprise cloud computing initiatives. Our team provides assistance through a collection of offerings which help customers achieve specific outcomes related to enterprise cloud adoption. We also deliver focused guidance through our global specialty practices, which cover a variety of solutions, technologies, and industries. 
Key job responsibilities

As an experienced technology professional, you will be responsible for:
- Leading project teams, designing and implementing end-to-end large-scale, complex, scalable, and secure Data & Analytics AWS solutions tailored to customer needs
- Providing technical guidance and troubleshooting support throughout project delivery
- Collaborating with stakeholders to gather requirements and propose effective Data & Analytics migration and modernization strategies
- Acting as a trusted advisor to customers on industry trends and emerging technologies, ensuring compliance with industry standards and governance while aligning data solutions with business strategies
- Sharing knowledge within the organization through mentoring, training, and creating reusable artifacts

About the team

Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed below, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let that stop you from applying.

Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture: Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (diversity) conferences, inspire us to never stop embracing our uniqueness.

Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance: We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

Basic Qualifications
- 7+ years of technical specialist, design, and architecture experience
- 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience
- 7+ years of consulting, design, and implementation of serverless distributed solutions experience
- 5+ years of software development with object-oriented language experience
- 3+ years of cloud-based solution (AWS or equivalent), system, network, and operating system experience
- 7+ years of external or internal customer-facing, complex, and large-scale project management experience
- 5+ years of cloud architecture and solution implementation experience
- Bachelor's degree, or 7+ years of professional or military experience

Preferred Qualifications
- Degree in advanced technology, or AWS Professional level certification
- Knowledge of AWS services including compute, storage, networking, security, databases, machine learning, and serverless technologies
- Knowledge of security and compliance standards including HIPAA and GDPR
- Experience in performance optimization and cost management for cloud environments
- Experience communicating technical concepts to diverse audiences in pre-sales environments

Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit ********************************************************* for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. The base salary range for this position is listed below. Your Amazon package will include sign-on payments and restricted stock units (RSUs). Final compensation will be determined based on factors including experience, qualifications, and location. Amazon also offers comprehensive benefits including health insurance (medical, dental, vision, prescription, Basic Life & AD&D insurance, an option for Supplemental Life plans, EAP, Mental Health Support, a Medical Advice Line, Flexible Spending Accounts, and Adoption and Surrogacy Reimbursement coverage), 401(k) matching, paid time off, and parental leave. Learn more about our benefits at ******************************* .

Base salary by location (USD, annually):
- USA, GA, Atlanta: 153,600.00 - 207,800.00
- USA, IL, Chicago: 153,600.00 - 207,800.00
- USA, MA, Boston: 153,600.00 - 207,800.00
- USA, NY, New York: 169,000.00 - 228,600.00
- USA, TX, Austin: 153,600.00 - 207,800.00
- USA, TX, Dallas: 153,600.00 - 207,800.00
- USA, VA, Arlington: 153,600.00 - 207,800.00
- USA, WA, Seattle: 153,600.00 - 207,800.00
    $101k-143k yearly est. 5d ago

  • Data Scientist Gen AI

    EXL 4.5 company rating

    Data engineer job in Atlanta, GA

Job Title: Data Scientist - GenAI
Work Experience: 5+ years
On-site requirement: 4 days per week at the Atlanta office

We are looking for a highly capable and innovative Data Scientist with experience in Generative AI to join our Data Science Team. You will lead the development and deployment of GenAI solutions, including LLM-based applications, prompt engineering, fine-tuning, embeddings, and retrieval-augmented generation (RAG) for enterprise use cases. The ideal candidate has a strong foundation in machine learning and NLP, with hands-on experience in modern GenAI tools and frameworks such as OpenAI, LangChain, Hugging Face, Vertex AI, Bedrock, or similar.

Key Responsibilities:
- Design and build Generative AI solutions using Large Language Models (LLMs) for business problems across domains like customer service, document automation, summarization, and knowledge retrieval.
- Fine-tune or adapt foundation models using domain-specific data.
- Implement RAG pipelines, embedding models, and vector databases (e.g., FAISS, Pinecone, ChromaDB).
- Collaborate with data engineers, MLOps, and product teams to build end-to-end AI applications and APIs.
- Develop custom prompts and prompt chains using tools like LangChain, LlamaIndex, PromptFlow, or custom frameworks.
- Evaluate model performance, mitigate bias, and optimize accuracy, latency, and cost.
- Stay up to date with the latest trends in LLMs, transformers, and GenAI architecture.

Required Skills:
- 5+ years of experience in Data Science / ML, with 1+ year of hands-on experience in LLM / GenAI projects.
- Strong Python programming skills, especially in libraries such as Transformers, LangChain, scikit-learn, PyTorch, or TensorFlow.
- Experience with OpenAI (GPT-4), Claude, Mistral, LLaMA, or similar models.
- Knowledge of vector search, embedding models (e.g., BERT, Sentence Transformers), and semantic search techniques.
- Ability to build scalable AI workflows and deploy them via APIs or web apps (e.g., FastAPI, Streamlit, Flask).
- Familiarity with cloud platforms (AWS/GCP/Azure) and MLOps best practices.
- Excellent communication skills with the ability to translate technical solutions into business impact.

Preferred Qualifications:
- Experience with prompt tuning, few-shot learning, or LoRA-based fine-tuning.
- Knowledge of data privacy and security considerations in GenAI applications.
- Familiarity with enterprise architecture, SDLC, or building GenAI use cases in regulated domains (e.g., finance, insurance, healthcare).

For more information on benefits and what we offer, please visit us at **************************************************
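The RAG pipeline this role centers on (embed documents, index them in a vector store, retrieve the most similar ones, and prepend them to the LLM prompt) can be sketched end-to-end. This is a toy illustration only: the letter-frequency "embedding" stands in for a real model such as Sentence Transformers, and the in-memory store stands in for FAISS, Pinecone, or ChromaDB.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy letter-frequency embedding; a real system would call an embedding model."""
    v = np.zeros(26)
    for ch in text.lower():
        if "a" <= ch <= "z":
            v[ord(ch) - ord("a")] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

class VectorStore:
    """Minimal in-memory vector index standing in for FAISS / Pinecone / ChromaDB."""
    def __init__(self):
        self.docs: list[str] = []
        self.vecs: list[np.ndarray] = []

    def add(self, doc: str) -> None:
        self.docs.append(doc)
        self.vecs.append(embed(doc))

    def top_k(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        scores = [float(q @ v) for v in self.vecs]  # cosine similarity (unit vectors)
        order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
        return [self.docs[i] for i in order[:k]]

def rag_prompt(query: str, store: VectorStore) -> str:
    """Assemble the augmented prompt that would be sent to the LLM."""
    context = "\n".join(store.top_k(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In a production pipeline the retrieval step stays structurally identical; only the embedding call and the index change.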
    $67k-91k yearly est. 3d ago
  • Data Scientist

    Spot Pet Insurance 3.7 company rating

    Data engineer job in Miami, FL

Who we are: Spot Pet Insurance is the fastest growing pet insurance company in North America. Our commitment to an exceptional end-to-end customer experience and our data-driven approach have quickly established us as a leading pet insurance provider. We're dedicated to providing pet parents with peace of mind by offering accessible and comprehensive coverage so their furry companions can lead happier, healthier lives. To demonstrate this, we recently joined forces with MrBeast to find homes for 100 homeless pets and committed to giving each of them pet insurance for life! Along the way, we've created a company culture that allows our employees to thrive, with perks like daily free meals, a pet-friendly office, and ridiculously fun company events every quarter. Our dedication to fostering a positive and rewarding work environment for our team has even earned us a Great Place to Work certification.

About the Role: Love Pets? Love AI? Let's Talk. We're looking for a Data Scientist who treats AI like a trusted teammate and thrives in a collaborative, fast-moving environment. If you're already using large language models, AI coding assistants, and automated analysis tools every day, you'll fit right in here. At Spot, we help pet parents protect the animals they love. Your work will make that protection smarter and more personal.

Key Responsibilities
- Team up across the company to find problems worth solving with data.
- Use AI tools (Claude, ChatGPT, Copilot, and others) to write, debug, and ship code faster.
- Build predictive models for pricing, claims, fraud detection, and customer behavior.
- Design experiments and measure what works. You know correlation isn't causation.
- Run marketing mix modeling to show where our dollars work hardest.
- Create customer models that help us earn trust and keep it.
- Build internal tools and data products that help your teammates answer their own questions and make better decisions without waiting on you.
- Share your findings in ways everyone can understand. Skip the jargon.
- Keep learning. AI and machine learning move fast. So should you.

Required Qualifications
- Degree in Computer Science, Statistics, Mathematics, Engineering, or a related field.
- Real experience as a Data Scientist in a fast-paced environment.
- Strong programming skills in Python, R, and SQL, including data and ML libraries (pandas, NumPy, scikit-learn, TensorFlow, PyTorch).
- Experience with BigQuery and Databricks.
- Solid grounding in statistics, hypothesis testing, and experimental design.
- Daily use of AI assistants for coding, analysis, and problem-solving. We'll ask for examples.
- Experience building dashboards, self-serve tools, or internal data products for non-technical users.
- You explain complex ideas clearly.

About AI Proficiency: This matters. We'll ask how you use AI tools in your current work. We want specifics, not buzzwords. If AI isn't already part of how you get things done, this role won't be a good fit. But if you're the type who's always looking for ways to work smarter, we'd love to hear from you.

About Location: We work together in our downtown Miami office five days a week. This is non-negotiable. We believe the best collaboration happens in person, and this role requires it.

What we offer:
- The opportunity to work on challenging and impactful projects at the intersection of design and data.
- A collaborative and supportive work environment, recognized as a Great Place to Work.
- Cell phone allowance of $100 per month
- Health, dental, and vision benefits
- Life insurance
- ClassPass
- Unlimited PTO
- Bring your pet to work
- Your pet insurance is covered (up to $100)
- 401(k) with company match
- Annual performance-based bonus
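The emphasis above on experimental design ("correlation isn't causation") is easy to make concrete with a two-sample permutation test, which needs no distributional assumptions. A minimal sketch in plain Python; the group values are invented for illustration:

```python
import random

def permutation_test(a, b, n_perm=2000, seed=0):
    """Two-sided permutation test for a difference in means between groups a and b.

    Repeatedly shuffles the pooled data and asks how often a random split
    produces a mean difference at least as extreme as the observed one.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b)
        if abs(diff) >= abs(observed):
            extreme += 1
    # add-one smoothing so the estimated p-value is never exactly zero
    return (extreme + 1) / (n_perm + 1)
```

A small p-value says the observed difference is unlikely under random assignment; it still says nothing about causality unless the assignment itself was randomized, which is the point of running the experiment.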
    $64k-95k yearly est. 3d ago
  • Data Scientist

    Zimmerman Advertising 4.2 company rating

    Data engineer job in Fort Lauderdale, FL

The Data Scientist works closely with Retail Technology, Media, and Account Services teams to provide predictive modeling of marketing, direct, and digital efforts. We are looking for a motivated Data Scientist and analytical thought leader. This is a rare opportunity to be part of a diverse and newly expanded analytics department and a great fit for a predictive modeler with a desire to impact business results.

Responsibilities
- Apply specialized technical knowledge and expertise to perform reviews relating to the full life cycle of models, information technology applications, or risk management/analysis used across the company.
- Collaborate and share knowledge with teams across the media organization, as appropriate.
- Build and maintain relationships with business partners at the manager and staff levels.
- Use data analysis, mining, and migration techniques for enhanced targeting, audience segmentation, clustering, profiling, and regression analysis.
- Identify digital placement-level strengths and weaknesses across simultaneous campaigns and geographies.
- Develop and maintain internal automated reporting tools, documents, scoring systems, and dashboards for ongoing and post-campaign reporting.
- Coordinate cross-functional reviews to discuss region- and campaign-specific findings and actionable recommendations for digital media campaigns built on various CPM, CPC, CPE, and CPA models.
- Identify and facilitate resolution of tagging issues in coordination with Traffic and Production teams focused on site-side tracking, reporting, and implementation.
- Provide client-facing, non-technical recommendations and insights, both written and verbal, that offer understandable and actionable optimizations.
- Work with Media, Strategic Intelligence, and Account Services teams to develop measurement plans that deliver on campaign and client objectives.

Requirements
- Bachelor's degree in a related field
- 2-3 years of experience in the data and analytics field
- Demonstrated ability to develop and run analytics (scripts) using specialized tools and platforms, specifically R, Python, SQL, and/or SAS
- Experience applying data synthesis, mining, and regression techniques for enhanced targeting, audience segmentation, clustering, profiling, and insightful recommendations
- Advanced knowledge of Microsoft Excel
- General understanding of digital advertising, digital media strategy, ad placement types, placement-level insights, and standard media metrics
- Experience with data orchestration tools such as Annalect Omni
- Excellent verbal, written, and interpersonal communication skills
- Ability to work independently and as part of a team
- Ability to manage multiple projects simultaneously while meeting deadlines
- Regression modeling focused on maximizing yield while measuring the diminishing returns of ad spend at scale for thousands of locations
- Data storytelling and presentation
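The last modeling requirement, measuring diminishing returns of ad spend, is typically handled with a concave response curve. A minimal sketch assuming a log-shaped response fitted by least squares; the functional form and numbers are illustrative, not Zimmerman's actual model:

```python
import numpy as np

def fit_log_response(spend: np.ndarray, revenue: np.ndarray) -> tuple[float, float]:
    """Least-squares fit of revenue ≈ a + b * log(1 + spend).

    The log transform makes the curve concave: each additional dollar of
    spend buys less incremental revenue than the last.
    """
    X = np.column_stack([np.ones_like(spend), np.log1p(spend)])
    (a, b), *_ = np.linalg.lstsq(X, revenue, rcond=None)
    return float(a), float(b)

def marginal_return(b: float, spend: float) -> float:
    """Derivative of the fitted curve: d(revenue)/d(spend) = b / (1 + spend)."""
    return b / (1.0 + spend)
```

Comparing `marginal_return` across locations is one simple way to decide where the next dollar of budget works hardest.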
    $57k-82k yearly est. 3d ago
  • Global Data Partnerships Leader

    Matia Inc.

    Data engineer job in Miami, FL

A tech company specializing in data management seeks a Head of Partnerships in Miami to define their global partnerships strategy and manage partner relationships. Ideal candidates will have extensive experience in the data ecosystem and partnerships, particularly in startups. The role includes building a team from the ground up, establishing a partner-driven pipeline, and collaborating across functional teams. This position is an opportunity to shape the company's partnership function and drive significant revenue impact.
    $77k-108k yearly est. 3d ago
  • Data Scientist

    Parker's Kitchen 4.2 company rating

    Data engineer job in Savannah, GA

We are looking for a Data Scientist with expertise in optimization and forecasting to help improve how we manage labor, staffing, and operational resources across our retail locations. This role is critical in building models and decision-support tools that ensure the right people, in the right place, at the right time, balancing customer service, efficiency, and cost. You will work closely with Operations, Finance, and Store Leadership teams to deliver practical solutions that improve labor planning, scheduling, and demand forecasting. The right candidate will be confident, resourceful, and excited to own both the technical and business-facing aspects of applying data science in a fast-paced retail environment.

Responsibilities
- Build and maintain forecasting models (time-series, machine learning, and statistical) for sales and transactions.
- Develop and deploy optimization models (linear/mixed-integer programming, heuristics, simulation) to improve workforce scheduling and labor allocation.
- Partner with operations and finance to translate forecasts into actionable staffing and labor plans that reduce costs while maintaining service levels.
- Build dashboards and automated tools to track forecast accuracy, labor KPIs, and staffing effectiveness.
- Provide insights and "what-if" scenario modeling to support strategic workforce and budget planning.

Knowledge, Skills, and Abilities
- Strong foundation in forecasting techniques (time-series models, regression, machine learning) and optimization methods (linear/mixed-integer programming, heuristics, simulation).
- Proficiency in Python or R for modeling and analysis, along with strong SQL skills for working with large-scale datasets.
- Knowledge of statistics, probability, and applied mathematics to support predictive and prescriptive modeling.
- Experience building and deploying predictive models, optimization tools, and decision-support solutions that drive measurable business outcomes.
- Strong data storytelling and visualization skills using tools such as Power BI, Tableau, or Looker.
- Ability to translate analytical outputs into clear, actionable recommendations for non-technical stakeholders.
- Strong collaboration skills with the ability to partner cross-functionally with Operations, Finance, and Store Leadership to drive adoption of data-driven approaches.
- Ability to work independently and resourcefully, combining technical depth with practical problem-solving to deliver results in a fast-paced environment.

Education and Requirements

Required:
- Bachelor's or Master's degree in Data Science, Statistics, Applied Mathematics, Industrial Engineering, Operations Research, or a related field.
- Minimum 2-3 years of professional experience in Data Science or a related area.
- Strong skills in time-series forecasting (e.g., ARIMA, Prophet, ML-based approaches).
- Proficiency in optimization techniques (linear programming, integer programming).
- Strong Python or R programming skills.
- SQL expertise for large, complex datasets.
- Strong communication skills with the ability to partner with business stakeholders.

Preferred:
- Experience in Retail, Restaurant, and/or Convenience Stores a plus.
- Experience with cloud platforms (Snowflake, AWS, GCP, Azure).
- Knowledge of BI tools (Tableau, Power BI, Looker).

Physical Requirements
- Prolonged periods sitting/standing at a desk and working on a computer
- Must be able to lift up to 50 pounds

Parker's is an equal opportunity employer committed to hiring a diverse workforce and sustaining an inclusive culture. Parker's does not discriminate on the basis of disability, veteran status or any other basis protected under federal, state, or local laws.
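The workforce-scheduling problem described above (meet each hour's demand at minimum headcount) is a small integer program. The sketch below brute-forces a toy instance so it runs anywhere; a real deployment would hand the same formulation to a MIP solver (e.g., CBC, Gurobi, OR-Tools), and the shift patterns and demand numbers are invented for illustration:

```python
import itertools

def min_cost_staffing(demand, shifts, max_per_shift=5):
    """Choose how many employees work each shift pattern so that every hour's
    coverage meets demand, minimizing total headcount.

    demand: staff required per hour, e.g. [1, 2, 2, 1]
    shifts: one set of covered hours per shift pattern, e.g. [{0, 1}, {1, 2}]
    Returns (total_staff, staff_per_shift) or None if infeasible.
    """
    best = None
    # Enumerate every integer assignment up to max_per_shift per pattern.
    for counts in itertools.product(range(max_per_shift + 1), repeat=len(shifts)):
        coverage = [sum(c for c, s in zip(counts, shifts) if hour in s)
                    for hour in range(len(demand))]
        if all(cov >= need for cov, need in zip(coverage, demand)):
            total = sum(counts)
            if best is None or total < best[0]:
                best = (total, counts)
    return best
```

The enumeration explodes combinatorially, which is exactly why production labor planning uses mixed-integer solvers; the constraint structure, though, is the same.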
    $73k-100k yearly est. 2d ago
  • Data Scientist

    Net2Source (N2S)

    Data engineer job in Miami, FL

    • Exhibits expertise across multiple products and an ability to build and maintain close relationships with stakeholders to drive efficiencies
    • Ability to thoroughly understand complex business and technical issues and influence decision making
    • Superior verbal and written communication skills
    • Consistently uses communication skills to influence outcomes
    • Ability to influence others without authority to get things done in a timely fashion
    • Ability to balance multiple priorities and meet deadlines
    • Strong knowledge of agile/scrum methodology and user-centric design; may be a certified product owner; may collaborate with PMO/scrum master on project methodology to highlight issues and help as needed throughout the product lifecycle
    $62k-91k yearly est. 4d ago
  • Senior Data Engineer

    Toorak Capital Partners

    Data engineer job in Tampa, FL

Company: Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business purpose residential, multifamily, and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.

Summary: The role of the Lead Data Engineer is to develop and implement high-performance, scalable data solutions in support of Toorak's data strategy.

Responsibilities:
- Lead data architecture for Toorak Capital.
- Lead efforts to create an API framework to use data across customer-facing and back-office applications.
- Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP, OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies.
- Lead sourcing and synthesis of data standardization and semantics discovery efforts, turning insights into actionable strategies that define the team's priorities and rally stakeholders to the vision.
- Lead data integration and mapping efforts to harmonize data.
- Champion standards, guidelines, and direction for ontology, data modeling, semantics, and data standardization in general at Toorak.
- Lead strategies and design solutions for a wide variety of use cases such as data migration (end-to-end ETL processes), database optimization, and data architecture for analytics projects.

Required Skills:
- Designing and maintaining data models, including conceptual, logical, and physical data models
- 5+ years of experience using NoSQL systems (MongoDB, DynamoDB), relational SQL database systems (PostgreSQL), and Athena
- 5+ years of experience in data pipeline development, ETL, and processing of structured and unstructured data
- 5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure such as Kafka/Pulsar
- Proficiency with data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
- Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform)
- Knowledge of cloud platforms and technologies such as Google Cloud Platform and Amazon Web Services
- Strong SQL skills
- Experience with API development and frameworks
- Knowledge of designing solutions for data quality, data lineage, and data catalogs
- Strong background in data science, machine learning, NLP, and text processing of large datasets
- Experience in one or more of the following is nice to have: Dataiku, DataRobot, Databricks, UiPath
- Use of version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
- Ability to rapidly comprehend changes to key business processes and their impact on the overall data framework
- Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change
- Advanced analytical skills
- High level of organization and attention to detail
- Self-starter attitude with the ability to work independently
- Knowledge of legal, compliance, and regulatory issues impacting data
- Experience in finance preferred
    $72k-99k yearly est. 5d ago
  • Epic Inpatient HIM OpTime Application Support Engineer - 6079187

    Accenture 4.7 company rating

    Data engineer job in Miami, FL

Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists. As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges. You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands of travel.

Key Responsibilities: The Epic resource will be responsible for implementing, building, maintaining, and optimizing Epic integration systems. They have knowledge of the Epic EMR software, operations, and workflow, and work closely with the project team's clinical leaders to translate business needs into EMR functionality and enhancements. The resource is expected to have Epic knowledge and prior experience working with various interfaces and related integrations, and to work with clinics to identify gaps and provide mutually agreeable solutions to close workflow gaps. The work involves design, building, testing, and implementation of Epic integration application systems; working with clinicians to create or adapt written protocols; and troubleshooting issues to provide solutions.

Basic Qualifications:
* Minimum 5 years of work experience
* Minimum 5 years of Healthcare and EHR experience, with a focus on Epic
* Epic Inpatient certification required
* Strong hands-on implementation experience in Inpatient modules such as HIM and OpTime
* Strong understanding of Inpatient workflows, clinical operations, and IT strategy
* Experience contributing to EHR implementation plans, scope, and timelines
* Excellent interpersonal skills with the ability to manage sensitive and confidential information with professionalism
* Ability to establish and maintain effective working relationships with diverse groups of clients, team members, managers, and vendors
* High School Diploma or GED

Preferred Qualifications:
* Epic Ambulatory experience

Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired, as set forth below. We accept applications on an ongoing basis and there is no fixed deadline to apply. Information on benefits is here.

Role Location and Hourly Salary Range:
* California: $47.85 to $57.85
* Cleveland: $47.85 to $57.85
* Colorado: $47.85 to $57.85
* District of Columbia: $47.85 to $57.85
* Illinois: $47.85 to $57.85
* Maryland: $47.85 to $57.85
* Massachusetts: $47.85 to $57.85
* Minnesota: $47.85 to $57.85
* New York/New Jersey: $47.85 to $57.85
* Washington: $47.85 to $57.85
    $47.9-57.9 hourly 3d ago
  • Preconstruction Engineer

    Ortega Construction Company

    Data engineer job in Miami, FL

🚧 Now Hiring: Preconstruction Engineer 🚧
🏗️ Preconstruction Engineer | Commercial Construction
📍 Miami, FL

Ortega Construction is a multifamily general contractor with a growing pipeline of work, seeking a motivated Preconstruction Engineer to join and support our Preconstruction team on commercial and high-rise projects.

Role Description: This role is ideal for someone early in their career who wants real exposure to the preconstruction process from start to finish: from takeoffs, subcontractor coordination, and budgeting to project contract and handoff. You'll work closely with the entire Preconstruction team and key members of Operations; your input will help set projects up for success before groundbreaking.

Responsibilities:
- Perform quantity takeoffs and assist with cost estimates
- Support subcontractor outreach, scope reviews, and bid leveling
- Assist with conceptual budgeting and value engineering
- Review drawings and specifications for completeness and risk
- Help prepare proposals and preconstruction deliverables
- Participate in Precon-to-Operations handoff meetings

Qualifications:
- 1-3 years of industry experience, or a Bachelor's degree in Construction Management, Architecture, Engineering, or a similar degree program in lieu of experience
- Commercial experience preferred but not required: multifamily high-rise and mid-rise (apartments/condominiums), mixed-use facilities, higher education, hospitality, charter schools
- South Florida market and subcontractor knowledge
- Familiarity with, or willingness to learn and become proficient with, the following technology: On Screen Takeoff (OST), Bluebeam/Adobe, SmartBid software, RS Means
- Strong Excel, Word, and PowerPoint skills
- Detail-oriented and organized, with strong follow-up skills and comfort working with deadlines
- Good communication and writing skills
- Fluency in English is required

Featured Benefits:
- Medical Insurance
- Phone and Wellness Reimbursement
- 401(k) Retirement Plan w/ matching
- Generous Paid Time Off (PTO)
- Paid Company Holidays
- Voluntary Dental & Vision Insurance
    $66k-90k yearly est. 4d ago
  • Applied AI Engineer

    Propy Inc.

    Data engineer job in Miami, FL

    Who We Are
    Propy is revolutionizing the real estate industry by building the world's first AI-powered Title and Escrow platform onchain. We have processed over $5B in transactions, and we are on a mission to make closing on a home as easy as buying a stock. We combine blockchain for security with advanced AI to automate the heavy lifting of closing documents. We aren't just "using" AI; we are building the infrastructure that allows AI agents to securely manage escrow, eliminate fraud, and run 24/7.
    The Role
    We are looking for a pragmatic Applied AI Engineer to join our engineering team. The role is not about training models and does not involve academic Machine Learning research. It is about building the rails that make AI usable in a high-stakes financial environment. You will bridge the gap between our robust C#/.NET architecture and the probabilistic world of LLMs.
    The Challenge
    Title and Escrow is a document-heavy industry with zero room for error. Your mission is to use AI to clean up the messiness of real-world real estate data. You will solve problems like:
    Structured Data Extraction: Converting messy, unstructured data (like emails, PDFs, documents) from various sources into strictly validated JSON schemas with as close to 100% accuracy as possible.
    Escrow Automation: Designing workflows that reduce human intervention by 50% by intelligently routing tasks based on AI analysis.
    Fraud Detection: Implementing deterministic logic checks on bank and financial documents to detect fraud patterns before they happen.
    What You'll Do
    Engineer the Integration: Write production-grade code that interacts with external AI APIs.
    "Prompt Engineering" as Code: You won't just write prompts; you will version, test, and optimize them. You will define strict schemas to ensure the AI speaks the language of our internal tools.
    Orchestrate & Validate: Help build the logic that parses AI responses, validates them against our database (MongoDB), and flags inconsistencies before they reach the user.
    Full-Stack Implementation: Work to visualize AI-aided services and data for user review and approval.
    Collaborate: Work closely with other senior engineers and product owners to translate complex "Title & Escrow" schemas into technical constraints that an AI can understand.
    What You'll Bring
    Developer DNA: You are a software engineer first. You have strong experience in Python (C#/.NET is an advantage) and understand programming in depth.
    Applied AI Experience: You have integrated LLMs into applications via API. You have experience not only with models but also with AI frameworks, workflows, and AI agent building and orchestration. You understand context windows, token limits, temperature, and guardrails.
    Data Handling: Experience handling complex data structures.
    The "Glue" Mindset: You enjoy writing the code that connects different services (like AWS, AI APIs, and databases) to make seamless features.
    Collaborative Autonomy: You will own the AI domain, but you won't be on an island. You will be embedded in a senior engineering team that supports you with architecture, code reviews, and best practices.
    Nice to Have
    Experience with AWS infrastructure. Familiarity with the US Real Estate, Title, or Escrow process.
    What We Offer
    Working in a transparent environment that focuses on solving problems and getting things done. The opportunity to work with very smart and driven people. The ability to grow your talents and career in a high-growth sector. A remuneration package based on the candidate's motivation, skills, and experience.
    How to Apply
    Please submit your resume to this job ad along with a portfolio of your AI-related experience, your GitHub account, and anything else you find applicable.
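The structured-extraction requirement described above, turning messy document text into strictly validated JSON, hinges on rejecting any AI output that fails schema checks before it reaches downstream systems. The sketch below illustrates the idea in plain Python; the schema fields and function name are hypothetical examples, not Propy's actual data model:

```python
import json

# Hypothetical schema for an escrow-document extraction. Field names are
# illustrative only, not any real product's data model.
ESCROW_SCHEMA = {
    "buyer_name": str,
    "purchase_price": float,
    "closing_date": str,
}

def validate_extraction(raw_llm_output: str) -> dict:
    """Parse an LLM response and enforce a strict schema before use.

    Raises ValueError on any missing field, unexpected field, or type
    mismatch, so malformed AI output is rejected rather than stored.
    """
    data = json.loads(raw_llm_output)
    for field, expected_type in ESCROW_SCHEMA.items():
        if field not in data:
            raise ValueError(f"missing required field: {field}")
        if not isinstance(data[field], expected_type):
            raise ValueError(
                f"wrong type for {field}: {type(data[field]).__name__}")
    extra = set(data) - set(ESCROW_SCHEMA)
    if extra:
        raise ValueError(f"unexpected fields: {sorted(extra)}")
    return data

good = ('{"buyer_name": "A. Smith", "purchase_price": 450000.0, '
        '"closing_date": "2025-06-01"}')
print(validate_extraction(good)["buyer_name"])
```

In a real pipeline the rejected payloads would be routed back for retry or human review rather than raised as bare exceptions.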
    $66k-90k yearly est. 3d ago
  • Senior GCP Data Architect

    Zensar Technologies 4.3company rating

    Data engineer job in Atlanta, GA

    Looking for a workplace where people realize their full potential, are recognized for the impact they make, and enjoy the company of the peers they work with? Welcome to Zensar! Read on for more details on the role and about us.
    Job Title: Senior Data Engineer / Data Architect
    Location: Atlanta, GA
    Overview of the Role
    The Senior Data Engineer / Data Architect will play a key role in designing, building, and optimizing modern data platforms and pipelines that support enterprise‑level analytical, personalization, and operational use cases. This role involves architecting scalable data lake solutions, modernizing legacy data processes, enabling identity resolution, orchestrating cloud‑based ETL workflows, and ensuring high standards of data quality, governance, and compliance. The ideal candidate brings hands‑on expertise in Google Cloud Platform (GCP), BigQuery, Airflow, Python, and enterprise data migration, along with strong experience supporting marketing, retail, and customer personalization ecosystems. The role requires strong collaboration skills, leadership in cross‑functional environments, and the ability to translate business requirements into scalable technical solutions.
    Key Responsibilities:
    Data Architecture & Engineering
    Architect and develop scalable data lake and data warehouse solutions using GCP (BigQuery, Cloud Storage) and other modern cloud technologies. Design end‑to‑end ETL/ELT workflows using Apache Airflow (Cloud Composer), Python, SQL, and Unix shell scripting. Build, optimize, and maintain high‑performance data models supporting analytics, personalization, and downstream business processes. Implement incremental and real‑time ingestion patterns (including delta loads, streaming, and True Delta‑based updates).
    Marketing & Personalization Data Ecosystem
    Integrate enterprise datasets with Adobe Experience Platform (AEP), Customer Journey Analytics (CJA), and Adobe Journey Optimizer (AJO). Implement identity resolution workflows using platforms such as Amperity, ensuring accuracy, governance, and privacy compliance. Develop suppression logic and orchestration workflows that coordinate customer journeys across marketing channels to prevent duplicate targeting.
    Data Migration & Modernization
    Lead migrations from legacy platforms (Oracle, DB2, on‑prem systems) to modern GCP‑based architectures. Re-engineer legacy ETL jobs (e.g., DataStage, PL/SQL pipelines) into scalable Python‑ and Airflow‑based cloud workflows. Conduct detailed source‑to‑target mappings, data cleansing, validation, and reconciliation for high‑volume migrations.
    Data Governance, Compliance & Quality
    Implement and enforce CCPA, data privacy standards, and security best practices across pipelines and platforms. Ensure high data quality through automated validation, proactive monitoring, and observability dashboards. Establish and maintain data lineage, metadata standards, and domain‑specific governance practices.
    Production Support & Operational Excellence
    Provide L3 support for complex data pipelines, ensuring stability, scalability, and optimized performance. Troubleshoot production issues, conduct root‑cause analysis, and implement preventive measures. Collaborate with cross‑functional teams (engineering, analytics, marketing, product) to support ongoing data initiatives.
    Leadership & Collaboration
    Work closely with onshore/offshore teams to manage priorities, guide development, and ensure timely delivery. Engage business stakeholders to understand requirements, define KPIs, and shape data solutions that align with strategic objectives. Provide architectural recommendations, technical mentorship, and thought leadership within the data engineering function.
    Qualifications
    Required Skills
    14+ years of experience in data engineering, data architecture, or related disciplines. Hands‑on experience with Google Cloud Platform (BigQuery, Dataflow, Composer/Airflow, Cloud Storage). Strong programming expertise in Python, SQL, and Unix shell scripting. Deep understanding of ETL/ELT pipelines, orchestration, and workflow automation. Experience with major databases such as Oracle, DB2, MySQL, and Cloud SQL. Proficiency with tools such as Control‑M, Git, JIRA, DBeaver, and cloud development consoles. Strong background in data modeling (conceptual, logical, physical). Experience integrating with marketing platforms such as AEP, CJA, AJO, or Amperity.
    Preferred Skills
    Experience in retail, e‑commerce, telecom, or customer analytics domains. Familiarity with Hadoop, Hive, or other big‑data technologies. Understanding of identity resolution workflows and customer 360 datasets. Exposure to ServiceNow, Splunk, Kibana, or similar monitoring platforms. Experience with Agile methodologies and enterprise SDLC practices.
    Education & Certifications
    Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. Certifications in GCP (e.g., GCP Data Engineer) or equivalent cloud certifications are a strong plus. Additional certifications in Agile, data governance, or related areas are a plus.
    Advantage Zensar
    We are a digital solutions and technology services company that partners with global organizations across industries to achieve digital transformation. With a strong track record of innovation, investment in digital solutions, and commitment to client success, at Zensar, you can help clients achieve new thresholds of performance. A subsidiary of RPG Group, Zensar has its HQ in India and offices across the world, including Mexico, South Africa, the UK, and the USA. Zensar is all about celebrating individuality, creativity, innovation, and flexibility. We hire based on values, talent, and the potential necessary to fill a given job profile, irrespective of nationality, sexuality, race, color, and creed. We also put in place policies to empower this assorted talent pool with the right environment for growth.
    At Zensar, you Grow, Own, Achieve, Learn. Learn more about our culture: *****************************************
    Ready to #ExperienceZensar? Begin your application by clicking on the 'Apply Online' button below. Be sure to have your resume handy! If you're having trouble applying, drop a line to ******************.
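The incremental ("delta") ingestion pattern mentioned in the responsibilities above can be sketched without any orchestration machinery: only rows newer than a saved watermark are pulled forward, and the watermark advances. In practice this selection would run as an Airflow/Cloud Composer task against BigQuery; the row shape and column names below are invented for illustration:

```python
from datetime import datetime

def delta_load(source_rows, watermark):
    """Return rows changed since the watermark, plus the new watermark.

    Rows are plain dicts with a hypothetical 'updated_at' column; a real
    pipeline would issue a filtered query instead of scanning in memory.
    """
    fresh = [r for r in source_rows if r["updated_at"] > watermark]
    # If nothing changed, the watermark stays where it was.
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 3, 1)},
    {"id": 3, "updated_at": datetime(2024, 5, 1)},
]
fresh, wm = delta_load(rows, datetime(2024, 2, 1))
print([r["id"] for r in fresh])  # only rows changed after the watermark
```

The persisted watermark (stored, for example, in a control table) is what makes successive runs idempotent and cheap relative to full reloads.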
    $77k-103k yearly est. 3d ago
  • BIM Engineer

    Plateau Excavation, Inc.

    Data engineer job in Kennesaw, GA

    Plateau Excavation is seeking a BIM Engineer to support the planning, coordination, and execution of large-scale sitework, infrastructure, and mission-critical projects across the Southeast. Reporting to the BIM Manager, this role focuses on developing accurate digital models and supporting data-driven workflows that improve constructability, coordination, and field execution. This is an in-office position based in Kennesaw, Georgia.
    Key Responsibilities
    Develop and maintain BIM and 3D models for earthwork, grading, utilities, and site logistics. Support estimating and preconstruction with model-based quantity takeoffs and analysis. Assist the BIM Manager with constructability reviews and design coordination. Coordinate BIM data with survey, engineering, project management, and field teams. Create and maintain machine control models and digital terrain models (DTMs). Identify and help resolve design conflicts and clashes prior to construction. Update models throughout the project lifecycle, including design revisions and as-built conditions. Support 4D scheduling and visualization as needed. Follow and help implement Plateau's BIM standards, workflows, and documentation. Provide technical support to project teams under the direction of the BIM Manager.
    Required Qualifications
    Bachelor's degree in Civil Engineering, Construction Management, Geomatics, or related field (or equivalent experience). 2+ years of experience in BIM, VDC, or digital construction, preferably in heavy civil or sitework. Proficiency with AutoCAD / Civil 3D, Navisworks, Revit (as applicable), and Trimble, Topcon, or similar construction technology platforms. Strong understanding of sitework and civil construction operations. Ability to read and interpret civil plans, specifications, and survey data. Strong communication and coordination skills.
    Preferred Qualifications
    Experience with BIM/VDC modeling. Experience on mission-critical, industrial, or large-scale infrastructure projects. Familiarity with drone data, point clouds, or reality capture. Field experience in heavy civil construction. Exposure to scheduling and 4D modeling.
    Why Join Plateau Excavation
    Direct mentorship and collaboration with Plateau's BIM leadership. In-office collaboration with experienced project, survey, and field teams. Opportunity to grow within Plateau's digital construction and technology program. Competitive compensation and benefits. A people-first culture built on safety, quality, and innovation.
    $63k-85k yearly est. 1d ago
  • Growth Architect & Revenue Engine Lead

    Medium 4.0company rating

    Data engineer job in Miami, FL

    A technology solutions company in the US seeks a Chief Growth Officer to lead the entire revenue strategy. The role involves owning sales strategies, building an outbound organization, and creating a predictable deal pipeline. The ideal candidate has over 8 years of experience in technology sales, a proven track record of closing large deals, and comfort in high-growth environments. This position offers competitive compensation, equity participation, and collaboration with the CEO and a high-performance team.
    $91k-136k yearly est. 1d ago
  • Senior Developer (Sitecore XM Cloud/Next.js) - Full-time - Atlanta, GA

    Elevate Digital 4.7company rating

    Data engineer job in Atlanta, GA

    We're looking for a senior-level developer to build, enhance, and support modern digital experiences on a headless Sitecore XM Cloud platform. This person will work across the full software lifecycle, from design through production support, helping deliver scalable, secure, high-performing frontend solutions. You'll also act as a technical go-to within the team and contribute to shared standards and continuous improvement.
    What you'll do
    Own end-to-end development tasks for multiple applications and services, including solution design, implementation, estimation, debugging, unit testing, and technical documentation. Serve as a trusted technical expert during discovery, analysis, development, and release phases. Partner with engineering leadership and stakeholders to deliver well-architected, enterprise-grade solutions that align with current priorities and future growth. Define and drive testing approaches, build thorough test plans, and ensure appropriate stress/load coverage for high-impact areas. Review code and components for adherence to architecture, quality, and engineering standards. Build in observability from day one, including logging, monitoring, and proactive alerting to reduce downtime. Maintain and enhance existing platforms, sites, and databases; identify improvements that increase stability and performance. Provide advanced production support as an escalation point, including participation in an on-call rotation for critical events outside normal hours. Diagnose and resolve complex live issues with persistence and clear follow-through until fully closed. Evaluate and recommend third-party tools or vendor solutions, leading selection and adoption when needed. Promote knowledge sharing through mentoring, documentation, and team collaboration. Apply secure development practices and remediate issues surfaced by security scans or audits. Contribute to other related responsibilities as business needs evolve. Build and connect Next.js (14/15+) frontends to Sitecore XM Cloud using Sitecore JSS, GraphQL, and TypeScript. Configure and improve CI/CD workflows for frontend delivery, including Vercel pipelines and global environment tuning. Use modern rendering patterns (SSG, SSR, ISR) to achieve fast, reliable UX and strong Google PageSpeed outcomes. Architect personalization and real-time content/data experiences with Sitecore CDP/Personalize and Sitecore Search. Work closely with product, UX, backend, and DevOps partners to troubleshoot and solve cross-system problems.
    Organizational impact
    Delivers daily results that materially affect operational outcomes within your domain. Works independently with minimal oversight and strong ownership of deliverables. May lead projects or key processes within your area of responsibility. Regularly reviews or coaches colleagues at earlier career stages.
    Leadership and talent contribution
    Provides practical guidance, mentoring, and technical coaching to teammates. May coordinate small projects, delegating tasks and reviewing output to ensure quality.
    Qualifications and experience
    Broad technical grounding gained through a university degree (or equivalent hands-on expertise). High school diploma or GED required. Typically 4-6 years of relevant professional development experience. Sitecore expertise: 5+ years building on Sitecore, including 2-3 years delivering production work on Sitecore XM Cloud (not legacy XM/XP). Headless/frontend depth: Advanced capability in Next.js, React, TypeScript, and GraphQL. Deployment & DevOps: Demonstrated success with Vercel deployments, performance tuning, and CI/CD platforms such as Azure DevOps or GitHub Actions. Core engineering skills: Mastery of Sitecore JSS, modern web architecture, and cloud-native development patterns.
    $90k-117k yearly est. 3d ago
  • Engineer

    Mindlance 4.6company rating

    Data engineer job in Tuscaloosa, AL

    Job Title: Engineer
    Duration: 2+ years
    Pay range: $32-$35/hr on W2 without benefits
    This position requires 0-5 years of related experience and an accredited bachelor's degree in Engineering. The successful candidate will be responsible for applying engineering processes, design criteria, and software applications to design and construct power-related systems. Supports estimate preparation, develops design options, and prepares construction specifications. The successful candidate will be responsible for managing assigned work in a territory in the client area. The primary job duties will be centered around providing safe, reliable, timely, and economical electric service to residential, commercial, and industrial customers. The successful candidate will meet with customers to determine their electric service needs and engineer jobs to provide service. This position will also be expected to assist with engineering jobs to improve reliability and perform routine maintenance on the electrical distribution system. Additionally, there may also be opportunities to engineer large projects such as roadway relocations, planned infrastructure improvements, and residential subdivisions. This position will also participate in after-hours and off-system storm restoration efforts.
    Job Experience and Education
    • A bachelor's degree in engineering from an ABET-accredited university/college is required.
    • Experience in engineering, design, and construction of electrical distribution systems is preferred but not required.
    Knowledge, Skills & Abilities
    • Knowledge and experience in computer applications such as CYME, JETS, CSS, DistGIS, SOCKET, ADMS/OMS, etc. preferred
    • Knowledge of distribution design, standards, and practice preferred
    • Excellent oral and written communication skills
    • Ability to handle multiple projects simultaneously and set priorities
    • Proven experience in creative problem solving
    • Effective time management skills
    • Ability to make sound engineering decisions during emergency situations
    • Ability to exercise a high level of leadership
    • Knowledge of and ability to apply safety and health rules
    • Knowledge of company policies/procedures, NESC, and NEC requirements
    • Ability to go out of area/state for storm restoration activities
    Behavioral Attributes
    • Ability to represent the Company in a professional manner.
    • Utilize sound engineering practices in providing reliable, economical, and timely electric service to customers.
    • Establish and maintain excellent customer relations.
    • Take ownership of assigned work.
    Other
    • It is required that the successful candidate for this position lives in or relocates within a reasonable commute to the primary work location in client.
    $32-35 hourly 4d ago
  • DATA SCIENTIST

    Reliant Technology 3.7company rating

    Data engineer job in Huntsville, AL

    Ignite is an ISO 9001:2015 and CMMI Services Level 3 certified, Service-Disabled Veteran-Owned Small Business (SDVOSB), headquartered in Huntsville, AL. By design, Ignite is a provider of professional services to customers in educational, federal, and commercial industries and in every action seeks to be the preeminent provider within this business space. Ignite upholds our values of competency, collaboration, innovation, reliability, and results through everything we do.
    Ignite is currently seeking a driven, detail-oriented Data Scientist to join our team supporting the Space Force Concepts and Technology Delta and Space Warfighter Analysis Center in Colorado Springs, CO. This position can be filled in either Colorado Springs, CO or Huntsville, AL. In this role you will develop and utilize data analysis tools to create complex data sets for evaluating the future of warfare in space.
    Job Requirements
    Responsibilities include, but are not limited to:
    * Development of models and simulation runs for logistics analysis
    * Development of software tools for data ingestion, storage, processing, and analysis
    * Development of advanced Machine Learning (ML) and Artificial Intelligence (AI) architectures and use cases in support of customer needs
    * Development of presentation materials and demonstrations for internal and external stakeholders
    * Development of Open-Source Intelligence (OSINT) tools
    * Development of advanced visualizations for data and network analyses
    Job Requirements and Qualifications:
    * One to five years of experience in data science or a related field
    * Strong Python knowledge and familiarity
    * Experience using FastAPI and ETL pipelines
    * Experience with ReactJS/JavaScript
    * Git/GitLab experience with version control
    * Data visualization using Matplotlib/Seaborn
    * OSINT tooling (Playwright/BeautifulSoup)
    * Familiarity with MongoDB or other NoSQL schemas
    * Microsoft Office skills
    * Basic professionalism
    Security Clearance Requirements: Must have an active Secret security clearance or the ability to obtain one. Top Secret clearance is a plus.
    Education Requirements: Bachelor of Science or Master of Science in Data Science, Data Engineering, Statistics, Software Engineering, Artificial Intelligence, or a related field.
    Other Requirements: Must be a US citizen and be able to obtain and hold an active security clearance.
    Salary Range: $90k - $140k
    We are equal opportunity/affirmative action employers, committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, disability, or protected veteran status, or any other protected characteristic under state or local law.
    Accommodation Request: If you are a qualified individual with a disability or are a disabled veteran and are unable or limited in your ability to use or access our Careers sites as a result of your disability, you have the right to receive assistance in completing the application process. Please send your request to **********************
    $90k-140k yearly 60d+ ago
  • Data Engineer

    Creative Financial Staffing 4.6company rating

    Data engineer job in Orlando, FL

    Data Engineer
    Salary: $100,000-$115,000 + Bonus
    Benefits: Competitive healthcare plans, 34 days of PTO, onsite gym, wellness programs, 401(k) match, etc.
    Summary of the Data Engineer: The Data Engineer will design, build, and maintain scalable data pipelines and platforms that support enterprise reporting, analytics, and data‑driven decision‑making. This role works closely with IT, data, and business teams to ensure data is accurate, accessible, secure, and ready for use across the organization.
    Here are a few reasons to apply: Outstanding benefits, including an onsite gym and 34 days of PTO. Leadership invested heavily in both the business and its people, fueling continued growth. A brand new, state-of-the-art office is currently being built.
    Key Responsibilities of the Data Engineer: Support and enhance the Enterprise Data Warehouse, ensuring reliable and accurate data flows. Design and implement on‑prem or cloud‑based data platforms, integrating enterprise data sources. Build and maintain ETL/ELT pipelines to ingest, transform, and optimize data. Develop and maintain data models for reporting, analytics, and advanced use cases (AI/ML). Implement data governance, quality, security, and compliance best practices. Enable self‑service analytics and real‑time data processing using modern data technologies. Collaborate with IT, analytics, and business teams to translate data needs into technical solutions. Mentor junior team members and contribute to data architecture and modernization initiatives.
    Preferred Experience of the Data Engineer: Degree in Information Systems, Data Science, or a related field. 5+ years of experience in data engineering, data architecture, or related roles. Strong experience with Azure‑based data platforms and lakehouse architectures. Proficiency in ETL/ELT tools (SSIS, Azure Data Factory) and SQL‑based databases. Hands‑on experience with big data and streaming technologies (Databricks, Spark, Kafka, etc.).
Familiarity with BI tools such as Power BI, Tableau, or Looker. Working knowledge of data governance, security, and compliance frameworks. Experience using Jira or similar ticketing systems.
    Bonus Experience of the Data Engineer: Experience with Guidewire InsuranceSuite data models (PolicyCenter, BillingCenter, ClaimCenter) preferred. Experience with Data Mesh architecture. Exposure to machine learning / MLOps. Azure Data Engineer certification.
    $83k-117k yearly est. 1d ago
  • ETL Architect

    Healthplan Services 4.7company rating

    Data engineer job in Tampa, FL

    HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform, and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution, and technology services to insurers of individual, small group, voluntary, and association plans, as well as valuable solutions to thousands of brokers and agents nationwide.
    Job Description
    Position: ETL Architect
    The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
    Essential Job Functions and Duties:
    Develop and maintain ETL jobs for data warehouses/marts. Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices. Collaborate with delivery and technical team members on design and development. Collaborate with business partners to understand business processes, underlying data, and reporting needs. Conduct data analysis in support of ETL development and other activities. Assist with data architecture and data modeling.
    Preferred Qualifications:
    12+ years of work experience as a Business Intelligence Developer. Work experience with multiple database platforms and BI delivery solutions. 10+ years of experience with end-to-end ETL architecture and data modeling for BI and analytics data marts, implementing and supporting production environments. 10+ years of experience designing, building, and implementing BI solutions with modern BI tools such as MicroStrategy, Microsoft, and Tableau. Experience as a Data Architect. Experience delivering BI solutions with an Agile BI delivery methodology. Ability to communicate, present, and interact comfortably with senior leadership. Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights. Strong team player. Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions. Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value. Strong relationship-building and interpersonal skills. Demonstrated self-confidence, honesty, and integrity. Conscientious of the Enterprise Data Warehouse release management process; conducts operations readiness and environment compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLAs. Experience with data modeling tools a plus. Expertise in data warehousing methodologies and best practices required. Ability to initiate and follow through on complex projects of both short- and long-term duration required. Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed. Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment. Participates on interdepartmental teams to support organizational goals. Performs other related duties and tasks as assigned. Experience facilitating user sessions and gathering requirements.
    Education Requirements: Bachelor's or equivalent degree in a business, technical, or related field.
    Additional Information: All your information will be kept confidential according to EEO guidelines.
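The source-to-target mapping documents mentioned above can be treated as data: a table that drives the ETL transform rather than hand-written per-column code. This toy Python sketch shows the pattern; the column names and transforms are invented examples, not any real HPS mapping:

```python
# Each target column maps to a (source column, transform) pair. In a real
# warehouse this table would be generated from the mapping design document.
SOURCE_TO_TARGET = {
    "member_id":   ("MBR_ID",   str.strip),   # trim fixed-width padding
    "plan_name":   ("PLAN_NM",  str.title),   # normalize casing
    "premium_usd": ("PREM_AMT", float),       # cast text amount to number
}

def map_row(source_row):
    """Apply the source-to-target mapping to one source record."""
    return {
        target: transform(source_row[source])
        for target, (source, transform) in SOURCE_TO_TARGET.items()
    }

row = {"MBR_ID": " 1001 ", "PLAN_NM": "gold ppo", "PREM_AMT": "249.50"}
print(map_row(row))
```

Keeping the mapping declarative like this makes reconciliation straightforward: the same table can drive both the transform and the validation report.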
    $84k-105k yearly est. 60d+ ago
  • Data Scientist I

    Optimal Solutions and Technologies 3.3company rating

    Data engineer job in Orlando, FL

    Optimal Solutions & Technologies (OST, Inc.) is focused on excellence. We specialize in providing Management Consulting, Information Technology, and Research Development and Engineering services. The fundamental distinction of the OST team is its business knowledge in both the public and private sectors. We serve the aerospace & transportation, association & nonprofit, defense, education, energy, financial, healthcare, and technology & telecommunications industries. OST is successful because we listen to our clients, we learn from our clients, and we know our clients.
    Data Scientist I
    Description of specific duties in a typical workday for this position:
    * Applies statistics and data science methods to develop insights, prototypes, and decision-support analyses that improve CPE-STRI program performance and modernization outcomes, as well as supporting Agile solution implementation by providing proactive metrics analysis (e.g., velocity and capacity forecasting).
    * Works with senior data scientists and engineers to prepare data, run experiments, and communicate results to technical and non-technical stakeholders.
    Key Responsibilities:
    * Perform exploratory data analysis, feature engineering, and baseline model development.
    * Develop reproducible notebooks/scripts and document assumptions, methods, and results.
    * Support model validation, bias checks, and monitoring plans.
    * Collaborate with engineers to transition prototypes into deployable pipelines where appropriate.
    Requirements (years of experience, education, certifications):
    * 5 years of experience with a BA/BS (or equivalent).
    * Experience with Python/R and common statistical/ML libraries; comfortable communicating analytic results.
    * Familiarity with data ethics, validation, and reproducibility practices.
    * Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, Engineering, or a related field.
    * Active SECRET clearance strongly desired; ability to obtain and maintain SECRET (or higher) required. Task-order DD254 may require higher clearance (e.g., TS/SCI).
    Nice to Have (skills that are not required, but nice to have):
    * Exposure to operational datasets and performance metrics in DoD or large enterprise contexts
    * Experience with visualization tools to communicate results
    * Experience working with protected data and RBAC
    This is a full-time position paying a base salary with full benefits and possible bonus potential based on merit and performance. To be considered for this position, please apply online with a resume.
    OST is an equal opportunity employer. Applicants are considered for positions without regard to race, religion, gender, national origin, age, disability, or any other category protected by applicable federal, state, or local laws.
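As a small illustration of the baseline feature-engineering work listed above, z-score standardization is a common first step before model development. The sketch below uses only the Python standard library, and the sample values are invented:

```python
import statistics

def zscores(values):
    """Standardize values to mean 0 and sample standard deviation 1."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)  # sample stdev (n-1 denominator)
    return [(v - mean) / stdev for v in values]

# Hypothetical sprint-velocity observations for a capacity forecast.
velocity = [21, 25, 19, 23, 22]
standardized = zscores(velocity)
print(standardized)  # centered near 0, spread near 1
```

Documenting which scaler (and which denominator convention) was used is exactly the kind of reproducibility note the role calls for.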
    $65k-88k yearly est. 13d ago

Learn more about data engineer jobs

How much does a data engineer earn in Albany, GA?

The average data engineer in Albany, GA earns between $65,000 and $114,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Albany, GA

$86,000