Data Analyst
Data analyst job in Irving, TX
Title: Data Analyst
Type: Full-Time
About the Role:
Optimize Search Group is seeking a hands-on Data Analyst who can own reporting, dashboards, and business intelligence across sales, service, customer, and operational metrics for our client. This role is highly collaborative, generative in nature, and requires someone who can work directly with executives and cross-functional stakeholders to translate business needs into clean, accurate, and actionable analytics.
You will be responsible for gathering requirements, building mockups, validating data, and ensuring the right data flows into dashboards used for business-critical decisions.
Key Responsibilities
Partner with executives and business stakeholders to understand reporting needs and define analytical requirements.
Develop interactive dashboards, metrics, and visualizations across sales, customer success, support/service cases, product usage, and operational KPIs.
Create mockups and prototypes to validate requirements before full build-out.
Build and optimize SQL queries to pull, clean, transform, and validate datasets.
Ensure accurate data ingestion and pipeline alignment in Snowflake.
Maintain and enhance reporting inside Sigma BI.
Own the full lifecycle of dashboard and report creation: requirements, data modeling, build, QA, release, and ongoing improvements.
Improve visibility into customer metrics, SaaS KPIs, funnel analytics, renewal forecasting, and churn/retention indicators.
Troubleshoot data discrepancies, fix schema or join issues, and ensure source-of-truth accuracy across systems.
Present insights and recommendations to leadership in clear, consumable formats.
Required Qualifications
3-6 years of experience as a Data Analyst, BI Analyst, or similar data role.
Strong proficiency in SQL with the ability to write complex joins, CTEs, window functions, and performance-optimized queries.
Experience with Snowflake or similar cloud data warehouses.
Strong data visualization and dashboard development skills.
Experience working directly with executive leadership and multiple business stakeholders.
Strong communication skills, including the ability to gather requirements, explain data concepts, and present findings clearly.
Hands-on experience building KPIs for sales, customer success, service operations, or SaaS metrics.
Comfortable working in a 5-day on-site environment (Irving, TX).
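As a minimal taste of the SQL bar described above (complex joins, CTEs, window functions), here is a sketch using Python's built-in sqlite3 module; the sales table and figures are invented for illustration, and SQLite stands in for a warehouse like Snowflake:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (rep TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('alice', '2024-01', 100), ('alice', '2024-02', 150),
  ('bob',   '2024-01', 200), ('bob',   '2024-02', 50);
""")

# A CTE plus a window function: per-rep running total by month
rows = conn.execute("""
WITH monthly AS (
  SELECT rep, month, SUM(amount) AS total
  FROM sales GROUP BY rep, month
)
SELECT rep, month, total,
       SUM(total) OVER (PARTITION BY rep ORDER BY month) AS running_total
FROM monthly ORDER BY rep, month
""").fetchall()

for r in rows:
    print(r)
```

The `PARTITION BY ... ORDER BY` clause computes a cumulative sum within each rep without collapsing rows, which is the pattern behind most funnel and retention dashboards.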
Preferred Qualifications
Experience in a SaaS environment or subscription-based business model.
Experience with Sigma BI (preferred), or advanced proficiency in one or more of the following:
Tableau
Power BI
Qlik
Looker
Other industry BI visualization tools
Experience with generative analysis (building insights from ambiguity or open-ended business questions).
Experience creating dashboards from scratch: mockups, wireframes, and end-to-end execution.
What Makes This Role Exciting
You'll have direct access to executives and decision-makers.
Your dashboards will directly influence sales strategy, customer insights, and operational improvements.
A high-impact environment where quality, accuracy, and creativity are valued.
Opportunity to shape BI standards, tools, and data quality across the organization.
Business System Analyst
Data analyst job in Dallas, TX
Duration: Contract
We are seeking a skilled Business System Analyst to join our team on an initial 5-month contract. The selected candidate will have enterprise-wide responsibility for capturing requirements for projects impacting the Product Catalog for video promotions and content. This role involves serving as the Technology Development liaison and primary point of contact for requestors, whether Business or IT, and working across multiple organizations to align business strategies and functional architectures.
Responsibilities:
Maintain and support relationships with key stakeholders, actively managing their expectations and monitoring satisfaction levels.
Engage and capture the needs of a diverse group of stakeholders enterprise-wide.
Document all requirements for requests using approved templates and delivery plans.
Ensure agreed services are delivered to meet requirements and requestor expectations.
Research issues and collaborate with development teams to resolve them.
Recommend business solutions or alternatives as needed.
Act as a liaison to clients and other IT organizations as a subject matter expert on business processes.
Track and monitor requirements traceability to project deliverables.
Provide Project Managers with time and resource estimates for requirements.
Qualifications:
Proven experience as a Business System Analyst or in a similar role.
Strong ability to manage stakeholder relationships and expectations.
Excellent documentation skills using approved templates and tools.
Ability to research and resolve issues effectively in collaboration with development teams.
Experience in recommending business solutions and alternatives.
Strong understanding of business processes and functional architectures.
Proficiency in tracking and monitoring requirements traceability.
Excellent communication and organizational skills.
About PTR Global: PTR Global is a leading provider of information technology and workforce solutions. PTR Global has become one of the largest providers in its industry, with over 5000 professionals providing services across the U.S. and Canada. For more information visit *****************
At PTR Global, we understand the importance of your privacy and security. We NEVER ASK job applicants to:
Pay any fee to be considered for, submitted to, or selected for any opportunity.
Purchase any product, service, or gift cards from us or for us as part of an application, interview, or selection process.
Provide sensitive financial information such as credit card numbers or banking information. Successfully placed or hired candidates would only be asked for banking details after accepting an offer from us during our official onboarding processes as part of payroll setup.
Pay Range: $50 - $70
The specific compensation for this position will be determined by a number of factors, including the scope, complexity and location of the role as well as the cost of labor in the market; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits including medical, dental, vision and 401K contributions as well as any other PTO, sick leave, and other benefits mandated by applicable states or localities where you reside or work.
If you receive a suspicious message, email, or phone call claiming to be from PTR Global do not respond or click on any links. Instead, contact us directly at ***************. To report any concerns, please email us at *******************
Power BI Reporting Analyst
Data analyst job in Farmers Branch, TX
Founded in 2008, The Fay Group is a diversified real estate services company offering a complete range of home ownership products and services to include mortgage servicing, property renovations, property management, realty, business purpose lending, and insurance to homeowners, investors and clients nationwide. We consider the people behind those mortgages and work hard to give them the best opportunity to stay in their homes by providing solutions to navigate the challenges of homeownership while working toward their long-term financial goals.
We are currently looking for a Client Reporting Analyst to join our team.
Reporting to the EVP, Asset Management, this position plays a vital role in supporting investor relations by preparing and delivering a comprehensive suite of monthly accounting reports tailored to the needs of assigned clients. This position compiles and validates financial data, ensuring timely and accurate reporting, and maintaining compliance with client-specific requirements and regulatory standards.
This position processes loan transfers and performs detailed reconciliation of servicing transactional data. Additionally, this role also serves as a point of escalation for complex issues, including discrepancies in loan data and reporting anomalies, requiring resolution beyond standard servicing procedures.
The ideal candidate must be able to complete all physical requirements of the job with or without a reasonable accommodation. While performing the duties of this job, the employee is required to sit as well as work at a computer for extended periods of time, utilizing a keyboard and mouse. The employee should be able to communicate by telephone, email, and face-to-face.
Qualifications Include:
Bachelor's degree in Business or related field (or equivalent combination of years of experience with High School Diploma/GED) required
3+ years' experience in Investor Reporting or Investor Accounting required
3+ years' experience with data visualization platforms like Power BI, Tableau, or similar
2+ years' experience in the mortgage industry, including servicing, processing rules, and guidelines
Familiarity with custodial account reconciliation and expected cash testing is preferred
Proven ability to work effectively in a fast-paced, deadline-driven environment
Experience with Black Knight MSP preferred
Strong skills in the MS Office Suite with advanced Excel skills for data compilation and analysis
Strong verbal and written communication skills
Strong interpersonal skills
Strong analytical skills
Strong problem-solving, data collection, analysis, and decision-making skills
Solid decision-making abilities coupled with sound judgment
Strong time management skills
Ability to prioritize numerous tasks and manage shifting priorities
Client-focused with strong execution skills and results orientation
High level of precision with attention to detail and consistency
Flexible, open to change, and able to learn new things quickly
Ability to work in a collaborative environment and provide guidance for working groups
Featured Benefits
Medical, Dental, and Vision Insurance
Company Paid Life Insurance
Disability Insurance
Pet Insurance
401(k) Program with Employer Matching
3 Weeks Paid Time Off (PTO)
Paid Holidays
Wellness Initiatives
Employee Assistance Program
Eligible for Hybrid Work Schedule with Remote Flex Days
Compensation
The hiring range for this position is between $75,000.00-$90,000.00 annually
This position is eligible for an annual discretionary bonus
Fay Cares!
The Fay-Constructive Foundation was established to fulfill the philanthropic mission of The Fay Group employees to serve the communities in which they live and work. Our employees make voluntary contributions to the Foundation. Each quarter, their contributions are donated to organizations focused on improving education opportunities, combating poverty, and supporting military service members and first responders.
At Fay, we believe that the best ideas come from having a team that is diverse in backgrounds, experiences, and perspectives. We strive to ensure each of our employees feels valued, respected, and included, and is presented with equal opportunities to be successful. Fay is an equal-opportunity workplace. The Fay Group and affiliated companies participate in E-Verify. For more information, go to *********************
Data Modeler
Data analyst job in Plano, TX
Plano, TX - nearby candidates only
W2 Candidates
Must Have:
5+ years of experience with data modeling, warehousing, analysis & data profiling experience and ability to identify trends and anomalies in the data
Experience on AWS technologies like S3, AWS Glue, EMR, and IAM roles/permissions
Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, SparkSQL, Scala)
Experience working with relational databases such as Teradata and handling both structured and unstructured datasets
Data modeling tools (any of Erwin, PowerDesigner, ER/Studio)
Preferred / Ideal to have -
Proficiency in Python
Experience with NoSQL, non-relational databases / data stores (e.g., object storage, document or key-value stores, graph databases, column-family databases)
Experience with Snowflake and Databricks
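The profiling skill called out above (identifying trends and anomalies in the data) can be sketched minimally with the standard library; the row counts and the 2-sigma threshold below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical daily row counts from a data-profiling run;
# one day's load is clearly out of line with the rest
counts = [1010, 995, 1003, 990, 1008, 2500, 1001]

mu, sigma = mean(counts), stdev(counts)

# Flag values more than 2 sample standard deviations from the mean
anomalies = [c for c in counts if abs(c - mu) > 2 * sigma]
print(anomalies)
```

In practice the same check would run against warehouse metadata (row counts, null rates, distinct counts) rather than a hardcoded list, but the z-score idea carries over directly.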
Data Scientist with Python ML/NLP
Data analyst job in Addison, TX
Role: Data Scientist with Python ML/NLP
Years of experience: 10+
Full-time
Job Responsibilities:
We're looking for a Data Scientist who will be responsible for designing, building, and maintaining document capture applications. The ideal candidate will have a solid background in software engineering, experience building machine learning/NLP models, and good familiarity with generative AI models.
High Level Skills Required
Primary: 7+ years as a Data Scientist or in related roles
Bachelor's degree in Computer Science or a related technical field
Deep understanding of, and some exposure to, new open-source generative AI models
At least 5 years programming experience in software development and Agile process
At least 5 years Python (or equivalent) programming experience to work with ML/NLP models.
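For a small, self-contained flavor of the NLP side of document capture, here is a bag-of-words cosine similarity using only the standard library; the example documents are invented:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Toy documents: two invoice-like texts and one unrelated text
doc1 = Counter("invoice total due date invoice number".split())
doc2 = Counter("invoice due date amount".split())
doc3 = Counter("weather forecast sunny".split())

sim12 = cosine(doc1, doc2)  # high: shared vocabulary
sim13 = cosine(doc1, doc3)  # zero: no terms in common
print(round(sim12, 3), round(sim13, 3))
```

Real document-capture pipelines would layer tokenization, embeddings, and trained models on top, but this is the similarity primitive underneath many retrieval and classification steps.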
Data Scientist (F2F Interview)
Data analyst job in Dallas, TX
W2 Contract
Dallas, TX (Onsite)
We are seeking an experienced Data Scientist to join our team in Dallas, Texas. The ideal candidate will have a strong foundation in machine learning, data modeling, and statistical analysis, with the ability to transform complex datasets into clear, actionable insights that drive business impact.
Key Responsibilities
Develop, implement, and optimize machine learning models to support business objectives.
Perform exploratory data analysis, feature engineering, and predictive modeling.
Translate analytical findings into meaningful recommendations for technical and non-technical stakeholders.
Collaborate with cross-functional teams to identify data-driven opportunities and improve decision-making.
Build scalable data pipelines and maintain robust analytical workflows.
Communicate insights through reports, dashboards, and data visualizations.
Qualifications
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field.
Proven experience working with machine learning algorithms and statistical modeling techniques.
Proficiency in Python or R, along with hands-on experience using libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow.
Strong SQL skills and familiarity with relational or NoSQL databases.
Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib).
Excellent problem-solving, communication, and collaboration skills.
Business Analyst - Infrastructure
Data analyst job in Coppell, TX
MUST BE a US Citizen or Green Card holder; no third-party vendors
5-7+ years of experience as a Business Analyst, preferably in cloud enablement and infrastructure delivery
Experience setting up or migrating physical network servers for applications (in a BA capacity)
Strong knowledge of cloud platforms (AWS, Azure) and cloud service lifecycle management.
Excellent analytical, communication, and stakeholder management skills.
Familiarity with financial tracking for cloud consumption and cost optimization strategies is a plus.
Experience working on multiple projects at once.
Strong communication skills and experience working with developers
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Data Architect
Data analyst job in Plano, TX
KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered over 1,000 projects to our clients. We are looking for skilled data engineers who want to work with the best team in data engineering.
Title: Senior Data Architect
Location: Plano, TX (Hybrid)
Job Type: Contract - 6 Months
Key Skills: SQL, PySpark, Databricks, and Azure Cloud
Key Note: Looking for a Data Architect who is Hands-on with SQL, PySpark, Databricks, and Azure Cloud.
About the Role:
We are seeking a highly skilled and experienced Senior Data Architect to join our dynamic team at KPI, working on challenging and multi-year data transformation projects. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions using Azure Native Services and related technologies. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.
Key Responsibilities:
Data Engineering: Design, develop, and implement data pipelines and solutions using PySpark, SQL, and related technologies.
Collaboration: Work closely with cross-functional teams to understand business requirements and translate them into robust data solutions.
Data Warehousing: Design and implement data warehousing solutions, ensuring scalability, performance, and reliability.
Continuous Learning: Stay up to date with modern technologies and trends in data engineering and apply them to improve our data platform.
Mentorship: Provide guidance and mentorship to junior data engineers, ensuring best practices in coding, design, and development.
Must-Have Skills & Qualifications:
12+ years of overall experience in the IT industry.
4+ years of experience in data engineering, with a strong background in building large-scale data solutions.
4+ years of hands-on experience developing and implementing data pipelines using the Azure stack (Azure, ADF, Databricks, Functions)
Proven expertise in SQL for querying, manipulating, and analyzing large datasets.
Strong knowledge of ETL processes and data warehousing fundamentals.
Self-motivated and independent, with a “let's get this done” mindset and the ability to thrive in a fast-paced and dynamic environment.
Good-to-Have Skills:
Databricks Certification is a plus.
Data Modeling, Azure Architect Certification.
Application Support Analyst
Data analyst job in Fort Worth, TX
We are looking for a tech-savvy problem solver for an Application Support Analyst role to assist in maintaining and configuring mission-critical systems that power our operations.
The Application Support Analyst is responsible for providing technical support, configuration, and maintenance for business applications, ensuring optimal performance and user satisfaction.
This role involves troubleshooting issues, writing reports, coordinating with different departments, and delivering high-quality support to end-users in a fast-paced environment.
Duties and Responsibilities:
Application Support: Monitor, configure, troubleshoot, and resolve issues related to enterprise applications, ensuring end user support and efficient system performance.
Incident Management: Respond to and resolve end users' requests in a timely manner, escalating complex issues to senior technical teams when necessary.
User Assistance: Provide guidance, configuration and training to end-users on application functionality, ensuring effective use of systems.
System Maintenance: Perform regular maintenance tasks, including software updates and configuration changes to ensure system reliability.
Data Integrity and Visualizations: Create reports and leverage data visualization tools. Assist in application configuration to promote data integrity through data entry. Provide data integration from acquisitions into industry-specific applications.
Documentation: Create and maintain detailed documentation related to business processes, issue resolutions, and system configurations.
Collaboration: Work closely with infrastructure team, end-users and application vendors to identify and implement application improvements.
Root Cause Analysis: Investigate recurring issues to identify root causes and recommend long-term solutions to prevent future occurrences.
Monitoring and Reporting: Utilize monitoring tools to track application performance and generate reports for stakeholders.
Compliance: Ensure applications adhere to organizational security policies and audit requirements.
Knowledge, Skills, and Abilities
Bachelor's/University degree or equivalent experience in the oil and gas industry preferred
1+ years of hands-on SQL experience preferred
Strong SQL skills - SQL queries, stored procedures, views, and SQL Agent Jobs
Data Visualization Tools - Report Writing: Power BI, Spotfire, SSRS
Application support expertise - Proficiency in troubleshooting software applications and understanding of IT Systems (e.g., Windows or cloud-based environments), vendor management and root cause analysis
Industry specific software - Preferred knowledge of Aries, Wellview/Siteview, Prodview, TabFusion, Quorum, Conduit, CygNet, Petra, ArcGIS
Technical knowledge - Relational Databases, ETL Processes, SSIS, API, XML
Coding - Some experience in writing and interpreting scripts, PowerShell, Python
Business process mindset - translate operational needs into technical solutions
Teamwork - Ability to work in a team environment and learn new skills quickly with little supervision
Personal skills - Communication, self-study, and a desire to grow your knowledge base and career
Equal Opportunity Employer
Prospective employees will receive consideration without discrimination because of race, color, religion, marital status, sex (including pregnancy, gender identity, and sexual orientation), national origin, age, veteran status, disability, or genetic information.
Azure Data Architect
Data analyst job in Dallas, TX
About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to 700+ clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Job Title: Data Architect
Work Location
Dallas, Texas
Job Description:
The ideal candidate will have a good understanding of big data technologies, data engineering, and cloud computing DWH projects, with a focus on Azure Databricks.
Work closely with business stakeholders and other IT teams to understand requirements and define the scope for engagements with reasonable timelines
Ensure proper documentation of architecture processes while ensuring compliance with security and governance standards
Ensure best practices are followed by the team in terms of code quality, data security, and scalability
Stay updated with the latest developments in Databricks and associated technologies to drive innovation
12 years of experience, along with 5 years of data analytics project experience
Experience with Azure Databricks notebook development and Delta Lake
Good understanding of Azure services like Azure Data Lake, Azure Synapse, Azure Data Factory, and Fabric
Experience with ETL/ELT processes, data warehousing, and building data lakes
SQL skills and familiarity with NoSQL databases
Experience with CI/CD pipelines and version control systems like Git
Soft Skills
Excellent communication skills with the ability to explain complex technical concepts to nontechnical stakeholders
Strong problem-solving skills and a proactive approach to identifying and resolving issues
Leadership skills with the ability to manage and mentor a team of data engineers
Power BI for dashboarding and reporting
Microsoft Fabric for analytics and integration tasks
Spark Streaming for processing real-time data streams
Over 12 years of IT experience, including 4 years specializing in developing data ingestion and transformation pipelines using Databricks, Synapse notebooks, and Azure Data Factory
Good understanding of different industry domains with respect to data analytics and DWH projects
Proficient in Excel and PowerPoint
Good understanding of and experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2
Experience in building and optimizing query layers using Databricks SQL
Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation like an annual performance-based bonus, sales incentive pay and other forms of bonus or variable compensation.
Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Senior Data Engineer
Data analyst job in Plano, TX
Ascendion is a full-service digital engineering solutions company. We make and manage software platforms and products that power growth and deliver captivating experiences to consumers and employees. Our engineering, cloud, data, experience design, and talent solution capabilities accelerate transformation and impact for enterprise clients. Headquartered in New Jersey, our workforce of 6,000+ Ascenders delivers solutions from around the globe. Ascendion is built differently to engineer the next.
Ascendion | Engineering to elevate life
We have a culture built on opportunity, inclusion, and a spirit of partnership. Come, change the world with us:
Build the coolest tech for the world's leading brands
Solve complex problems - and learn new skills
Experience the power of transforming digital engineering for Fortune 500 clients
Master your craft with leading training programs and hands-on experience
Experience a community of change makers!
Join a culture of high-performing innovators with endless ideas and a passion for tech. Our culture is the fabric of our company, and it is what makes us unique and diverse. The way we share ideas, learning, experiences, successes, and joy allows everyone to be their best at Ascendion.
*** About the Role ***
Job Title: Senior Data Engineer
Key Responsibilities:
Design, develop, and maintain scalable and reliable data pipelines and ETL workflows.
Build and optimize data models and queries in Snowflake to support analytics and reporting needs.
Develop data processing and automation scripts using Python.
Implement and manage data orchestration workflows using Airflow, Airbyte, or similar tools.
Work with AWS data services including EMR, Glue, and Kafka for large-scale data ingestion and processing.
Ensure data quality, reliability, and performance across data pipelines.
Collaborate with analytics, product, and engineering teams to understand data requirements and deliver robust solutions.
Monitor, troubleshoot, and optimize data workflows for performance and cost efficiency.
Required Skills & Qualifications:
8+ years of hands-on experience as a Data Engineer.
Strong proficiency in SQL and Snowflake.
Extensive experience with ETL frameworks and data pipeline orchestration tools (Airflow, Airbyte, or similar).
Proficiency in Python for data processing and automation.
Hands-on experience with AWS data services, including EMR, Glue, and Kafka.
Strong understanding of data warehousing, data modeling, and distributed data processing concepts.
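The pipeline responsibilities above can be sketched in miniature with the standard library; the CSV contents, table, and column names are invented, and SQLite stands in for Snowflake:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (a file or API response in practice; inline here)
raw = io.StringIO("order_id,amount,country\n1,10.5,US\n2,,US\n3,7.0,CA\n")
rows = list(csv.DictReader(raw))

# Transform: drop records missing an amount, normalize types
clean = [
    {"order_id": int(r["order_id"]), "amount": float(r["amount"]), "country": r["country"]}
    for r in rows
    if r["amount"]
]

# Load: write into a warehouse-like table, then verify with an aggregate
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")
db.executemany("INSERT INTO orders VALUES (:order_id, :amount, :country)", clean)

total = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)
```

An orchestrator like Airflow would schedule each of these extract/transform/load steps as separate tasks with retries and dependencies, but the data flow is the same shape.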
Nice to Have:
Experience working with streaming data pipelines.
Familiarity with data governance, security, and compliance best practices.
Experience mentoring junior engineers and leading technical initiatives.
Salary Range: The salary for this position is between $130,000 - $140,000 annually. Factors which may affect pay within this range include geography/market, skills, education, experience, and other qualifications of the successful candidate.
Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: [medical insurance] [dental insurance] [vision insurance] [401(k) retirement plan] [long-term disability insurance] [short-term disability insurance] [5 personal days accrued each calendar year. The paid time off benefits meet the paid sick and safe time laws that pertain to the applicable city/state] [10-15 days of paid vacation time] [6 paid holidays and 1 floating holiday per calendar year] [Ascendion Learning Management System]
Want to change the world? Let us know.
Tell us about your experiences, education, and ambitions. Bring your knowledge, unique viewpoint, and creativity to the table. Let's talk!
Senior Data Engineer
Data analyst job in Dallas, TX
About Us
Longbridge Securities, founded in March 2019 and headquartered in Singapore, is a next-generation online brokerage platform. Established by a team of seasoned finance professionals and technical experts from leading global firms, we are committed to advancing financial technology innovation. Our mission is to empower every investor by offering enhanced financial opportunities.
What You'll Do
As part of our global expansion, we're seeking a Data Engineer to design and build batch/real-time data warehouses and maintain data platforms that power trading and research for the US market. You'll work on data pipelines, APIs, storage systems, and quality monitoring to ensure reliable, scalable, and efficient data services.
Responsibilities:
Design and build batch/real-time data warehouses to support the US market growth
Develop efficient ETL pipelines to optimize data processing performance and ensure data quality/stability
Build a unified data middleware layer to reduce business data development costs and improve service reusability
Collaborate with business teams to identify core metrics and data requirements, delivering actionable data solutions
Discover data insights through collaboration with business owners
Maintain and develop enterprise data platforms for the US market
Qualifications
7+ years of data engineering experience with a proven track record in data platform/data warehouse projects
Proficient in Hadoop ecosystem (Hive, Kafka, Spark, Flink), Trino, SQL, and at least one programming language (Python/Java/Scala)
Solid understanding of data warehouse modeling (dimensional modeling, star/snowflake schemas) and ETL performance optimization
Familiarity with AWS/cloud platforms and experience with Docker, Kubernetes
Experience with open-source data platform development, familiar with at least one relational database (MySQL/PostgreSQL)
Strong cross-department collaboration skills to translate business requirements into technical solutions
Bachelor's degree or higher in Computer Science, Data Science, Statistics, or related fields
Comfortable working in a fast-moving fintech/tech startup environment
Bonus Points:
Experience with DolphinScheduler and SeaTunnel is a plus
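The dimensional-modeling skill this posting asks for (star/snowflake schemas) can be illustrated with a minimal star-schema sketch. This is an illustration only; the table and column names (`dim_symbol`, `fact_trade`) are hypothetical, not from the posting, and SQLite stands in for a real warehouse:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_symbol (
    symbol_key INTEGER PRIMARY KEY,
    ticker     TEXT,
    exchange   TEXT
);
CREATE TABLE fact_trade (
    trade_id   INTEGER PRIMARY KEY,
    symbol_key INTEGER REFERENCES dim_symbol(symbol_key),
    qty        INTEGER,
    price      REAL
);
""")
cur.executemany("INSERT INTO dim_symbol VALUES (?, ?, ?)",
                [(1, "AAPL", "NASDAQ"), (2, "TSLA", "NASDAQ")])
cur.executemany("INSERT INTO fact_trade VALUES (?, ?, ?, ?)",
                [(10, 1, 100, 190.0), (11, 1, 50, 191.0), (12, 2, 10, 250.0)])

# Typical star-schema rollup: notional traded per ticker.
rows = cur.execute("""
    SELECT d.ticker, SUM(f.qty * f.price) AS notional
    FROM fact_trade f
    JOIN dim_symbol d USING (symbol_key)
    GROUP BY d.ticker
    ORDER BY d.ticker
""").fetchall()
print(rows)  # [('AAPL', 28550.0), ('TSLA', 2500.0)]
```

The same fact/dimension split is what makes warehouse queries reusable across business teams: new rollups are just different aggregations over the same conformed dimension.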
Application Analyst
Data analyst job in Irving, TX
Application Analyst II
REMOTE
Experience with clinical third-party applications
Experience with cardiology, radiology, lab, PACS, Sleep Lab (Cadwell), and PFT (Breeze Suite)
Schedule: Monday-Friday
Senior GRC InfoSec Analyst
Data analyst job in Plano, TX
Direct Hire (No C2C or third-party submissions)
Schedule: 2 days onsite weekly
Interview Process: 1st round is onsite; 2nd round virtual
Responsible for driving the development, implementation, communication, and maintenance of technology policies, standards, and procedures that align with industry standards and regulatory requirements. Ensures technology processes adhere to regulatory requirements, effectively manage risks, and establish strong governance practices. Develops and implements controls, monitors compliance, and supports risk management activities.
Requirements:
Bachelor's Degree in Information Security, Computer Science, Information Technology, or a related field preferred.
Minimum of six (6) years' experience working in Cybersecurity GRC, policy development, risk management, or a similar field.
Experience with GRC tools (e.g., Archer, ServiceNow, OneTrust).
Proficiency in using data analysis and reporting tools (e.g., Excel, Power BI).
Relevant certifications such as CISM and/or CISA are highly desirable.
Other must haves:
Experience managing a policy governance function, such as leading policy updates, introducing new policies, and aligning with regulatory and best-practice requirements
Technical process expertise: a strong understanding of framework alignment; will work closely with the Product Owner
Regulatory Frameworks
Preferred:
Financial services or banking background
ServiceNow IRM (Integrated Risk Management) experience
Duties:
Lead the development and implementation of comprehensive cybersecurity and IT policies, standards, and guidelines.
Continuously evaluate and update cybersecurity and IT policies to ensure they remain current and effective.
Ensure policies comply with relevant laws, regulations, and industry standards (e.g., NIST, FFIEC, GLBA, NYDFS, SOX, PCI-DSS).
Collaborate with cross-functional teams, including IT, legal, compliance, and other departments, to ensure cybersecurity policies align with business objectives.
Translate complex information and documentation into clear, user-friendly concepts.
Provide specialized expertise and consultation to perform framework-oriented risk assessments, identify deficiencies, generate reports, and recommend prioritized, actionable solutions to mitigate risks and enhance overall security posture.
Stay informed about the latest cybersecurity threats, trends, and best practices. Maintain accurate and up-to-date records of policy reviews, risk assessments, training activities, and incident responses.
Benchmark organizational policies against industry standards and best practices.
Develop and implement governance frameworks for cybersecurity policy management.
Monitor key performance indicators, conduct gap analyses and risk assessments, and implement frameworks as needed. Test and monitor the effectiveness of controls.
Establish feedback loops and analyze metrics to continuously improve cybersecurity policies based on audit findings, incident reviews, and emerging threats.
Lead and support internal and external audits and assessments of cybersecurity policies and practices. Ensure identified audit and assessment findings are tracked to closure.
Maintain comprehensive documentation of all cybersecurity policies, procedures, and related activities. Communicate policy requirements and updates to all relevant stakeholders.
Identify opportunities for innovation and improvement in cybersecurity policy and practice. Propose suitable mitigation strategies and verify the effectiveness of remediation plans.
Senior Data Engineer (USC AND GC ONLY)
Data analyst job in Richardson, TX
Now Hiring: Senior Data Engineer (GCP / Big Data / ETL)
Duration: 6 Months (Possible Extension)
We're seeking an experienced Senior Data Engineer with deep expertise in Data Warehousing, ETL, Big Data, and modern GCP-based data pipelines. This role is ideal for someone who thrives in cross-functional environments and can architect, optimize, and scale enterprise-level data solutions on the cloud.
Must-Have Skills (Non-Negotiable)
9+ years in Data Engineering & Data Warehousing
9+ years hands-on ETL experience (Informatica, DataStage, etc.)
9+ years working with Teradata
3+ years hands-on GCP and BigQuery
Experience with Dataflow, Pub/Sub, Cloud Storage, and modern GCP data pipelines
Strong background in query optimization, data structures, metadata & workload management
Experience delivering microservices-based data solutions
Proficiency in Big Data & cloud architecture
3+ years with SQL & NoSQL
3+ years with Python or similar scripting languages
3+ years with Docker, Kubernetes, CI/CD for data pipelines
Expertise in deploying & scaling apps in containerized environments (K8s)
Strong communication, analytical thinking, and ability to collaborate across technical & non-technical teams
Familiarity with AGILE/SDLC methodologies
Key Responsibilities
Build, enhance, and optimize modern data pipelines on GCP
Implement scalable ETL frameworks, data structures, and workflow dependency management
Architect and tune BigQuery datasets, queries, and storage layers
Collaborate with cross-functional teams to define data requirements and support business objectives
Lead efforts in containerized deployments, CI/CD integrations, and performance optimization
Drive clarity in project goals, timelines, and deliverables during Agile planning sessions
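The "workflow dependency management" responsibility above amounts to ordering pipeline tasks by their upstream dependencies. A minimal sketch using Python's standard-library `graphlib` (the task names are hypothetical examples, not from the posting):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_merge": {"extract_orders", "extract_customers"},
    "load_bigquery": {"stage_merge"},
}

# graphlib resolves a valid execution order and raises CycleError on cycles,
# which is the same guarantee an orchestrator like Cloud Composer/Airflow
# enforces over a DAG of tasks.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallelism on top, but the correctness core is exactly this topological ordering.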
📩 Interested? Apply now or DM us to explore this opportunity! You can share resumes at ********************* or call us at *****************.
Azure Data Engineer Sr
Data analyst job in Irving, TX
Minimum 7 years of relevant work experience in data engineering, with at least 2 years in data modeling.
Strong technical foundation in Python and SQL, and experience with cloud platforms (Azure).
Deep understanding of data engineering fundamentals, including database architecture and design, Extract, transform and load (ETL) processes, data lakes, data warehousing, and both batch and streaming technologies.
Experience with data orchestration tools (e.g., Airflow), data processing frameworks (e.g., Spark, Databricks), and data visualization tools (e.g., Tableau, Power BI).
Proven ability to lead a team of engineers, fostering a collaborative and high-performing environment.
Data Engineer
Data analyst job in Irving, TX
W2 contract-to-hire role with monthly travel to the Dallas, Texas area
We are looking for a highly skilled and independent Data Engineer to support our analytics and data science teams, as well as external client data needs. This role involves writing and optimizing complex SQL queries, generating client-specific data extracts, and building scalable ETL pipelines using Azure Data Factory. The ideal candidate will have a strong foundation in data engineering, with a collaborative mindset and the ability to work across teams and systems.
Duties/Responsibilities:
Develop and optimize complex SQL queries to support internal analytics and external client data requests.
Generate custom data lists and extracts based on client specifications and business rules.
Design, build, and maintain efficient ETL pipelines using Azure Data Factory.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
Work with Salesforce data; familiarity with SOQL is preferred but not required.
Support Power BI reporting through basic data modeling and integration.
Assist in implementing MLOps practices for model deployment and monitoring.
Use Python for data manipulation, automation, and integration tasks.
Ensure data quality, consistency, and security across all workflows and systems.
Required Skills/Abilities/Attributes:
5+ years of experience in data engineering or a related field.
Strong proficiency in SQL, including query optimization and performance tuning.
Experience with Azure Data Factory, including Git repository integration and pipeline deployment.
Ability to translate client requirements into accurate and timely data outputs.
Working knowledge of Python for data-related tasks.
Strong problem-solving skills and ability to work independently.
Excellent communication and documentation skills.
Preferred Skills/Experience:
Previous knowledge of building pipelines for ML models.
Extensive experience creating/managing stored procedures and functions in MS SQL Server
2+ years of experience in cloud architecture (Azure, AWS, etc.)
Experience with code management systems (Azure DevOps)
2+ years of reporting design and management (Power BI preferred)
Ability to influence others through the articulation of ideas, concepts, benefits, etc.
Education and Experience:
Bachelor's degree in a computer science field or applicable business experience.
Minimum 3 years of experience in a Data Engineering role
Healthcare experience preferred.
Physical Requirements:
Prolonged periods sitting at a desk and working on a computer.
Ability to lift 20 lbs.
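The "client-specific data extracts" work described in this posting typically means applying a client's business rules as a parameterized query and emitting a flat file. A minimal sketch (the `members` schema, field names, and the TX/active rule are hypothetical, invented for illustration; SQLite stands in for the production database):

```python
import csv
import io
import sqlite3

# Hypothetical source table; a real schema would come from the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER, state TEXT, active INTEGER)")
conn.executemany("INSERT INTO members VALUES (?, ?, ?)",
                 [(1, "TX", 1), (2, "TX", 0), (3, "OK", 1)])

def client_extract(conn, state):
    """Build a CSV extract for one client's rule: active members in a state."""
    rows = conn.execute(
        # Parameterized query, never string concatenation of client input.
        "SELECT id, state FROM members WHERE state = ? AND active = 1",
        (state,),
    ).fetchall()
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "state"])
    writer.writerows(rows)
    return buf.getvalue()

extract = client_extract(conn, "TX")
print(extract)
```

In production the same pattern scales up: the business rules become a versioned query per client, and the CSV buffer becomes a drop to SFTP or blob storage via the ETL pipeline.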
Data Engineer
Data analyst job in Dallas, TX
Must be local to TX
Data Engineer - SQL, Python, and PySpark Expert (Onsite - Dallas, TX)
We are seeking a Data Engineer with strong proficiency in SQL, Python, and PySpark to support high-performance data pipelines and analytics initiatives. This role focuses on scalable data processing, transformation, and integration efforts that enable business insights, regulatory compliance, and operational efficiency.
Key Responsibilities
Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and PySpark for large-scale data environments
Implement scalable data processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments)
Partner with business stakeholders to understand and model mortgage lifecycle data (origination, underwriting, servicing, foreclosure, etc.)
Create and maintain data marts, views, and reusable data components to support downstream reporting and analytics
Ensure data quality, consistency, security, and lineage across all stages of data processing
Assist in data migration and modernization efforts to cloud-based data warehouses (e.g., Snowflake, Azure Synapse, GCP BigQuery)
Document data flows, logic, and transformation rules
Troubleshoot performance and quality issues in batch and real-time pipelines
Support compliance-related reporting (e.g., HMDA, CFPB)
Required Qualifications
6+ years of experience in data engineering or data development
Advanced expertise in SQL (joins, CTEs, optimization, partitioning, etc.)
Strong hands-on skills in Python for scripting, data wrangling, and automation
Proficient in PySpark for building distributed data pipelines and processing large volumes of structured/unstructured data
Experience working with mortgage banking data sets and domain knowledge is highly preferred
Strong understanding of data modeling (dimensional, normalized, star schema)
Experience with cloud-based platforms (e.g., Azure Databricks, AWS EMR, GCP Dataproc)
Familiarity with ETL tools, orchestration frameworks (e.g., Airflow, ADF, dbt)
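The advanced-SQL requirement above (CTEs, window functions) can be sketched with a servicing-style running total. This is an illustrative example only: the `payments` table and "paid to date" metric are hypothetical, and it assumes a SQLite build with window-function support (3.25+, standard in current Python distributions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (loan_id INTEGER, pay_month TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", [
    (1, "2024-01", 1200.0), (1, "2024-02", 1200.0),
    (2, "2024-01", 950.0),  (2, "2024-02", 950.0),
])

# CTE + window function: cumulative amount paid per loan over time,
# a common shape for servicing and retention analytics.
rows = conn.execute("""
    WITH ordered AS (
        SELECT loan_id, pay_month, amount
        FROM payments
    )
    SELECT loan_id, pay_month,
           SUM(amount) OVER (
               PARTITION BY loan_id ORDER BY pay_month
           ) AS paid_to_date
    FROM ordered
    ORDER BY loan_id, pay_month
""").fetchall()
print(rows)
```

The `PARTITION BY` clause is what keeps each loan's running total independent; dropping it would accumulate across the whole table.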
Data Engineer
Data analyst job in Dallas, TX
Junior Data Engineer
DESCRIPTION: BeaconFire is based in Central NJ and specializes in Software Development, Web Development, and Business Intelligence; we are looking for candidates who are good communicators and self-motivated. You will play a key role in building, maintaining, and operating integrations, reporting pipelines, and data transformation systems.
Qualifications:
Passion for data and a deep desire to learn.
Master's Degree in Computer Science/Information Technology, Data Analytics/Data Science, or a related discipline.
Intermediate Python; experience with data processing libraries (NumPy, Pandas, etc.) is a plus.
Experience with relational databases (SQL Server, Oracle, MySQL, etc.)
Strong written and verbal communication skills.
Ability to work both independently and as part of a team.
Responsibilities:
Collaborate with the analytics team to find reliable data solutions to meet the business needs.
Design and implement scalable ETL or ELT processes to support the business demand for data.
Perform data extraction, manipulation, and production from database tables.
Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
Build and incorporate automated unit tests, participate in integration testing efforts.
Work with teams to resolve operational & performance issues.
Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
Compensation: $65,000.00 to $80,000.00 /year
BeaconFire is an e-verified company. Work visa sponsorship is available.
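The "automated unit tests" responsibility in this posting is the kind of thing a junior engineer can demonstrate with the standard-library `unittest` module. A minimal sketch, where `normalize_emails` is a hypothetical transformation utility invented for illustration:

```python
import unittest

def normalize_emails(records):
    """Example data-transformation utility: trim and lowercase email fields."""
    return [{**r, "email": r["email"].strip().lower()} for r in records]

class NormalizeEmailsTest(unittest.TestCase):
    def test_strips_and_lowercases(self):
        out = normalize_emails([{"id": 1, "email": "  Ann@Example.COM "}])
        self.assertEqual(out[0]["email"], "ann@example.com")

    def test_preserves_other_fields(self):
        out = normalize_emails([{"id": 2, "email": "b@x.io"}])
        self.assertEqual(out[0]["id"], 2)

# Run the suite explicitly so the example works inside scripts and notebooks alike.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeEmailsTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a real pipeline these tests would run in CI before each deploy, which is what "incorporate automated unit tests" and "participate in integration testing efforts" mean in practice.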
GCP Data Engineer
Data analyst job in Fort Worth, TX
Job Title: GCP Data Engineer
Employment Type: W2/CTH
Client: Direct
We are seeking a highly skilled Data Engineer with strong expertise in Python, SQL, and Google Cloud Platform (GCP) services. The ideal candidate will have 6-8 years of hands-on experience in building and maintaining scalable data pipelines, working with APIs, and leveraging GCP tools such as BigQuery, Cloud Composer, and Dataflow.
Core Responsibilities:
• Design, build, and maintain scalable data pipelines to support analytics and business operations.
• Develop and optimize ETL processes for structured and unstructured data.
• Work with BigQuery, Cloud Composer, and other GCP services to manage data workflows.
• Collaborate with data analysts and business teams to ensure data availability and quality.
• Integrate data from multiple sources using APIs and custom scripts.
• Monitor and troubleshoot pipeline performance and reliability.
• Technical Skills:
o Strong proficiency in Python and SQL.
o Experience with data pipeline development and ETL frameworks.
• GCP Expertise:
o Hands-on experience with BigQuery, Cloud Composer, and Dataflow.
• Additional Requirements:
o Familiarity with workflow orchestration tools and cloud-based data architecture.
o Strong problem-solving and analytical skills.
o Excellent communication and collaboration abilities.
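A core pattern behind the pipeline work this posting describes (BigQuery loads fed by APIs and custom scripts) is incremental extraction against a watermark, so each run moves only new or changed rows. A minimal stdlib-only sketch; the in-memory `SOURCE` rows and field names are hypothetical stand-ins for an API response or source table:

```python
from datetime import datetime, timezone

# Hypothetical source rows; in practice this would be an API page or a query result.
SOURCE = [
    {"id": 1, "updated_at": "2024-05-01T00:00:00+00:00"},
    {"id": 2, "updated_at": "2024-05-02T00:00:00+00:00"},
    {"id": 3, "updated_at": "2024-05-03T00:00:00+00:00"},
]

def incremental_extract(rows, watermark):
    """Return rows changed strictly after the last successful load, plus the new watermark."""
    fresh = [r for r in rows
             if datetime.fromisoformat(r["updated_at"]) > watermark]
    new_watermark = max(
        (datetime.fromisoformat(r["updated_at"]) for r in fresh),
        default=watermark,  # no fresh rows: keep the old watermark
    )
    return fresh, new_watermark

wm = datetime(2024, 5, 1, tzinfo=timezone.utc)
fresh, wm = incremental_extract(SOURCE, wm)
print([r["id"] for r in fresh])  # ids newer than the watermark
```

Persisting the watermark between runs (e.g., in a control table) is what makes the load idempotent and restartable, which is most of what "monitor and troubleshoot pipeline reliability" comes down to.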