Data Analyst
Data Modeler Job In Nashville, TN
Sr Data Analyst
Hybrid: 4 days onsite and one day working from home each week.
Compensation: $85K-$105K salary + benefits
A $150 million national consumer business based in Nashville with 400 employees is hiring a talented Senior Data Analyst to join their team. Working within the Finance Team and reporting to the Director, the Data Analyst will be a strategic partner with operations, marketing, product, and other internal teams to provide data analytics and dashboards using Tableau to drive business decisions.
The Data Analyst will collaborate with the data engineering team to identify the necessary raw data (customer activity, demographics, location performance, product performance) for providing accurate, real-time insights to internal departments to improve services and revenue growth. This position will play an integral role in operational decisions by proactively analyzing and interpreting data to provide data-driven insights and recommendations.
Provide data-driven insights and analysis for operations, marketing, and products
Develop and maintain business dashboards, translating data into actionable visualizations
Manage reporting and analysis including KPIs, performance dashboards, and operation trackers
Review data to identify key trends in performance, utilization, and customer demographics/segmentation
Requirements
Bachelor's degree in Business
2+ years of experience in data mining, statistical analysis, and modeling
Must be located in the Nashville Metro Area
Experience building reports and dashboards using Microsoft Excel (Modeling) and Tableau
Intermediate SQL knowledge preferred
Advanced proficiency in Microsoft Office (including Excel modeling)
Experience analyzing large datasets to identify trends
Green Card or US Citizenship (required)
Must live in the Nashville area or be able to relocate within 3 weeks of offer
Lead Data Scientist
Remote Data Modeler Job
Apex is seeking a dynamic Lead Consultant with strong consultative skills to serve as a Lead Data Scientist. The role is hands-on and intensely focused on developing advanced machine learning and deep learning models for real-world initiatives. As a Data Scientist, you will be at the forefront of research and development, focusing on the design, implementation, and fine-tuning of ML/DL models.
Role / Title: Lead Data Scientist
Location: 100% remote
Duration: Full-time
Pay: Highly competitive & negotiable
Lead Consultant (Data Scientist with Machine Learning)
Responsibilities:
Leading the exploration and application of Machine Learning and Deep Learning models to drive business insights and support data-driven decision making across projects.
Demonstrate fluency in Python and the ability to utilize, optimize, and alter off-the-shelf libraries and code.
Implement and deploy ML models using TensorFlow, Keras, Caffe, MXNet, PyTorch and other frameworks with a focus on developing complex programs, custom algorithms, and optimized model performance.
Develop and deploy a variety of ML models, including but not limited to recommendation, anomaly detection, forecasting, and personalization predictive models, as well as other statistical methods.
Perform experiments and comparative analysis to evaluate the effectiveness of different model architectures and fine-tune hyperparameters for optimal performance (a brief illustrative sketch follows this list).
Design and implement approaches for fine-tuning and training models.
Lead feature engineering efforts, including data preprocessing, aggregation, and transformation for structured, unstructured, and semi-structured data to enhance model accuracy and performance.
Collaborate with cross-functional teams to identify business problems and design end-to-end data science solutions.
Mentor and guide junior data scientists, fostering best practices in model development, deployment, and optimization.
Promote data science and machine learning ethics, ensuring models are developed and deployed fairly, with bias mitigation and adherence to privacy regulations and ethical standards.
Manage and oversee projects, ensuring timely delivery and adherence to budget and quality standards.
Demonstrate strong technical knowledge and implementation skills. Stay current on relevant technology trends and practices. Build trust and respect among internal and external stakeholders and demonstrate collaborative teamwork.
Produce high quality deliverables, meet project deadlines, and take responsibility for engagement success. Demonstrate a passion for quality and process improvement.
Demonstrate professional level consulting skills and communication/presentation skills.
Continually innovate, seek creative solutions, and find new ways of adding value. Listen and seek to understand the client and meet their needs, providing consultative guidance.
Stay attuned to the future needs of the client and work with internal resources to identify opportunities.
Proactively provide solutions and approach adversity with a solution-focused mindset.
Perform assessments of clients' environments and design robust solutions and roadmaps.
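As a rough illustration of the experimentation and hyperparameter-tuning responsibilities above (not any specific Apex project or dataset), the following minimal scikit-learn sketch compares two candidate model families on synthetic data; the data and parameter grids are placeholders.

```python
# Minimal illustration of comparing model architectures and tuning
# hyperparameters with scikit-learn; data and parameter grids are
# synthetic placeholders, not tied to any real engagement.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "random_forest": (RandomForestClassifier(random_state=0),
                      {"n_estimators": [100, 300], "max_depth": [None, 10]}),
    "gradient_boosting": (GradientBoostingClassifier(random_state=0),
                          {"learning_rate": [0.05, 0.1], "n_estimators": [100, 200]}),
}

for name, (model, grid) in candidates.items():
    # Cross-validated grid search, scored on ROC AUC, then held-out evaluation.
    search = GridSearchCV(model, grid, cv=5, scoring="roc_auc", n_jobs=-1)
    search.fit(X_train, y_train)
    print(name, search.best_params_, round(search.score(X_test, y_test), 3))
```

The same pattern extends to deep learning frameworks by swapping in a wrapped TensorFlow or PyTorch estimator and a larger search space.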
Requirements:
Master's or PhD in machine learning, or a comparable industry career with significant modeling experience delivering products.
8+ years of experience writing and deploying production quality models in languages such as Python.
8+ years of statistical predictive analytics experience, including industry-specific predictive modeling.
5+ years of hands-on technical experience with extensive background in deep learning networks using TensorFlow, Keras, Caffe, or similar.
Experience with a variety of machine learning models, such as recommendation, anomaly detection, forecasting, and personalization predictive models, as well as other statistical methods.
Comprehensive knowledge and hands-on experience with fine-tuning approaches and training models.
Demonstrated ability to work in a client-facing capacity, understanding their needs and translating them into technical solutions
Outstanding communication skills, both written and verbal, to effectively collaborate with internal teams and present solutions to clients
Stay current with the latest developments in AI, machine learning, and data science, and actively contribute to the advancement of our technology stack and capabilities
Experience using Python scripts to drive data science workflows; experience using SQL and managing and merging disparate data sources; experience with statistical analysis and data mining algorithms.
In-depth experience in Natural Language Processing (NLP), with a particular emphasis on Large Language Models (LLMs) and Transformer architectures.
Familiarity with MLOps processes and best practices for deployment.
Desirables:
Having participated in Kaggle or other DL competitions
Relentlessly staying on top of state-of-the-art AI/ML papers
Having deployed a model with aggressive inference-speed requirements
Experience with Amazon SageMaker, Databricks, or similar tools to build, train, and deploy machine learning models.
Data Architect (031-24)
Data Modeler Job In Quantico, VA
We seek a Data Architect who will thrive in a challenging, rewarding, process-oriented environment. Candidate must have active TS/SCI Clearance.
As a contributor to the efforts of the Marine Corps Warfighting Lab's Wargaming Division, you will provide scientific and technical deliverables by performing Research, Development, Test, and Evaluation to identify future challenges and opportunities, develop warfighting concepts, and comprehensively explore options to inform capability development through the combat development process and meet the challenges of future operating environments. The Data Architect supports configuration management planning and describes provisions for configuration identification, change control, configuration status accounting, and configuration audits. They further support configuration planning by identifying and maintaining the original configuration of requirements documentation, design documentation, software, and related documentation. You will be responsible for configuration change control, supporting the change process and ensuring that only approved and validated changes are incorporated into product documents and related software. These efforts will enhance the ability of the Marine Corps Warfighting Lab Wargaming Division to plan, support, integrate, execute, evaluate, and report live force experiments.
Location:
This position is located onsite in Quantico, VA
Responsibilities:
Design a system's data models, data flow, interfaces, and infrastructure to meet the information requirements of a business or mission.
Develop and implement an overall organization data strategy that aligns with business processes and objectives.
Identify data sources, both internal and external, and work out a plan for data management that is aligned with the data strategy.
Monitor and optimize the data infrastructure.
Examine and identify database structural necessities by evaluating operations, applications, and programming.
Prepare database design and architecture reports.
Advise higher-level leadership on critical data management issues.
Consult with customers and key stakeholders to evaluate functional requirements for Artificial Intelligence and data applications.
Collaborate with appropriate personnel to address classified data requirements and configure technical solutions for data integrity and security.
Provide recommendations on new database technologies and architectures.
Evaluate data storage requirements, prepare and implement data backup, recovery, and secondary data storage and disaster recovery.
Plan and manage effective storage capacity strategies.
Confer with systems analysts, engineers, system owners, and other stakeholders to design applications for data management.
Analyze and plan for anticipated changes in data capacity requirements.
Required Qualifications:
Security Clearance:
US Citizenship required
TS/SCI clearance required
Education and Experience:
Bachelor of Science Degree in IT or data management
5-12 years of experience in IT (aligned to DoD Cybersecurity Workforce 653)
Requires DoD Cyber certification as an IAT level II (Any of the following: CCNA-Security, CySA+, GICSP, GSEC, Security+ CE, CND, SSCP, CASP+ CE, CCNP Security, CISA, CISSP (or Associate), GCED, GCIH, CCSP)
Excellent communication skills
Ability to work independently and as part of a team
Analytical and problem-solving skills to troubleshoot systems problems
Preferred Additional Skills:
Master's degree in IT
Certified Data Professional (CDP)
Certified Data Management Professional (CDMP)
Anglicotech, LLC is an established, rapidly growing, veteran-owned small business that provides Global Logistics and Supply Chain management, systems and analysis, Cybersecurity and NIST SP 800-171 compliance solutions, and Enterprise Information Technology Implementation and Services.
Anglicotech, LLC is an Equal Opportunity Employer committed to supporting and retaining a diverse and talented workforce. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against based on disability.
Anglicotech, LLC offers competitive compensation, benefits, and long-term career opportunities.
For more information or to apply, visit our website at ***************************
Business Data Analyst
Data Modeler Job In Richmond, VA
Hybrid in Richmond, VA
The ideal candidate has expertise in business data analysis, excels at communicating technical information to non-technical audiences, and is able to leverage tools like SQL, Power BI, and Tableau.
Essential Duties and Responsibilities:
Work effectively with business process owners and data owners to ensure data-related business objectives are understood and documented.
Gather, evaluate, and document requirements related to data for new systems or enhancements based on business user input, technical requirements, and constraints.
Manage multiple projects, timelines, and priorities.
Support individual team members in the data office as well as various business units by designing customizable tabular or visual reports with available tools.
Data Analysis
Develop innovative data strategies for meeting business requirements, including leveraging the capabilities of existing technology tools or acquiring/developing new technology tools.
Design data-focused automated and manual solutions to solve business problems and achieve business objectives.
Structure large data sets to find usable information.
Collaborate with various business and technology teams to collect and analyze data.
Create reports and visualizations for internal teams and stakeholders.
Create presentations and reports based on recommendations and findings.
Prepare reports, flowcharts, diagrams, detailed documentation, and other written materials.
Present findings through various written and verbal communication channels with key users and agency executives.
Data Office Support
Serve as subject matter expert on data analytics.
Act as a liaison between the business teams and IT teams for data-related initiatives.
Manage risk and provide timely status reports and project updates to stakeholders.
Provide cross-team support when needed to achieve technical goals, knowledge sharing, Agency Performance Outcomes (APOs), and operational measures (OMs).
Participate in software/hardware security reviews and implement best practices for the Data Office.
Establish KPIs in coordination with business units to measure the effectiveness of business decisions.
Perform special projects as assigned.
Other duties may be assigned.
Required Skills and Experience
Five years' experience working in data analysis, business analysis, data science, or data warehousing in pension benefits management or a related industry.
Experience working in an Agile development environment.
Ability to work independently and as part of a team that includes business and technical stakeholders.
Excellent problem-solving and analytical skills.
Strong communication and interpersonal skills.
Desired Skills
Three or more years' experience designing dashboards and reports using BI tools such as Power BI and/or Tableau.
Strong SQL skills for querying and analyzing data.
Knowledge of Excel, DAX, and Python.
Software development background.
Data Architect
Data Modeler Job In Arlington, VA
Candidates must be residents of DC, MD, or VA.
Daily Responsibilities:
This senior-level position is instrumental in designing, updating, and maintaining mission-critical systems.
Ability to thrive in a fast-paced, dynamic, and high-visibility IT system environment.
Translate business needs into long-term architecture solutions.
Design new architectures, re-design existing architectures, and develop the database code to create the data structures.
Develop data warehousing blueprints, evaluate hardware and software platforms, and integrate systems.
Review and develop object and data models and the metadata repository to structure the data for better management and quicker access.
Define, design, and build relational and dimensional databases and implement all database objects including schemas, tables, clusters, indexes, views, sequences, packages, and procedures based on system requirements.
Design, create, deploy, and manage how data is stored, consumed, integrated, processed, and used by different data entities and IT systems, both Cloud and on-premises applications as well as other applications using or processing the data in some way.
Create data flow diagrams and other architecture-related documentation and operational manuals.
Manage input of various data sources, define database architecture, develop database code, construct, and maintain ETL-based processes, design database reports.
Create code to construct, access, and modify databases using stored procedures, triggers, and functions and implement and optimize queries and stored procedures.
Expert-level data migration knowledge, including experience with data profiling and migration tasks.
Perform data access analysis, design, and archive/recovery design and implementation, especially in a Microsoft SQL Server environment.
Develop strategies for data acquisitions, archive recovery, and implementation of a database.
Provide guidance and support to all technical positions including system architects, developers, and database administrators.
The requirements as stated above reflect industry standards.
Required Education & Experience:
Education:
BS/BA degree in Computer Science, Information Sciences, or related IT discipline OR Allowable Substitution: Additional ten (10) years of related professional experience can be substituted for a BS/BA degree.
Experience:
10 years of professional experience in database architecture
12 years of experience related to the software development field.
At least 3 years of experience managing data integration between SaaS solutions and between SaaS and .NET applications.
The ability to thrive in a fast-paced and high-visibility IT environment is paramount.
Possesses basic Salesforce knowledge.
Experience translating business requirements into architectural solutions and illustrating/documenting the architecture.
Hands-on experience supporting the Oracle database in a cloud environment, preferably for the Salesforce platform on the Amazon Web Services (AWS) Cloud.
Expert-level knowledge of Microsoft SQL Server.
Hands-on integration experience using SQL Server Integration Services (SSIS) packages.
Expertise in designing, creating, deploying, and managing how data is stored, consumed, integrated, processed, and used by different data entities and IT systems, both Cloud and on-premises applications as well as other applications using or processing the data in some way.
Expert with data definition language to create, code, and implement relational and dimensional database objects, complex stored procedures, functions, triggers, and data access controls.
Expert in a data warehouse environment, which includes data design, database architecture, and metadata repository creation.
Experience reviewing, deciphering, testing, and troubleshooting database structures, complex stored procedures, functions, and triggers to structure the data for better management and quicker access.
Experience masking and replicating large data sets (for privacy and testing purposes) using enterprise tools.
Expert in backup integrity and disaster recovery situations.
Enforce change management processes in each environment
Experience with Copado for Salesforce DevOps preferred.
Experience with C#, rest API, and XML skills are a plus.
The minimum experience and education requirements for this position are due to the complexity, at the program level, of the tasks at hand.
Preferred Certifications:
Salesforce Certified Administrator
Salesforce Certified Service Cloud Consultant
Clearance: Secret or Ability to obtain Secret Clearance
Data Warehouse Architect
Data Modeler Job In Virginia
Responsibilities:
Working with our business partners to develop data mapping from one system to another is required; the candidate will be expected to manage data mapping from a legacy system to a new system.
The candidate must have solid attention to detail, be able to manage tedious tasks, and be proficient with Azure Data Factory pipelines and SQL Server stored procedures or SSIS packages.
There may also be some work required to create or support packages on local servers.
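For orientation only: the mapping work this role describes would normally live in an Azure Data Factory pipeline, an SSIS package, or SQL Server stored procedures. The short pandas sketch below simply illustrates the general legacy-to-target column-mapping idea, with entirely hypothetical column names and transformations.

```python
# Illustrative only: mapping legacy records to a new system's schema with pandas.
# Column names and transformations are hypothetical placeholders; in practice
# this logic would be implemented in ADF, SSIS, or stored procedures.
import pandas as pd

legacy = pd.DataFrame({
    "VEND_NO": ["000123", "000456"],
    "INV_AMT": ["1,250.00", "89.99"],
    "INV_DT": ["20240115", "20240203"],
})

column_map = {"VEND_NO": "vendor_id", "INV_AMT": "invoice_amount", "INV_DT": "invoice_date"}

target = (
    legacy.rename(columns=column_map)            # rename legacy columns to target names
    .assign(
        vendor_id=lambda df: df["vendor_id"].str.lstrip("0"),
        invoice_amount=lambda df: df["invoice_amount"].str.replace(",", "").astype(float),
        invoice_date=lambda df: pd.to_datetime(df["invoice_date"], format="%Y%m%d"),
    )
)
print(target.dtypes)
```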
Required/Desired Skills & Abilities:
Required, 3 Years - Candidate will need to be experienced with Azure Data Factory Pipelines or SSIS packages for SQL Server environments.
Required, 5 Years - Candidate will have experience with mapping data from one system to another.
Required - Candidate should have excellent communication skills (written and verbal).
Required, 3 Years - Candidate will have successful experience with estimating level of effort and managing deadlines.
Highly Desired, 2 Years - Candidate should have experience managing source code in a Git repository.
Highly Desired - Ideally, candidate would have experience working with the Central Square Finance Enterprise ERP.
Highly Desired - Ideally, candidate would have experience with the Workday tools and implementation methodology.
Required - Candidate will need to be self-motivated, independent, and willing to work flexible hours.
Required - Candidate will be expected to successfully complete training on HIPAA and PII.
Data Warehouse Architect 2
Data Modeler Job In Richmond, VA
Local Richmond, VA candidates required for onsite work
Required Skills:
3 Years - Candidate will need to be experienced with Azure Data Factory Pipelines or SSIS packages for SQL Server environments
5 Years - Candidate will have experience with mapping data from one system to another.
Candidate should have excellent communication skills (written and verbal)
3 Years - Candidate will have successful experience with estimating level of effort and managing deadlines.
Candidate will need to be self-motivated, independent, and willing to work flexible hours.
Candidate is required to have a dependable high-speed internet connection because some work will be performed from home.
Candidate will be expected to pass CJIS background checks.
Candidate will be expected to successfully complete training on HIPAA and PII
Highly desired Skills:
2 Years - Candidate should have experience managing source code in a GIT repository.
Ideally, candidate would have experience working with the Central Square Finance Enterprise ERP
Ideally, candidate would have experience with the Workday tools and implementation methodology.
Data Analyst Senior
Data Modeler Job In McLean, VA
Immediate need for a talented Data Analyst Senior. This is a 9+ month contract opportunity with long-term potential, located in McLean, VA (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-55385
Pay Range: $45 - $50/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Using SQL to deliver views to support business decisions.
Developing end-to-end data lineage for data elements used in Financial Reporting processes.
Gathering business requirements and translating them into user stories.
Meeting business objectives.
Ability to validate data to ensure completeness and accuracy.
Key Requirements and Technology Experience:
Skills: data analysis, SQL/relational databases, ETL data mapping.
5 or more years of experience providing application support for Financial Reporting processes, along with complementary experience collaborating with a team.
Experience developing end-to-end data lineage for critical data elements used in Financial Reporting processes.
Experience in collecting requirements, crafting and developing integrations, configuring Commercial-Off-The-Shelf (COTS) software products.
Ability to quickly learn new software applications and effectively collaborate with stakeholders to influence outcomes.
Bachelor's degree in finance, Business, Computer Science, Information Systems, a related field or equivalent experience.
Strong problem-solving and analytical skills while maintaining ability to understand different perspectives.
Expertise in querying, analyzing, reconciling, and testing complex relational and non-relational data.
Knowledge of accounting, subledger and financial reporting concepts.
Strong consultation and communication skills.
Our client is a leader in the banking industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Architect
Remote Data Modeler Job
Azure Data Architect - Remote Contract
We have an exciting opportunity for an Azure Data Architect to join a fast-growing Food and Beverage company remotely from anywhere in the US on a long-term contract. This role will be pivotal in driving their migration to Azure, leveraging your expertise in Azure Data Lake, SQL, Databricks, and PowerShell Scripting.
Key Responsibilities:
Design and implement data solutions on the Azure platform.
Lead the migration of data infrastructure to Azure, ensuring a smooth transition and minimal disruption.
Develop and optimize Azure Data Lake, SQL databases, and Databricks environments.
Collaborate with cross-functional teams to gather requirements and implement solutions that meet business needs.
Utilize PowerShell scripting to automate tasks and streamline processes.
The ideal candidate will have:
Extensive experience as an Azure Data Architect, with a strong background in Azure Data Lake, SQL, Databricks, and PowerShell scripting.
Deep understanding of data architecture principles and best practices.
Excellent communication skills and the ability to collaborate effectively with stakeholders at all levels.
Self-motivated, with a passion for staying updated on the latest technologies and trends in data management.
For the suitable candidate, our client can offer:
Opportunity to work remotely from anywhere in the US
Competitive hourly rate with flexibility.
Chance to make a significant impact on the data infrastructure of a leading company in the food and beverage industry.
A collaborative and dynamic work environment with a focus on innovation and continuous improvement.
To apply please send your CV to this job posting!
Data Analyst
Data Modeler Job In Herndon, VA
Required Skills:
Ideal candidate should have a degree in a quantitative field (e.g., mathematics, computer science, physics, economics, engineering, statistics, operations research, quantitative social science, etc.).
Basic Knowledge on software development principles and architecture.
Good analytical and problem-solving abilities.
Ability to break down and understand complex business problems, define a solution and implement it using advanced quantitative methods.
Familiarity with programming for data analysis; ideally Python, SQL, or R.
Solid oral and written communication skills, especially around analytical concepts and methods.
Great work ethic and intellectual curiosity.
Knowledge of Cloud technologies such as AWS or Google Cloud.
Knowledge of any relational database, such as MySQL.
Must be a team player with excellent communication and problem-solving skills and have experience working with customers across teams.
Senior Data Engineer
Remote Data Modeler Job
About the Role:
We are seeking a skilled AWS Data Engineer to join our team and help shape the future of data management and analytics for our organization. In this role, you will work with a variety of AWS services to design, build, and optimize data pipelines, enabling data-driven decision-making at scale.
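As a loose sketch of the serverless pipeline work implied by the requirements below (not this team's actual codebase), here is a minimal AWS Lambda handler that copies rows from a CSV file landing in S3 into a DynamoDB table; the bucket, table, and field names are hypothetical.

```python
# Sketch of a serverless ingest step: a Lambda handler that reads a CSV object
# from S3 and writes rows to DynamoDB. Names are hypothetical; error handling
# and batching for large files are omitted for brevity.
import csv
import io

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("orders")  # hypothetical table name


def handler(event, context):
    # Assumes the event is an S3 put notification for a CSV file.
    record = event["Records"][0]["s3"]
    obj = s3.get_object(Bucket=record["bucket"]["name"], Key=record["object"]["key"])
    body = obj["Body"].read().decode("utf-8")
    rows = csv.DictReader(io.StringIO(body))
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item={"order_id": row["order_id"], "amount": row["amount"]})
    return {"status": "ok"}
```

In a production pipeline a step like this would typically be orchestrated with Step Functions or Glue rather than run as a standalone function.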
Requirements:
4+ years of experience in data engineering or related fields.
Strong expertise in SQL and experience with Redshift stored procedures.
Proficiency with AWS services, including S3, Lambda, DynamoDB, Step Functions, RDS, SNS, and SQS.
Experience with data engineering tools such as EMR, Glue, and Redshift.
Strong knowledge of databases, including PostgreSQL and Aurora.
Experience with data modeling techniques, including Star/Snowflake Schema Design.
Proficiency in Python and PySpark for data manipulation and processing.
Knowledge of serverless architectures and experience with AWS Lambda and Step Functions.
Strong SQL and PL/SQL skills for querying and managing data.
Familiarity with data warehousing concepts, including Datamarts and Multi-Dimensional OLAP.
Familiarity with data processing frameworks such as Apache Spark or Hadoop.
Knowledge of programming languages such as Python or Java for data manipulation and automation.
Strong understanding of data governance and best practices for data quality.
Experience working in Agile development environments.
Excellent problem-solving skills and attention to detail.
Strong verbal and written communication skills, with the ability to present information clearly to stakeholders.
Nice to Have:
Knowledge of SAS, familiarity with DevOps tools (e.g., Jenkins, Bitbucket, GitLab, Terraform), and experience with test automation.
Other Details:
Location: Reston, VA. This is a remote job, but local candidates are preferred.
Length: 2+ years, long term.
Client: Largest fintech
Open to W2 full-time with benefits or C2C.
The difference between something good and something great is attention to detail - AVM Consulting
If you think you fit the role, go ahead and book an interview slot using the following link.
Data Quality Analyst - Mid (Secret Clearance Required)
Data Modeler Job In Sterling, VA
Paradyme Management is a rapidly growing government technology leader that puts service first, for its customers, its team and the communities it supports. Paradyme harnesses DevSecOps and Agile development processes to deliver exceptional results for digital transformations. Based in Tysons Corner, VA, Paradyme's award-winning culture sets it apart through its team's deep commitment to service and collaboration with its customers, each other and the community. Learn more at ***************************
We are seeking a Data Quality Analyst in support of a Federal Agency customer. This position requires an ACTIVE SECRET security clearance and the ability to work onsite in the DC metro area. With customer approval, a hybrid schedule will be available.
Description:
Organizes data by relevant categories so that it may be used and protected more efficiently. Develops systems to make data easier to locate and retrieve. Tags data to make it easily searchable and trackable. Ensures compliance and data security.
Requirements:
Active Secret security clearance
Must have strong SQL experience with advanced queries
Experience with ETL and Data Pipelines
Experience with Databricks in AWS
Experience with Data Warehouse / Data Lakes a plus
Physical Requirements: These are the essential physical requirements needed to successfully perform the job.
Sedentary work.
Requires sitting up to 8 hours per day.
May require lifting up to 5 pounds unassisted.
Fine repetitive motor skills with hands, wrists, and fingers in coordination with eyes.
Hearing, speaking, and vision: Adequate to perform job duties and communicate in person, via video, and telephone. Includes reading information from printed sources and computer screens.
Other: Work may be performed in an office environment, which may involve frequent contact with staff and the public. Work may be stressful at times.
Paradyme Management, Inc. is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Paradyme will take the steps to ensure that people with disabilities are provided reasonable accommodations. Accordingly, if a reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please contact Rose Luczak, Director of People Operations at *********************** or at **************
Paradyme is a federal contractor and an EEO and an Affirmative Action Employer. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, pregnancy-related disability, physical or mental disability, genetic information, sexual orientation, marital status, familial status, personal appearance, occupation, citizenship, veteran or military status, gender identity or expression, or any other characteristic protected by federal, state or local law.
Data & Polling Scientist (Junior or Senior) | Political Data Firm
Remote Data Modeler Job
Job Title: Data & Polling Scientist
Location: Washington, D.C. or Remote*
Organization: Political Data Firm
Overview: A leading political data firm is seeking a Data & Polling Scientist to develop advanced models predicting public opinion and political behaviors. Utilize your data analysis expertise to refine voter outreach strategies, design insightful surveys, and enhance analytics products. This position demands a robust understanding of political dynamics and a commitment to implementing data-driven solutions for real-world political campaigns.
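To give a flavor of the modeling work described here (purely illustrative, using synthetic data rather than any real voter file or client methodology), a minimal turnout-propensity model in Python might look like this:

```python
# Toy illustration of a turnout-propensity model; the voter-file fields and
# data are entirely synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
voters = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "past_votes": rng.integers(0, 5, n),   # elections voted in over recent cycles
    "rural": rng.integers(0, 2, n),
})
# Synthetic ground truth: turnout likelihood rises with age and vote history.
p = 1 / (1 + np.exp(-(-3 + 0.03 * voters["age"] + 0.6 * voters["past_votes"])))
voters["voted"] = rng.binomial(1, p)

X_train, X_test, y_train, y_test = train_test_split(
    voters[["age", "past_votes", "rural"]], voters["voted"], random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```

The predicted probabilities from a model like this are what feed target-universe refinement and outreach prioritization.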
Job Duties:
Develop models to predict public opinion and political behaviors.
Refine target universes for strategic voter outreach.
Design and analyze politically-informed surveys and experiments.
Conduct quantitative analysis on electoral and media usage data.
Create data visualizations to facilitate communication with clients and team members.
Enhance analytics products tailored to specific campaign needs.
Advise clients on data-driven campaign strategies, optimizing their impact.
Requirements:
Proficiency in R, SQL, and Python for data analysis.
Familiarity with dashboard tools like Tableau or PowerBI for data presentation.
Understanding of large language models (LLMs); able to apply them in a political data context.
Strong interest in political data and conservative campaign strategies.
Capable of independent problem-solving and efficient task management.
Keen attention to detail with an ability to follow precise instructions.
Knowledge of the Republican ecosystem is highly preferred.
Salary: $60,000 (Junior) - $100,000+ (Senior) based on experience, plus benefits.
*Note: While the position allows for remote work, especially for senior talent, priority will be given to candidates who are able to work out of the Washington, D.C. office.
Senior Data Science Analyst (Onsite in Richmond, VA 23219)
Data Modeler Job In Richmond, VA
Job Title: Senior Data Science Analyst
Job Duration: 12 Months
Only W2 Accepted
Candidates will go through the Export Control Clearance process; preference is given to US Permanent Residents (Green Card holders) or US Citizens for submission.
Alternate weeks in the Richmond, VA office; other weeks remote (5 days in office, 5 days remote, repeating). Local candidates strongly preferred; no 100% remote.
Overtime may be required based on project needs.
Qualifications
Education: Bachelor's or higher preferred
Discipline: Computer Science, Information Systems, Mathematics
Skills and Experience
At least 5 years of experience in data science using R, Python, etc., on the Hadoop platform
Strong skills in statistical application prototyping with expert knowledge in R and/or Python development
Design machine learning projects to address business problems determined by consultation with business partners.
Work on a variety of datasets, including both structured and unstructured data.
Deep knowledge of machine learning, data mining, statistical predictive modeling, and extensive experience applying these methods to real world problems
Extensive experience in Predictive Modeling and Machine Learning: Classification, Regression & Clustering
Understanding of and experience working on Big Data ecosystems is preferred: Hadoop, HDFS, Hive, Sqoop, Spark (PySpark, SparkR, Spark SQL), and Jupyter and Zeppelin notebooks; a minimal PySpark sketch follows this list.
Understanding of and/or experience with data engineering is a plus
Experience with cloud technologies (AWS, Azure, GCP, Snowflake) is a big plus
Create interpretable visualizations that tell a story and paint a vision
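As referenced in the skills list above, here is a minimal, hypothetical PySpark sketch of the kind of Hadoop-platform work described; the Hive table and column names are placeholders rather than anything from the client's environment.

```python
# Minimal PySpark sketch: query a Hive table and summarize it with Spark SQL.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("claims-summary")
    .enableHiveSupport()
    .getOrCreate()
)

claims = spark.table("analytics.claims")  # hypothetical Hive table
summary = (
    claims.groupBy("region")
    .agg(F.count("*").alias("n_claims"), F.avg("paid_amount").alias("avg_paid"))
    .orderBy(F.desc("n_claims"))
)
summary.show(10)
```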
Equipment Data Coordinator
Remote Data Modeler Job
Job Title: Equipment Data Coordinator
Helpful Tip: If you include a cover letter on why you want to learn more about manufacturing machinery & equipment, that will be very helpful to your application.
Company Overview: Founded in 1924, Surplus Record is the premier online marketplace for buying and selling used and surplus industrial equipment. With a trusted reputation spanning over 100 years, we connect buyers and sellers in industries ranging from machinery manufacturing to power generation. Our online platform and printed magazine specialize in used machinery and electrical equipment such as motors, transformers, and circuit breakers, as well as air compressors, saws, grinders, hydraulic presses, and more.
Job Description: Surplus Record is seeking an Equipment Data Coordinator to join our team. This position is essential to maintaining the quality and accuracy of equipment listings on our platform. The Equipment Data Coordinator plays a key role in reviewing, refining, and categorizing machinery listings from customers to ensure they are properly displayed on our website. The role involves analyzing data, reworking listing titles for clarity and detail, mapping listings to appropriate equipment categories, and importing them into our database.
This is not a basic data entry role; it requires critical thinking, attention to detail, and the ability to learn and identify various machinery types. Proficiency in Excel and a willingness to understand the nuances of industrial equipment are essential. Training will be provided to help you develop the expertise needed to accurately identify and categorize machinery based on customer descriptions and photos.
Key Responsibilities:
Review and clean data submitted by equipment dealers to ensure accurate and categorized listings.
Utilize online searches and AI tools to enhance database quality.
Collaborate with the sales team to develop new categories and SEO terms for existing and new equipment categories based on data you work with.
Manage and monitor automation processes for adding equipment to the database, ensuring correct formatting, and accurate photos and descriptions.
Map listings to appropriate equipment categories in our database by analyzing data and photos.
Work independently and creatively to build continuous improvement of marketing operations with the Advertising team.
Use Excel for mass imports and changes, utilizing Excel formulas to clean data in bulk.
Qualifications:
Bachelor's degree in Communications, Technical Writing, Marketing, Advertising, Business, or related fields.
2+ years of experience in operations, data analysis, or marketing preferred.
Proficiency with Excel (e.g., the VLOOKUP formula)
Knowledge of industrial, manufacturing, or equipment auctions is a plus.
Strong organizational skills and the ability to manage multiple tasks effectively.
Eagerness to learn and grow with one of the most well-known and established industrial marketing & advertising firms in the world.
Salary: $45K-$50K, paid hourly, plus a yearly bonus of up to 10% of wages earned.
If you have a keen eye for detail, enjoy working with data, and are excited to learn about the manufacturing world, we'd love to hear from you! NOTE: This job is currently on-site in our Chicago office. The company offers up to 12 days a year to work remotely after 1 year, and we also offer flex hours where you can choose the hours you want to be in the office.
Senior Data Scientist (Commercial Analytics)
Remote Data Modeler Job
One of the largest environmental services companies is seeking a highly skilled and experienced Senior Data Scientist to join their dynamic team. In this remote role, you will be pivotal in supporting the development and enhancement of commercial analytics initiatives that directly impact their pricing strategies, customer retention, and revenue optimization. You will work with cutting-edge data science techniques to drive the development of advanced pricing algorithms, customer segmentation models, churn prediction models, lifetime value models, and more.
As a Senior Data Scientist, you will be expected to apply your deep expertise in commercial analytics to extract meaningful insights from complex datasets, optimize business processes, and help inform strategic decisions that drive both operational efficiency and revenue growth.
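For illustration only (this is not the client's actual methodology), the kind of customer lifetime value estimate this role would formalize can be sketched in a few lines of Python using the standard geometric-retention formula and synthetic segment figures:

```python
# Back-of-the-envelope lifetime value illustration on synthetic segment data.
# Segment names, margins, and retention rates are hypothetical placeholders.
import pandas as pd

segments = pd.DataFrame({
    "segment": ["small_commercial", "residential", "industrial"],
    "annual_margin": [1800.0, 420.0, 9500.0],   # hypothetical $/customer/year
    "retention_rate": [0.88, 0.80, 0.93],
})
discount_rate = 0.10

# Classic geometric-series CLV: margin * r / (1 + d - r)
segments["lifetime_value"] = (
    segments["annual_margin"]
    * segments["retention_rate"]
    / (1 + discount_rate - segments["retention_rate"])
)
print(segments)
```

In practice the retention rate itself would come from a churn model rather than a fixed assumption, which is where the segmentation and churn-prediction work below connects to pricing.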
Key Responsibilities:
Lead the development and implementation of advanced pricing algorithms that maximize revenue and ensure competitive positioning in the market.
Build and refine customer segmentation models that drive targeted marketing and customer retention strategies.
Develop and enhance churn prediction models to identify at-risk customers and inform proactive retention efforts.
Create lifetime value models to help optimize long-term customer relationships and guide pricing decisions.
Work closely with stakeholders in sales, marketing, and operations to understand business needs and provide actionable, data-driven insights that align with commercial goals.
Conduct deep-dive commercial analytics, identifying opportunities for revenue growth, cost reduction, and process optimization.
Leverage statistical, econometric, and machine learning techniques to analyze large, complex datasets and deliver insights that inform business strategies.
Effectively communicate complex data-driven findings and recommendations to both technical and non-technical stakeholders.
Qualifications:
A Master's degree or higher in Statistics, Econometrics, Data Science, or a related field.
At least 7 years of experience in commercial analytics, revenue modeling, customer segmentation, and advanced data science techniques.
Strong proficiency in data analysis tools and programming languages such as Python and SQL.
Expertise in machine learning, predictive modeling, and statistical techniques used in commercial analytics and pricing optimization.
Proven ability to apply data science techniques to drive business outcomes in a commercial setting.
Experience with customer lifetime value modeling, churn prediction, and revenue forecasting.
Ability to work with large datasets, perform data wrangling, and generate meaningful insights from complex data.
Strong problem-solving abilities and the capability to work autonomously in a remote environment.
Excellent communication skills, with the ability to translate complex technical concepts into actionable insights for non-technical business leaders.
This is a fully remote contract position that is approved until the end of the year with high probability to extend. If this sounds like this role could be a fit please apply!
Data Scientist, AI Engineer
Data Modeler Job In Norfolk, VA
Artificial Intelligence Integrator (Focus Wargaming)
• Working Location: Norfolk, VA, USA
• Security Clearance: National Secret clearance or higher
• Language: High proficiency level in English language
BACKGROUND:
Data Exploitation and Artificial Intelligence (AI) are essential elements of NATO's digital transformation, enabling faster, data-driven decision-making and operational efficiency. These technologies are crucial for building a more adaptive and responsive NATO, prepared to meet the challenges of Multi-Domain Operations (MDO) and a rapidly evolving security landscape. As NATO's command dedicated to future warfare, Allied Command Transformation (ACT) leads efforts to explore, develop, and integrate latest technologies into military capabilities to transform the Alliance.
Wargaming has been recognized by NATO as a critical enabler for future warfare development. It is essential for planning and decision-making, simulating complex military scenarios to provide strategic, operational, and tactical insights. ACT is delivering wargaming capabilities for NATO, employing wargaming to understand military challenges and explore new technologies and strategies. ACT aims to refine NATO's use of wargaming by integrating AI, thereby enhancing its capabilities and accelerating AI adoption.
The AI Integrator will join the Data Science and AI Team, ACT's cross-functional hub for data science, data exploitation, and Artificial Intelligence. This team facilitates collaboration and provides access to resources and expertise, ensuring efficient use of technologies. The AI Integrator will play a key role in supporting the seamless integration of AI into NATO's wargaming efforts and broader digital transformation initiatives.
EXPERIENCE AND EDUCATION:
Essential Qualifications/Experience:
• University degree in Data Science, Machine Learning (ML), Artificial Intelligence (AI), Computer Science, or a related field OR four years minimum professional experience in Data Science, ML, AI within the last 5 years
• Minimum of 4 years in the last 5 years working in data science, machine learning, or AI engineering in a professional environment (not including studies)
• Demonstrated experience (minimum of 3 years in the last 5 years) in integrating AI technologies into practical applications
• Proficiency in AI and machine learning frameworks and tools such as TensorFlow, PyTorch, scikit-learn
• Experience with generative AI models, in particular Large Language Models (LLMs)
• Demonstrated experience (minimum of 3 years in the last 8 years) in data collection, analysis, and visualization using tools such as Python, R, Tableau, or Power BI
• Understanding of wargaming principles and methodologies
• Ability to communicate complex concepts effectively to non-technical stakeholders
• Proven ability to work collaboratively in cross-functional and interdisciplinary teams
• Experience in managing AI-related projects, including planning, execution, and (analytical) reporting
• Experience producing multimedia content, such as short videos for educational or preparatory purposes, with a willingness to explore AI technologies for content generation
• Knowledge of data security principles and best practices
DUTIES/ROLE:
• Contribute to the integration of AI in NATO's warfare development efforts, particularly wargaming
• Collaborate with wargaming experts to identify key areas for AI enhancement
• Support the development of a roadmap for AI integration into NATO wargaming
• Implement data collection and analysis methods for wargaming datasets
• Develop real-time analytics and visualizations to support decision making during wargaming simulations
• Support human-machine teaming by integrating AI tools for enhanced collaboration
• Design and conduct experiments to test AI applications in wargaming scenarios
• Document and present findings from AI integration experiments
• Ensure data integrity and security in all AI-related wargaming activities, considering, for example, NATO's Principles of Responsible Use
• Provide technical guidance to wargaming experts on AI technologies
• Develop analytical reports summarizing AI-enhanced wargaming exercises
• Explore AI applications to automate administrative tasks
• Participate in workshops, conferences, and meetings to stay updated on AI technologies
• Support training development for NATO personnel on AI in wargaming
• Develop multimedia content, such as short videos for wargaming scenarios or preparatory materials and explore the use of AI technologies for video generation
• Willingness to travel up to 30 days per year for meetings, conferences and exercises
• Perform additional tasks as required by the COTR related to Data Science & AI integration
Data Scientist
Data Modeler Job In Tysons Corner, VA
Welcome to the MOMENTUM Family!
MOMENTUM is not just our company name; it is the highest value we deliver to our customers. We are a rapidly growing technology solutions company delivering innovative technology, engineering, and intelligence solutions across the DoD sector. The efforts of our high-capacity team ultimately strengthen our Nation and the warfighter.
Our team is dispersed throughout the US, which means we value the diversity and unique collaboration that's fostered throughout our team. We work incredibly hard for our customers and believe deeply in our core values. We're a high-energy, high-growth team and we love to win.
Data Scientist
The Data Scientist will work directly with data scientists, software engineers, and subject matter experts to define new analytics capabilities that provide our federal customers with the information they need to make sound decisions and enable their digital transformation.
Typical duties include developing and deploying machine learning algorithms for enterprise applications, assisting federal customers in building applications, contributing to new feature implementation, and managing multiple projects in an agile, quality-focused setting.
In this role, you will:
Research, design, implement, and deploy Machine Learning algorithms for enterprise applications.
Assist and enable federal customers to build their own applications.
Contribute to the design and implementation of new features.
Bring a real passion for developing team-oriented solutions to complex engineering problems.
Thrive in an autonomous, empowering, and exciting environment.
Use great verbal and written communication skills to collaborate multi-functionally and improve scalability.
Commit to a fun, friendly, expansive, and intellectually stimulating environment.
Apply hands-on experience deploying and operating applications using IaaS and PaaS on major cloud providers, such as Amazon AWS, Microsoft Azure, or Google Cloud Services.
Apply experience with deep learning, natural language processing, computer vision, or reinforcement learning.
Convey highly technical concepts and information in written form to technical and non-technical audiences.
Work on multiple concurrent projects with strong self-motivation and minimal supervision.
Be a team-oriented individual: energetic, result- and delivery-oriented, with a keen interest in quality and the ability to meet deadlines.
Work in an agile environment.
If you're suitable for this role, you have:
Secret Clearance or above.
MS or PhD in Computer Science, Electrical Engineering, Statistics, or equivalent fields.
Minimum 2 years relevant work experience.
Excellent programming skills in Python.
Applied Machine Learning experience (regression and classification, supervised, and unsupervised learning).
Strong mathematical background (linear algebra, calculus, probability, and statistics).
Experience with scalable ML (MapReduce, streaming).
Ability to drive a project and work both independently and in a team.
Smart, motivated, can-do attitude, and seeks to make a difference.
Excellent verbal and written communication.
To learn more about us, check out our website at ********************
MOMENTUM is an EEO/M/F/Veteran/Disabled Employer:
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The qualifications listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform essential functions.
Accommodations:
Consistent with the Americans with Disabilities Act (ADA) and Alabama civil rights law, it is the policy of Momentum to provide reasonable accommodation when requested by a qualified applicant or employee with a disability, unless such accommodation would cause an undue hardship. The policy regarding requests for reasonable accommodation applies to all aspects of employment, including the application process. If reasonable accommodation is needed, please include request when applying.
Data Scientist - Washington DC (NoVa) TS/SCI Cleared
Data Modeler Job In Alexandria, VA
About Us
We are a leading analytics company leveraging AI to empower enterprises and government agencies with smarter, faster decision-making capabilities. Our innovative platform turns intricate data into practical insights with intuitive AI tools and advanced visualization capabilities, empowering users to make informed decisions seamlessly. Trusted by top defense agencies and Fortune 500 companies, we're scaling fast and looking for a Data Scientist (TS/SCI Cleared) to join our dynamic team.
What You'll Do:
Design, implement, and deliver AI-powered solutions tailored to customer needs using our advanced analytics tools.
Work closely with clients to guide AI solutions from initial concept through full-scale deployment.
Leverage big data tools and platforms, such as Databricks, to build scalable and efficient AI-driven solutions.
What You'll Bring:
Active TS/SCI security clearance and ability to work from an SCIF.
Based in Washington DC/Northern Virginia, with travel to client sites.
Degree in Computer Science or related field.
3+ years of Python coding (pandas, numpy, sklearn, TensorFlow, PyTorch, etc.).
Experience deploying machine learning models in production.
Proficiency with SQL/NoSQL databases, version control (e.g., Git), Docker, and Kubernetes.
Strong ownership, accountability, and communication skills.
Bonus Points For:
Experience leading projects in SCIF environments.
Cyber Analytics, PCAP, or network monitoring expertise.
Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery.
Join us and be part of a mission-driven team at the forefront of AI innovation!
Looker / Data Engineer
Data Modeler Job In Norfolk, VA
At Life Protect 24/7, we're passionate about transforming data into actionable insights. We leverage cutting-edge technology to drive decision-making and deliver exceptional value to our clients. As we continue to grow, we're seeking a talented Looker/Data Engineer to join our team and play a pivotal role in our data engineering initiatives.
Job Description
We're looking for an experienced Looker/Data Engineer who excels in SQL and LookML to support our data-driven objectives. In this role, you'll be responsible for writing and optimizing SQL queries in BigQuery, developing and maintaining LookML models in Looker, and building insightful dashboards and looks. You will also support ETL processes and utilize Google Cloud Platform APIs to programmatically achieve data engineering tasks.
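To illustrate the programmatic side of this role (a sketch under assumed names, not Life Protect 24/7's actual project or tables), the official google-cloud-bigquery Python client can run and page through a query like this:

```python
# Sketch of programmatic BigQuery access using the google-cloud-bigquery client.
# The project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="lp247-analytics")  # hypothetical project

sql = """
    SELECT DATE(created_at) AS signup_date, COUNT(*) AS signups
    FROM `lp247-analytics.marts.subscribers`
    GROUP BY signup_date
    ORDER BY signup_date DESC
    LIMIT 14
"""
for row in client.query(sql).result():
    print(row.signup_date, row.signups)
```

In practice, query logic like this would typically be modeled once in LookML and reused across Looker dashboards, with the Python client reserved for automation and ETL support tasks.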
Benefits and Compensation
Competitive starting salary
401K with Company Match
Paid Time Off
Medical, Dental, Vision, AD&D, and Life Insurance
HSA Options
Fully stocked kitchen with snacks and coffee
On site company Cafe Bistro
Convenient access to walking trails and Norfolk Premium Outlets
Key Responsibilities
SQL Development: Write and optimize complex SQL queries in BigQuery to extract, transform, and load data efficiently
Looker Development: Design, build, and maintain LookML models in Looker to support data visualization and reporting needs
Dashboard and Look Creation: Develop and maintain interactive dashboards and reports in Looker to provide actionable insights for various stakeholders
ETL Support: Support and manage ETL processes to ensure timely and accurate data integration from multiple sources
Google Cloud Platform APIs: Utilize GCP APIs to programmatically automate and streamline data engineering tasks
Data Quality Assurance: Monitor and ensure data accuracy and integrity across all reports and dashboards
Collaboration: Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs
Insight Derivation: Derive insights through meticulous analysis
Complex Data Models: Strong ability to understand large and complex data models
Preferred Qualifications
Proficiency with Git
Knowledge of data warehouse principles and methodologies
Experience in data science, advanced analytics, machine learning
Experience with general-purpose programming (e.g., Python, Java, Go), dealing with a variety of data structures, algorithms, and serialization formats
Experience with GCP cloud services and data warehouse stores like BigQuery
Self-driven, highly motivated and able to learn quickly