Remote Finance Data Analyst: Analyze, Model, Summarize
Labelbox 4.3
Remote job
A leading analytics firm is seeking a Finance Associate to support analytical and operational finance work. This remote independent contractor role involves reviewing financial datasets, assisting with model updates, and producing structured summaries. Ideal candidates will have strong analytical and spreadsheet skills and experience in finance or business operations. The position offers a flexible workflow with compensation of $45 to $90 per hour.
#J-18808-Ljbffr
$45-90 hourly 3d ago
Remote Data Analyst - Revenue Insights for B2B SaaS
Hockeystack, Inc.
Remote job
A cutting-edge data analytics firm located in San Francisco is looking for a customer-focused Data Analyst. In this role, you will partner with clients to answer critical business questions regarding revenue performance and marketing optimization. The ideal candidate will possess strong analytical thinking skills and be able to communicate insights effectively. This position offers both in-person and flexible remote options, with a competitive salary range of $40,000 to $120,000 USD depending on experience and qualifications.
$40k-120k yearly 2d ago
Data Analyst - LLM Automation & Scoring (Remote)
Simera
Remote job
We are seeking a Data Analyst with hands‑on experience using Large Language Models (LLMs) such as ChatGPT, Claude, or similar AI tools to analyze, evaluate, and score data at scale. This role focuses on building automated workflows that feed structured and unstructured data into LLMs, interpret outputs, and convert insights into actionable results for business decision‑making.
Key Responsibilities
Use LLMs (ChatGPT, Claude, etc.) to analyze, categorize, and score datasets across various use cases.
Design and develop automated workflows to pipe data into LLM models and retrieve structured outputs.
Work with APIs, data pipelines, and automation tools to streamline LLM processing.
Build and maintain scalable processes for ongoing data ingestion, transformation, and evaluation.
Validate and refine AI outputs to ensure accuracy, consistency, and reliability.
Collaborate with cross‑functional teams (Product, Engineering, Ops) to integrate AI‑driven insights into business processes.
Develop documentation, frameworks, and best practices for AI data analysis and scoring processes.
Monitor model performance, troubleshoot issues, and propose improvements.
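The responsibilities above describe a fairly standard LLM-scoring loop: build a prompt per record, demand structured JSON back, validate the reply, and retry or flag records the model cannot score. A minimal sketch of that workflow follows; the scoring task, JSON shape, and field names are illustrative assumptions, and the `fake_llm` stand-in would be replaced by a real API client (OpenAI, Anthropic, etc.) in production:

```python
import json

def build_prompt(record: dict) -> str:
    """Wrap a record in a scoring prompt that demands structured JSON back."""
    return (
        "Score the following support ticket for urgency on a 0-10 scale.\n"
        'Respond with JSON only: {"score": <int>, "label": "low|medium|high"}.\n\n'
        f"Ticket: {json.dumps(record)}"
    )

def parse_output(raw: str) -> dict:
    """Validate the model's reply; raise if it is not the expected shape."""
    result = json.loads(raw)
    if not (isinstance(result.get("score"), int) and 0 <= result["score"] <= 10):
        raise ValueError(f"bad score in LLM output: {raw!r}")
    return result

def score_records(records, call_llm, max_retries=2):
    """Pipe each record through the LLM and collect validated, structured scores."""
    scored = []
    for rec in records:
        prompt = build_prompt(rec)
        for attempt in range(max_retries + 1):
            try:
                scored.append({**rec, **parse_output(call_llm(prompt))})
                break
            except ValueError:  # covers json.JSONDecodeError too
                if attempt == max_retries:
                    scored.append({**rec, "score": None, "label": "unparseable"})
    return scored

# Stand-in for a real API call so the sketch runs offline.
fake_llm = lambda prompt: '{"score": 7, "label": "high"}'
print(score_records([{"id": 1, "text": "Server down!"}], fake_llm))
```

The validation-and-retry layer is the part the listing emphasizes ("validate and refine AI outputs"): treating the model as an unreliable function and gating its output before it enters downstream systems.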
Required Qualifications
Proven experience working with LLMs (ChatGPT, Claude, Gemini, etc.) for data analysis or automation.
Strong background in data manipulation, processing, and interpretation.
Experience building automated workflows or pipelines (e.g., Python, APIs, Zapier, Airflow, or similar tools).
Ability to structure prompts, evaluate outputs, and optimize model performance.
Familiarity with structured and unstructured data formats (CSV, JSON, text, etc.).
Strong analytical mindset, attention to detail, and problem‑solving skills.
Excellent written and verbal communication skills.
Preferred Qualifications
Experience with NLP applications or AI‑driven analytics.
Background in analytics, data engineering, or automation projects.
Knowledge of SQL, Python, or other scripting languages.
Experience integrating AI solutions into business systems or platforms.
$71k-107k yearly est. 1d ago
Informatica Developer
Soft Tech Consulting, Inc. 3.6
Remote job
MUST BE ABLE TO OBTAIN PUBLIC TRUST
MUST BE A US CITIZEN
REMOTE WORK FOR NOW, BUT COULD RETURN TO ONSITE ANYTIME THIS YEAR
Soft Tech offers competitive BENEFITS in the areas of: MEDICAL, DENTAL, VISION, 401K, Short Term Disability, Long Term Disability, Life Insurance, PTO, AND PAID HOLIDAYS
We are seeking a skilled Informatica Developer and data integration specialist with strong expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS). The ideal candidate will lead the migration of existing data integration workflows from an on-premise PowerCenter environment to the cloud-based IICS platform. This role involves designing, developing, and implementing data integration solutions while ensuring data accuracy, performance, and alignment with business needs.
Responsibilities
Design and develop data integration workflows, mappings, and transformations using Informatica PowerCenter and Informatica IICS.
Lead the migration of existing data integration workflows from an on-premise PowerCenter environment to the cloud-based IICS platform.
Collaborate with business analysts and stakeholders to gather and understand data requirements, translating them into efficient technical designs.
Build complex mappings to load data from multiple sources, incorporating functional logic to meet business needs.
Perform data validation and implement quality checks to ensure data accuracy and integrity.
Document data integration processes, workflows, and solutions comprehensively.
Troubleshoot and resolve issues related to data integration, ensuring optimal performance and scalability.
Qualifications
Minimum of 3 years of experience in data integration and ETL development with Informatica PowerCenter and Informatica IICS.
Strong expertise in designing and implementing data workflows, mappings, and transformations, particularly in leading migrations to cloud-based platforms.
Proficiency in relational databases, SQL, and data modeling concepts.
Advanced knowledge of Sybase, PostgreSQL, and Oracle.
Familiarity with AWS cloud platforms and data warehousing solutions.
Proven ability to work independently and collaboratively in a fast-paced environment.
About Us
Soft Tech Consulting, Inc. is a woman and minority-owned business headquartered in Chantilly, VA. With contracts in both the public and private sectors in the DC metro area and across the country, Soft Tech is an organization made up of highly successful and talented Information Technology professionals offering enterprise class solutions for any size organization at great value. Soft Tech's mission is to help government organizations design, implement, and maintain mission critical Information Technology solutions. By focusing jointly on our employees and our customers, we are able to achieve our mission by providing each and every one of our customers with continuous quality customer support.
Soft Tech Consulting, Inc. is an Equal Opportunity Employer.
$87k-114k yearly est. 3d ago
Petabyte-Scale Data Ingestion Architect (On-Prem/Hybrid)
Kubelt
Remote job
A technology company in San Francisco seeks an expert to design and implement a data pipeline for large-scale microscopy datasets. This role involves architecting storage solutions, ensuring system performance at petabyte scale, and requires extensive experience in high-throughput storage technologies and data integrity. Candidates must be present on-site during the build-out phase and have a solid background in HPC pipelines and networking. Competitive compensation is offered at $100-300/hour.
$118k-166k yearly est. 5d ago
Staff Data Analyst - Remote Data Cleanup & Insights Lead
Findem
Remote job
A technology company in San Francisco is seeking a seasoned Staff Data Analyst to lead data cleanup initiatives across Engineering and Research. You will identify opportunities to improve data quality and leverage your expertise in Python and SQL to transform and cleanse data. With 8-12 years of experience in data analytics, your role includes developing solutions for complex data issues and managing projects across multiple teams. This position offers competitive compensation and generous benefits including unlimited PTO and equity grants.
$107k-152k yearly est. 3d ago
Staff Data Platform Engineer Hybrid - New York City, San Francisco
Vercel.com 4.1
Remote job
About Vercel:
Vercel gives developers the tools and cloud infrastructure to build, scale, and secure a faster, more personalized web. As the team behind v0, Next.js, and AI SDK, Vercel helps customers like Ramp, Supreme, PayPal, and Under Armour build for the AI-native web.
Our mission is to enable the world to ship the best products. That starts with creating a place where everyone can do their best work. Whether you're building on our platform, supporting our customers, or shaping our story: You can just ship things.
About the Role
We are looking for a Principal Engineer to lead the architecture and development of Vercel's next-generation Data Platform. You will design the systems that power data across our products and internal teams, enabling real-time analytics, observability, and future AI/ML capabilities.
This role combines hands-on engineering with technical leadership. You will set the vision for our data ecosystem, build scalable distributed systems using technologies like Kafka, ClickHouse, Tinybird, and Snowflake, and work across the company to align data strategy with product and engineering goals.
What You Will Do
Architect and build a scalable data platform for batch and real-time workloads.
Design streaming and event-driven systems using Kafka and related tooling.
Implement modern lakehouse foundations, including Iceberg-based storage and governance.
Partner with engineering, product, and leadership to define data strategy and technical direction.
Establish best practices for ingestion, modeling, storage, quality, and security.
Write production-grade code and set engineering standards across the team.
Improve reliability through strong observability, monitoring, and fault tolerance.
Drive long-term architectural decisions and evaluate build-vs-buy tradeoffs.
Support analytics and ML teams by delivering high-quality data systems and tooling.
About You
8+ years in data engineering or data architecture, including senior/principal-level work.
Deep experience designing and operating large-scale distributed data systems.
Strong expertise with Kafka and analytics/lakehouse technologies (ClickHouse, Tinybird, Snowflake, Iceberg).
Strong cloud experience (AWS, GCP, or Azure).
Experience with data governance, reliability, and secure data operations.
Excellent communication skills and ability to influence across engineering and product teams.
Demonstrated leadership through mentorship, design ownership, or technical direction.
Benefits
Competitive compensation package, including equity.
Inclusive Healthcare Package.
Learn and Grow - we provide mentorship and send you to events that help you build your network and skills.
Flexible Time Off.
We will provide you the gear you need to do your role, and a WFH budget for you to outfit your space as needed.
The San Francisco, CA base pay range for this role is $196,000‑$294,000. Actual salary will be based on job‑related skills, experience, and location. Compensation outside of San Francisco may be adjusted based on employee location. The total compensation package may include benefits, equity‑based compensation, and eligibility for a company bonus or variable pay program depending on the role. Your recruiter can share more details during the hiring process.
Vercel is committed to fostering and empowering an inclusive community within our organization. We do not discriminate on the basis of race, religion, color, gender expression or identity, sexual orientation, national origin, citizenship, marital status, veteran status, disability status, or any other characteristic protected by law. Vercel encourages everyone to apply for our available positions, even if they don't necessarily check every box on the job description.
$196k-294k yearly 1d ago
Senior Data Analyst | AI & Healthcare Data, Remote US
Doximity, Inc. 3.4
Remote job
A leading healthcare technology company is seeking a Data Scientist in San Francisco to optimize AI products and create analytics for medical professionals. Candidates should have over 5 years of experience and advanced knowledge in statistical concepts and machine learning. This role offers a competitive salary between $170,000 and $248,000, plus comprehensive benefits including health offerings and wellness programs. Join us to make an impact in the healthcare industry.
$170k-248k yearly 4d ago
Remote Senior Data Engineer - AI-Powered Pipelines
Altimate.Ai
Remote job
A leading AI-driven data solutions company in California is seeking a Senior Software Engineer to enhance AI capabilities and build scalable data infrastructures. You will design robust data systems and integrate advanced AI solutions for data teams. Ideal candidates have strong backgrounds in Python, SQL optimization, and experience with cloud platforms. This role offers competitive salary and opportunities for professional development.
$110k-156k yearly est. 1d ago
Staff Data Engineer
Gemini 4.9
Remote job
About the Company
Gemini is a global crypto and Web3 platform founded by Cameron and Tyler Winklevoss in 2014, offering a wide range of simple, reliable, and secure crypto products and services to individuals and institutions in over 70 countries. Our mission is to unlock the next era of financial, creative, and personal freedom by providing trusted access to the decentralized future. We envision a world where crypto reshapes the global financial system, internet, and money to create greater choice, independence, and opportunity for all - bridging traditional finance with the emerging cryptoeconomy in a way that is more open, fair, and secure. As a publicly traded company, Gemini is poised to accelerate this vision with greater scale, reach, and impact.
The Department: Data
At Gemini, our Data Team is the engine that powers insight, innovation, and trust across the company. We bring together world‑class data engineers, platform engineers, machine‑learning engineers, analytics engineers, and data scientists - all working in harmony to transform raw information into secure, reliable, and actionable intelligence. From building scalable pipelines and platforms, to enabling cutting‑edge machine learning, to ensuring governance and cost efficiency, we deliver the foundation for smarter decisions and breakthrough products. We thrive at the intersection of crypto, technology, and finance, and we're united by a shared mission: to unlock the full potential of Gemini's data to drive growth, efficiency, and customer impact.
The Role: Staff Data Engineer
The Data team is responsible for designing and operating the data infrastructure that powers insight, reporting, analytics, and machine learning across the business. As a Staff Data Engineer, you will lead architectural initiatives, mentor others, and build high‑scale systems that impact the entire organization. You will partner closely with product, analytics, ML, finance, operations, and engineering teams to move, transform, and model data reliably, with observability, resilience, and agility.
This role is required to be in person twice a week at either our San Francisco, CA or New York City, NY office.
Responsibilities
Lead the architecture, design, and implementation of data infrastructure and pipelines, spanning both batch and real‑time / streaming workloads
Build and maintain scalable, efficient, and reliable ETL/ELT pipelines using languages and frameworks such as Python, SQL, Spark, Flink, Beam, or equivalents
Work on real‑time or near‑real‑time data solutions (e.g. CDC, streaming, micro‑batch) for use cases that require timely data
Partner with data scientists, ML engineers, analysts, and product teams to understand data requirements, define SLAs, and deliver coherent data products that others can self‑serve
Establish data quality, validation, observability, and monitoring frameworks (data auditing, alerting, anomaly detection, data lineage)
Investigate and resolve complex production issues: root cause analysis, performance bottlenecks, data integrity, fault tolerance
Mentor and guide more junior and mid‑level data engineers: lead code reviews, design reviews, and best‑practice evangelism
Stay up to date on new tools, technologies, and patterns in the data and cloud space, bringing proposals and proof‑of‑concepts when appropriate
Document data flows, data dictionaries, architecture patterns, and operational runbooks
Minimum Qualifications
8+ years of experience in data engineering (or similar) roles
Strong experience in ETL/ELT pipeline design, implementation, and optimization
Deep expertise in Python and SQL, writing production-quality, maintainable, testable code
Experience with large-scale data warehouses (e.g. Databricks, BigQuery, Snowflake)
Solid grounding in software engineering fundamentals, data structures, and systems thinking
Hands‑on experience in data modeling (dimensional modeling, normalization, schema design)
Experience building systems with real‑time or streaming data (e.g. Kafka, Kinesis, Flink, Spark Streaming), and familiarity with CDC frameworks
Experience with orchestration / workflow frameworks (e.g. Airflow)
Familiarity with data governance, lineage, metadata, cataloging, and data quality practices
Strong cross‑functional communication skills; ability to translate between technical and non‑technical stakeholders
Proven experience in mentoring, leading design discussions, and influencing data‑engineering best practices across teams
Preferred Qualifications
Experience with crypto, financial services, trading, markets, or exchange systems
Experience with blockchain, crypto, Web3 data - e.g. blocks, transactions, contract calls, token transfers, UTXO/account models, on‑chain indexing, chain APIs, etc.
Experience with infrastructure as code, containerization, and CI/CD pipelines
Hands‑on experience managing and optimizing Databricks on AWS
It Pays to Work Here
The compensation & benefits package for this role includes:
Competitive starting salary
A discretionary annual bonus
Long‑term incentive in the form of a new hire equity grant
Comprehensive health plans
401K with company matching
Paid Parental Leave
Flexible time off
Salary Range
The base salary range for this role is between $168,000 - $240,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate's compensation, we consider a number of factors including skillset, experience, job scope, and current market data.
In the United States, we offer a hybrid work approach at our hub offices, balancing the benefits of in‑person collaboration with the flexibility of remote work. Expectations may vary by location and role, so candidates are encouraged to connect with their recruiter to learn more about the specific policy for the role. Employees who do not live near one of our hubs are part of our remote workforce.
At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.
$168k-240k yearly 4d ago
Remote Full-Stack Engineer: Data & AI Platform
Cube Dev, Inc.
Remote job
A cutting-edge tech startup is looking for a Full-Stack Engineer to work on its Cube Cloud platform. You will play a vital role in building user-friendly interfaces and robust backend APIs. With a remote work policy, this role offers the chance to contribute to innovative analytics-driven products while collaborating closely with cross-functional teams. The ideal candidate should have strong experience with JavaScript/TypeScript, modern frameworks, and backend technologies, coupled with good communication skills for a product-oriented mindset.
$110k-157k yearly est. 3d ago
Hybrid AI Systems & Data Engineer - Databricks & LLM
Hyperfi
Remote job
An innovative tech firm located in the heart of California is looking for an AI Systems & Data Engineer. This role involves designing and building data pipelines in Databricks, constructing retrieval-augmented generation systems, and working closely with a talented team to integrate AI functionalities. Ideal candidates will have strong experience with Python, LangChain, and Databricks, as well as a passion for tackling complex technological challenges. Enjoy flexible hours in a hybrid work environment.
$110k-157k yearly est. 3d ago
Senior Platform Engineer - Distributed Data Engine (Remote)
Pocus
Remote job
A dynamic tech startup in San Francisco is seeking a Senior Engineer to join the core platform team. The ideal candidate will build a reliable and extensible distributed data platform using AWS, Kubernetes, and Docker. Experience in high-scale data systems is essential as is a collaborative spirit in a remote-first environment. Join us to be part of our mission to revolutionize data usage for go-to-market teams.
$110k-157k yearly est. 4d ago
Remote Senior Robotics Data Curation Engineer
Foxglove
Remote job
A technology company is seeking a highly skilled Product Engineer to join their robotics observability platform. This role involves designing user interfaces, backend services, and data processing for large volumes of sensor data. Candidates should have a minimum of 5 years in high-performance software, experience with REST APIs, and familiarity with technologies like TypeScript and Rust. The position is remote-friendly, with a supportive work environment and strong team dynamics.
$110k-157k yearly est. 3d ago
Platform Engineer, Open-Source Data Engine (Hybrid SF)
DRH Search
Remote job
A well-funded data platform startup is seeking a software engineer to contribute to their open-source data engine. The hybrid role involves working in the SF office three days a week, focusing on both frontend and backend development. Candidates should have 3+ years of software engineering experience and strong skills in React, FastAPI, and Python, with a background in early-stage startups being advantageous. This is a great opportunity to help shape a product with a vision for the best distributed query engine.
$110k-157k yearly est. 1d ago
Remote Lead Data Analyst & Analytics Strategist
Nerdwallet, Inc. 4.6
Remote job
A prominent financial technology firm is seeking a Lead Data Analyst responsible for driving advanced analytics across product and business domains. This role involves leading analytics initiatives, designing measurement strategies, and advising executives on data strategy. The ideal candidate has over 5 years of experience and is proficient in SQL and Looker. The position is remote, offering great benefits to ensure employees' well-being and work-life balance.
$125k-164k yearly est. 5d ago
Senior Azure Data Engineer
Kenexai
Remote job
Kenexai delivers smart, data-driven solutions that empower businesses across industries. Our mission is to combine deep, domain-specific expertise with cutting-edge technology to drive meaningful impact. With a trusted team, consistent quality, and a growing global presence, we remain committed to delivering excellence while staying true to our core values: innovation, integrity, and client success.
Be part of a team that's not just building solutions, but shaping the future with intelligence.
Experience: 4+ Years
Job Type: Full Time
Roles and Responsibilities:
We are seeking a Senior Azure Data Engineer to design, build, and maintain scalable data solutions on Microsoft Azure. In this role, you will be responsible for creating robust data pipelines, optimizing data architectures, and enabling advanced analytics across the organization. You'll play a key role in modernizing our data platform and supporting data-driven decision-making.
Design and implement ETL/ELT pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics.
Build and maintain data lakes, data warehouses, and lakehouse architectures using Azure Data Lake Storage (ADLS) and Delta Lake.
Develop and optimize data workflows for performance, scalability, and reliability.
Implement data ingestion from various sources including APIs, databases, cloud services, and flat files.
Automate infrastructure and pipeline deployment using Azure DevOps, ARM templates, Terraform, or Bicep.
Monitor and manage pipeline health, cost, and performance using Azure Monitor, Log Analytics, and Application Insights.
Integrate and support data security, governance, and compliance using Azure Purview, RBAC, and Data Loss Prevention (DLP) tools.
Work with data scientists, analysts, and business teams to deliver clean, reliable, and timely data.
Collaborate with architects to define standards and implement best practices in Azure-based data engineering.
Mentor junior engineers and participate in code reviews, architecture reviews, and design discussions.
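Several of the duties above ("develop and optimize data workflows", "monitor pipeline health") come down to gating batches with quality checks before promoting them from raw to curated storage. A tool-agnostic sketch follows; in practice this logic would run inside a Databricks notebook or an ADF custom activity, and the field names and thresholds below are illustrative assumptions:

```python
def quality_gate(rows, required_fields, max_null_rate=0.05, min_rows=1):
    """Decide whether a batch may be promoted from raw to curated storage.

    Returns (ok, issues): ok is False if the batch is too small or any
    required field exceeds the allowed null rate.
    """
    if len(rows) < min_rows:
        return False, [f"batch too small: {len(rows)} row(s)"]
    issues = []
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            issues.append(f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.0%}")
    return not issues, issues

batch = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A2", "amount": None},  # missing amount
]
ok, issues = quality_gate(batch, ["order_id", "amount"])
print(ok, issues)  # amount has a 50% null rate, so the batch is rejected
```

Wiring the `issues` list into Azure Monitor or Log Analytics alerts, rather than silently dropping bad batches, is the kind of observability work the listing describes.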
Skill Requirements
Must-Have:
4+ years of experience in data engineering.
Proficiency with Azure Data Factory, Azure Synapse, Databricks, and ADLS Gen2.
Strong SQL and Python or PySpark skills.
Experience with Delta Lake, Parquet, Spark-based data processing.
Hands-on experience with CI/CD pipelines, version control (Git), and infrastructure-as-code tools.
Understanding of data modeling, data governance, and security best practices in Azure.
Nice-to-Have:
Familiarity with Power BI, Azure ML, or Azure Stream Analytics.
Experience with event-driven architectures (e.g., Event Grid, Event Hubs).
Azure certifications (e.g., DP-203, AZ-900, AZ-104) are a plus.
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
Why Kenexai?
Be part of building something from the ground up in a high-growth, high-impact domain
Work alongside passionate experts in AI, data, and industry consulting
Competitive base + uncapped commission structure tied directly to performance
Remote-first flexibility with real ownership and career growth potential
Perks and Benefits:
Employees are entitled to flexible working hours to support work-life balance.
The company operates on a 5-day work week.
A healthy, inclusive, and collaborative work environment is maintained.
The company organizes Fun Fridays and festive celebrations to foster team spirit.
Employees have access to opportunities for continuous learning and career growth.
An annual company trip is organized for team building and relaxation.
Comprehensive medical insurance benefits are provided to employees.
Performance-based bonuses and annual salary revisions are offered.
A hybrid working model is available, allowing a mix of in-office and remote work as per company policy.
$75k-100k yearly est. 2d ago
Remote SDR Growth Leader | Scale Global Sales Development
Influxdata 4.3
Remote job
A leading technology company is seeking an experienced SDR Leader to manage and grow their Sales Development team. This position involves developing strategies to meet sales goals, fostering a high-performance culture, and supporting SDRs in their professional growth. Candidates should have 3 to 6 years of experience in sales development, with at least 3 years in a leadership role. The company offers competitive benefits including medical insurance, flexible time off, and a supportive work environment.
$124k-176k yearly est. 2d ago
Remote Developer Relations Leader for Crypto Portfolios
P2P 3.2
Remote job
Developer Relations - Dragonfly Portfolio
Remote • Portfolio • Full-time
Dragonfly is a global, crypto-native venture capital / research firm with $2B+ in assets under management and 160+ portfolio companies, including many of the leading teams in the space.
This is an application to join our talent network. This is not a listing for an internal role at Dragonfly.
We want to connect with Developer Relations candidates of all experience levels, backgrounds, and specializations. Our portfolio is globally distributed, with remote, hybrid, and in-person roles worldwide, building across the entire crypto ecosystem.
Process
We'll confidentially match you with portfolio companies aligned with your background, skill set, and interests.
If mutual interest exists between you and a team, we'll facilitate a warm introduction.
If there isn't a match today, we'll keep you top of mind for opportunities in the future.
$124k-176k yearly est. 4d ago
Lead, Developer Experience & Content Enablement
Creativefuego
Remote job
A technology platform company in San Francisco seeks a Developer Relations Lead for Content & Enablement. This role involves creating and managing documentation and resources for developers, collaborating with product and engineering teams. The ideal candidate has expertise in technical writing, API design, and content strategy, as well as experience in engaging developer communities. Preference for working in the San Francisco office, with options for remote work within PST time.
Work from home and remote data warehouse developer jobs
Nowadays, it seems that many people would prefer to work from home over going into the office every day. With remote work becoming a more viable option, especially for data warehouse developers, we decided to look into what the best options are based on salary and industry. In addition, we scoured millions of job listings to find all the best remote jobs for a data warehouse developer so that you can skip the commute and stay home with Fido.
We also looked into what type of skills might be useful for you to have in order to get that job offer. We found that data warehouse developer remote jobs require these skills:
ETL
Java
Hadoop
Data warehouse
Visualization
We didn't just stop at finding the best skills. We also found the best remote employers that you're going to want to apply to. The best remote employers for a data warehouse developer include:
Insight Enterprises
Inovalon
CO-OP Financial Services
Since you're already searching for a remote job, you might as well find jobs that pay well because you should never have to settle. We found the industries that will pay you the most as a data warehouse developer:
Insurance
Manufacturing
Health care
Top companies hiring data warehouse developers for remote work
Most common employers for data warehouse developer