Top 50 Data Scientist Skills

Below we've compiled a list of the most important skills for a Data Scientist. We ranked the top skills based on the percentage of Data Scientist resumes they appeared on. For example, 8.1% of Data Scientist resumes contained R as a skill. Let's find out what skills a Data Scientist actually needs in order to be successful in the workplace.

These Are The Most Important Skills For A Data Scientist

1. R
High Demand
Here's how R is used in Data Scientist jobs:
  • Developed promotional pricing optimization procedures.
  • Developed analyses to differentiate oilfield water utilization and production trends by their corresponding geological formation in the greater Permian basin.
  • Collaborated with product management and engineering departments to understand every California Community College needs and devised possible solutions.
  • Improved statistical model performance by using learning curves, feature selection methods, and regularization.
  • Improved communication and presentation skills by participating in weekly VP meetings.
  • Implemented Principal Component Analysis and Linear Discriminant Analysis.
  • Managed project development and resolved issues.
  • Market basket analysis: applied association rules to transaction data and set up the dashboard in R Shiny.
  • Built a model to predict the likelihood of approval of a video by content management.
  • Quantified and modeled equations for produced water in the Texas and New Mexico regions.
  • Focused on applying data mining and machine learning procedures to predict future parking occupancy.
  • Focused on applying data mining and machine learning procedures to predict customer churn.
  • Worked as the Lead Data Scientist in analyzing oilfield water for trends.
  • Worked with the team that deals with investment and mortgage approvals.
  • Gained experience in personnel management, technical writing, project planning.
  • Extracted and analyzed large Telemetry datasets to derive insights on product adoption using data collector architecture using flume and Pig.
  • Built a visualization tool to explore the relationships between different channels and rank them by incoming links from other channels.
  • Developed an application with a user interface for identifying teachers overcharging for GRE/GMAT classes.
  • Developed predictive models to derive a customer health score and automate calls to action.
  • Developed predictive models to derive insights on cross-sell and upsell opportunities.
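The market-basket bullet above can be made concrete with a minimal association-rule miner. This is an illustrative sketch, not the original R/Shiny implementation; the baskets, item names, and thresholds below are invented.

```python
from itertools import combinations

def association_rules(transactions, min_support=0.4, min_confidence=0.6):
    """Mine single-antecedent rules A -> B from a list of transaction sets."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if set(itemset) <= t) / n

    rules = []
    for a, b in combinations(items, 2):
        for ant, con in ((a, b), (b, a)):
            pair_sup = support((ant, con))
            if pair_sup >= min_support:
                conf = pair_sup / support((ant,))
                if conf >= min_confidence:
                    rules.append((ant, con, round(pair_sup, 2), round(conf, 2)))
    return rules

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk", "butter"}, {"bread", "milk"}]
print(association_rules(baskets))
```

Each rule carries its support (fraction of baskets containing both items) and confidence (conditional probability of the consequent given the antecedent), the two measures an R `arules` workflow would report.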

2. Data Analysis
High Demand
Here's how Data Analysis is used in Data Scientist jobs:
  • Design and develop new algorithms for data analysis and prediction; application to battery Performance data.
  • Designed, developed and documented logic for algorithms for data analysis and presentation.
  • Developed new methods and techniques for accurate data analysis and interpretation.
  • Conducted data preparation and exploratory data analysis for feature selection.
  • Performed graph data analysis to provide insights on competitors.
  • Utilized various data analysis and machine learning techniques to extract useful information from large data sets relevant to the mobile app marketplace
  • Programmed in R to perform data analysis for projects in support of the Centers for Disease Control and Prevention.
  • Discovered through exploratory data analysis that over 20% of breakdowns occurred within ±7 days of scheduled routine maintenance.
  • Performed thorough testing and validation of models and supported various aspects of the business with data analysis.
  • Created web-based applications, using R Shiny, as tools for data analysis and visualization.
  • Pioneered use of advanced DBMS for big data analysis in the company.
  • Respond to ad hoc requirements for data analysis in a high-paced investigation environment, including queries and visualization applications.
  • Developed C/C++ implementations of some medical data analysis algorithms that were originally in MATLAB.
  • Applied statistical/machine learning to perform data analysis (R/MATLAB/Python/Perl/Tableau/MicroStrategy).
  • Self-directed research applying data analysis and transformation techniques from spacecraft including Voyager and the Van Allen Probes.
  • Developed statistical learning models for data analysis using SAS, R Data quality control and extraction.
  • Web development and data analysis in Clojure and Datomic
  • Developed framework for Big Data analysis using Pig, Hadoop, R, and traditional RDBMSs.
  • Executed ad-hoc data analysis for customer insights in SQL on an Amazon AWS Hadoop cluster.
  • Assisted in launching the startup's app through content marketing, product testing, and data analysis.
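As a sketch of the breakdown-vs-maintenance finding above, here is one way such a rate could be computed; the dates are invented, and the original analysis details are not public.

```python
from datetime import date, timedelta

def near_maintenance_rate(breakdowns, maintenance, window_days=7):
    """Fraction of breakdowns within +/- window_days of any maintenance date."""
    window = timedelta(days=window_days)
    near = sum(1 for b in breakdowns
               if any(abs(b - m) <= window for m in maintenance))
    return near / len(breakdowns)

maintenance = [date(2024, 1, 1), date(2024, 2, 1)]
breakdowns = [date(2024, 1, 5), date(2024, 1, 20),
              date(2024, 2, 3), date(2024, 3, 1)]
print(near_maintenance_rate(breakdowns, maintenance))  # 0.5
```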

3. PL/SQL
High Demand
Here's how Pl/Sql is used in Data Scientist jobs:
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Designed and developed Oracle 11g PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Created SQL tables with referential integrity and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Created stored procedures using PL/SQL and tuned the databases and backend process.
  • Designed and created backend data access modules using PL/SQL stored procedures.
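The referential-integrity bullet can be illustrated portably. This sketch uses Python's built-in sqlite3 rather than Oracle PL/SQL (the syntax differs, but the constraint idea is the same), and the table and column names are invented.

```python
import sqlite3

# Tables with referential integrity: employees.dept_id must reference
# an existing departments.dept_id row.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
CREATE TABLE departments (
    dept_id   INTEGER PRIMARY KEY,
    dept_name TEXT NOT NULL
);
CREATE TABLE employees (
    emp_id   INTEGER PRIMARY KEY,
    emp_name TEXT NOT NULL,
    dept_id  INTEGER NOT NULL REFERENCES departments(dept_id)
);
""")
conn.execute("INSERT INTO departments VALUES (1, 'Analytics')")
conn.execute("INSERT INTO employees VALUES (10, 'Ada', 1)")
try:
    # No department 99 exists, so the database rejects this row.
    conn.execute("INSERT INTO employees VALUES (11, 'Bob', 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```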

4. Python
High Demand
Here's how Python is used in Data Scientist jobs:
  • Improved curriculum materials in python, machine learning and statistical inference.
  • Trained neural network model for prediction of approval of credit card for a given client using python libraries.
  • Used Python as a programming tool to analyze public MTA turnstile data to make an informed decision.
  • Developed a distributed file system to perform merchant tagging for 100Mil+ daily transactions (Python).
  • Cross-trained members of other teams in SQL/Python/R and in producing various audience profiling-related outputs.
  • Develop Python and SQL code to query, process, and analyze data.
  • Scraped raw data from different websites using Python for projects use.
  • Instructed colleagues in Python and modules relevant to the research.
  • Web scraped an extensive amount of data with Python.
  • Develop data science tools and capabilities in Python.
  • Write code in Python, R, and Pig.
  • Implemented production machine learning models in automated pipelines including C, C++, and Spark (Python and Scala).
  • Participated in an intensive statistics, visualization, and machine learning training in both R and Python.
  • Queried data from a SQL database with a very complex data model, using Python and Django.
  • Transitioned Python prototype to C# production environment hosted on Azure.
  • Led data team infrastructure for business intelligence reporting and analysis in Python and Redshift.
  • Completed a 12-week immersive data science bootcamp focused on python programming, machine learning, statistics and visualization.
  • Leveraged Python to create customized APIs for geolocation targeting from images and text in PDF files.
  • Created a game with Pygame in Python and extracted and manipulated a cloud-based dataset using MySQL.
  • Developed APIs using Node.js and Python.
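A toy version of the MTA turnstile analysis mentioned above: parse the data and aggregate entries by station. Real turnstile files come from the MTA's public data feed; the stations and counts here are invented.

```python
import csv
import io
from collections import defaultdict

# Stand-in for a downloaded turnstile CSV.
raw = """station,entries
Times Sq,1200
Union Sq,800
Times Sq,900
Grand Central,1500
"""

totals = defaultdict(int)
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["station"]] += int(row["entries"])

busiest = max(totals, key=totals.get)
print(busiest, totals[busiest])  # Times Sq 2100
```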

5. Analytics
High Demand
Here's how Analytics is used in Data Scientist jobs:
  • Design and build custom investment analytics for institutional clients using their data assets and our proprietary statistical engines.
  • Evaluated potential vendors and products to enable continued maturity of analytics tools and safeguard data integrity.
  • Designed and applied statistical and mathematical methods for corporate analytics that were implemented into client-facing products.
  • Created visibility by developing an interactive dashboard and designed a Patient Access analytics tracker.
  • Designed predictive analytics algorithms for ad-targeting, product recommendations, and buy/no-buy.
  • Design algorithms for competitive analytics based on click-stream and review data.
  • Coordinate tactical-related business requests for Data Analytics development enhancements.
  • Help the merchants improve their profits by using Data Analytics to understand trends in the data.
  • Work with clients from wide range of industries to refine analytics requirements and create project proposals.
  • Maintain consistent data ingestion from sources for data analytics vision, strategy and programs.
  • Created standard framework and operating rhythms as well as other analytics related resources.
  • Developed text analytics program with a team of developers for market research.
  • Build data analytics program in the domain of commercial real estate.
  • Worked on establishing new Analytics team within the Enterprise.
  • Understand the entire customer journey with native text analytics.
  • Engineered a real-time analytics news feed.
  • Provide reference architecture insights and designs to partner analytics and engineering teams.
  • Extracted Information Analytics results from GSA filesystem.
  • Utilized freemium web analytics services offered by Google Analytics for reporting.
  • Led Surescripts data intelligence practice with cross-cutting Business and Functional focus on Data analytics, Analytics Systems Integration and Business Analytics.

6. Algorithms
High Demand
Here's how Algorithms is used in Data Scientist jobs:
  • Implemented algorithms to analyze credit card purchases in order to provide specialized recommendation to customers based on their purchase history.
  • Worked on, multiple projects to leverage statistical learning/machine learning algorithms to automate Alternate Asset Servicing.
  • Refine and improve processes and algorithms with technical input from investigative analysts.
  • Designed experiments to test algorithms that hypothesize consumer behavior.
  • Performed statistical modeling validation on proprietary machine learning algorithms.
  • Developed predictive algorithms for supervised learning.
  • Learned data processing techniques and machine learning algorithms for regression, classification, and clustering problems.
  • Developed custom software to enhance the outcome of machine learning algorithms.
  • Implemented statistical learning algorithms to predict the demand of finished goods.
  • Developed algorithms to detect fraudulent users in client programs.
  • Adapted algorithms to suit my problem and evaluated results.
  • Increased the veracity of machine learning algorithms.
  • Used ML algorithms to predict customer growth.
  • Developed mathematical optimization models and exact/heuristic solution algorithms for scheduling testing tasks on prototype vehicles during new vehicle development.
  • Developed software modules with text searching algorithms to discover critical gene sequences for cardiovascular diseases.
  • Designed the implementation of ensemble-based algorithms for targeted promotion distribution.
  • Designed weight algorithms and used clustering algorithms to make proper recommendations to users.
  • Applied machine learning algorithms such as decision trees and Naïve Bayes.
  • Have trained complex learning algorithms on large and intricate datasets.
  • Leveraged supervised machine learning algorithms, such as Multinomial Naïve Bayes, to classify unknown text.
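The Naïve Bayes text-classification bullets can be sketched from scratch: a minimal multinomial Naïve Bayes with Laplace smoothing. The training documents and labels below are invented for illustration.

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (label, tokens). Returns (priors, word_counts, vocab)."""
    priors, counts, vocab = Counter(), defaultdict(Counter), set()
    for label, tokens in docs:
        priors[label] += 1
        counts[label].update(tokens)
        vocab.update(tokens)
    return priors, counts, vocab

def predict(model, tokens):
    priors, counts, vocab = model
    n_docs = sum(priors.values())

    def log_score(label):
        total = sum(counts[label].values())
        score = math.log(priors[label] / n_docs)  # class prior
        for w in tokens:
            # Laplace (add-one) smoothing avoids zero probabilities.
            score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        return score

    return max(priors, key=log_score)

docs = [("spam", ["win", "cash", "now"]), ("spam", ["cash", "prize"]),
        ("ham", ["meeting", "at", "noon"]), ("ham", ["lunch", "at", "noon"])]
model = train(docs)
print(predict(model, ["cash", "now"]))      # spam
print(predict(model, ["meeting", "noon"]))  # ham
```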

7. Hadoop
High Demand
Here's how Hadoop is used in Data Scientist jobs:
  • Implemented distributed algorithms in the Hadoop environment using Hive MapReduce.
  • Experience in developing custom Map Reduce Programs in Java using Apache Hadoop for analyzing Big Data as per the requirement.
  • Worked on ingesting company data into Hadoop to form a data lake and analyzing it using DDL statements.
  • Used Hadoop - HIVE to fit the complete data and HIVE queries to perform Data Munging.
  • Delivered Internal Training classes on Big Data Hadoop, Spark to ramp up the teams.
  • Established partnerships with top Big Data vendors, DataStax/Cassandra, DataBricks/Spark and 3 Hadoop distributions.
  • Gained firm understanding of Hadoop architecture, which involves data processing in various nodes.
  • Worked on Hadoop cluster and data querying tools Hive to store and retrieve data.
  • Worked on industry specific examples of Big Data using Map Reduce and Hadoop.
  • Partnered with ETL team to extract data from Hadoop environment.
  • Used Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Experience with SQL and T-SQL/Hadoop database.
  • Propose Models for Personalized Consumer Purchase Recommendation Data Processing and Feature Extraction on Hadoop
  • Developed a predictive model on driver behavior based on the journey details and integrated with Hadoop Stream Analytics.
  • Extract data from Hadoop, Teradata, SQL server, Oracle database and others.
  • Build with Maven POM and Jenkins scripting working with Hadoop/HBase.
  • Used Hive, shell scripting, and Hadoop.
  • Applied machine learning, predictive modeling, statistical analysis, and Big Data techniques (MapReduce/Hadoop, Pig, Spark, etc.).
  • Improved overall speed by 30% by parallelizing Hadoop jobs using Apache Oozie.
  • Used scikit-learn, NumPy, Hadoop, Hive, and memoization.
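The MapReduce pattern behind the Hadoop bullets can be shown in a single process. Hadoop distributes the same map/shuffle/reduce steps across cluster nodes; this sketch runs them locally on invented lines of text.

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    """Map step: emit (word, 1) for every word in a line."""
    for word in line.split():
        yield (word.lower(), 1)

def reducer(word, counts):
    """Reduce step: sum all counts emitted for one key."""
    return (word, sum(counts))

lines = ["the quick brown fox", "the lazy dog", "the fox"]

# Shuffle step: group intermediate pairs by key, as the framework would.
groups = defaultdict(list)
for word, one in chain.from_iterable(mapper(l) for l in lines):
    groups[word].append(one)

result = dict(reducer(w, c) for w, c in groups.items())
print(result["the"], result["fox"])  # 3 2
```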

8. Big Data
High Demand
Here's how Big Data is used in Data Scientist jobs:
  • Examined benefits information through big data; analyzed approximately two billion records.
  • Worked on unsupervised machine learning algorithms for Big Data analysis.
  • Designed Big data system using appropriate strategies based on data formats, data ingest rate, cross database migration.
  • Promoted within to a Data Scientist role to focus on strategic initiatives and interpret big data into business insights.
  • Leveraged Tableau, SQL, Big Data, R, and Java to analyze and make proposals.
  • Open source project to build a Bayesian Markov Chain Monte Carlo solver for big data using Spark.
  • Performed big data analysis and studied collection models for the defaulted customer accounts using Markov models.
  • Created scaled data architecture for developing the tool in a big data environment.
  • Presented "Are you asking the right questions of Big Data?"
  • Experience with Big data and handling huge Data Sets.
  • Educate and train fellow and future Big Data Scientists.
  • Develop methodologies to streamline workflow processes to implement and operate both technical and non-technical Big Data technologies and techniques.
  • Provided consulting, strategic advice and insights in Data Strategy, Big data analytics for a major banking and telecommunication client.
  • Accelerated and supported ongoing activities in the field of Big Data and Smart Analytics at the innovation lab.
  • Used Google Analytics big data and K-means clustering to group unknown data and develop a recommendation engine.
  • Focused on designing and deploying predictive models built with data science techniques for Big Data and the Hadoop ecosystem.
  • Work with different business units to understand the business demands with respect to Big Data and Analytics.
  • Prioritized business cases using metrics and help the CIO invest in big data domain based on ROIs.
  • Multithreaded parallel computing optimized for Big Data analytics on Spark.
  • Provide technical consulting on machine learning and analytics to bring new PARC big data analytics technology to market.
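The K-means clustering bullet can be sketched in one dimension: alternate assigning points to their nearest center and moving each center to its cluster mean. The points and starting centers below are invented; real projects would use Spark MLlib or scikit-learn.

```python
def kmeans_1d(points, centers, iterations=10):
    for _ in range(iterations):
        clusters = {c: [] for c in centers}
        for p in points:  # assignment step: nearest center wins
            nearest = min(centers, key=lambda c: abs(c - p))
            clusters[nearest].append(p)
        # update step: move each center to its cluster mean
        centers = [sum(ps) / len(ps) if ps else c for c, ps in clusters.items()]
    return sorted(centers)

points = [1.0, 1.2, 0.8, 9.0, 9.5, 10.0]
print(kmeans_1d(points, centers=[0.0, 5.0]))  # converges to ~[1.0, 9.5]
```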

9. Web Application
High Demand
Here's how Web Application is used in Data Scientist jobs:
  • Developed Web Applications for customers in different industries.
  • Develop back-end web application, such as query, data visualization, and real time data process for data service.
  • Led efforts to build a cloud-scale web application to enable end-users to interact with clustering results.
  • Created and hosted a web application to interact with the model using REST calls.
  • Create client specific web applications for real time political trend reporting and predicting.
  • Design and build user-interactive web application for engineers' daily practice.
  • Developed SVM NLP classifier into a Python-Django based web application.
  • Create MVP for startup: web application using genetic algorithms for text optimization (ad texts, translations, etc.)
  • Developed an iOS and web application to assist doctors prescribing dialysis and allow nurses easy access to the prescriptions.
  • Worked on backend for a web application that served a data stream.
  • Programmed web application as a subcontractor for Alvarez & Marsal LLC.
  • Used SAX and DOM parsers to parse the raw XML documents Used RAD as Development IDE for web applications.
  • Key accomplishments: implemented R and created company-wide policies and procedures, including the creation of RStudio Shiny web applications.

10. SAS
High Demand
Here's how SAS is used in Data Scientist jobs:
  • Developed and performance tuned various SAS modules hosting reporting applications.
  • Analyze business problems and design Statistical models using Regression and Machine Learning, using SAS, R, and H2O.
  • Solved and explained discrepancies in financial health insurance data using SAS software and applying data management techniques.
  • Developed Predictive Clinical Score Index for severe morbidity after coronary artery surgery (SAS).
  • Utilize SAS and Toad systems to harvest and analyze data through statistical models and reports.
  • Created complex reports utilizing SAS, Microsoft Word, Microsoft Excel, and R studio.
  • Edited raw data and created SAS data sets for statistical analysis for project/business decisions.
  • Developed statistical analysis subsystem DBSAS with 28 statistical procedures for the HP DBMS Image/1000.
  • Implemented Preoperative Clinical Severity Score Index for coronary artery bypass patients (SAS).
  • Improved efficiency by 98% with the use of SAS for data processing.
  • Coded data input, cleansing and verification modules (SAS, SQL).
  • Construct statistical models using R and SAS.
  • Created data extraction SAS codes from different sources.
  • Worked on applying security constraints for SSAS models.
  • Developed quick models using SAS and SPSS.
  • Used predictive modeling with tools in SAS, SPSS, R, Python.
  • Used SQL (a SAS language), Visual Basic, and Splunk to analyze customer data.
  • Created and modified new and existing SAS programs and produced ad hoc reports.
  • Utilized skills in software applications such as SAS Enterprise Miner/Guide and R/Excel/JMPIn.
  • Utilized Vertica RDBMS, and SAS / R logistic regression to develop and operationalize lead scoring and opportunity models.

11. Data Visualization
High Demand
Here's how Data Visualization is used in Data Scientist jobs:
  • Implemented clinical reporting programs that were utilized by both clinical and data management teams which aided in data visualization and reporting.
  • Provided clients with high quality data visualizations including static plots, animations, and interactive Shiny applications.
  • Provided training for executives on statistical analysis, Tableau, data visualization and storytelling best practices.
  • Specialized in developing intuitive user interfaces and data visualization tools within a team environment.
  • Create reports by using data visualization on Tableau to provide strategy recommendations.
  • Translate all data to statistical reports with data visualizations.
  • Build web interface for data visualization and analysis
  • Deployed data-driven tools to enable a large restaurant chain reduce food-waste by 15% through demand forecasting and data visualizations.
  • Applied Large scale and low latency Machine learning to Non-parametric models, Fraud detection models, High dimensional data visualization.
  • Created minimum viable survey data visualization dashboard with MongoDB, R, and Shiny package
  • Performed data exploration, preparation, and data visualization with R and SAS.
  • Used R Shiny and ggplot2 for data visualization.
  • Perform an EDA and data Visualization using Tableau.
  • Developed data visualization materials and packaged communications on data analytics for presentation to technical and non-technical audiences.
  • Introduced new data visualization and statistical analysis tools in our analytics framework.
  • Work with various data analysis programs for statistical deep dive analytics and data visualization using programs such as QlikView.
  • Performed data visualization with Tableau and D3.js, and generated dashboards to present the findings.
  • Performed data visualization on the front end by using SAP Lumira.
  • Created and implemented automated data visualizations for tracking organizational KPIs by creating interactive visualizations and dashboards using Tableau and Microstrategy.
  • Implemented an interactive data visualization and narrative with what-if capabilities for the Family Planning team.

12. Data Science
High Demand
Here's how Data Science is used in Data Scientist jobs:
  • Consulted clients on data science and statistics, database development and design, and machine learning (classification, regression, etc.).
  • Led/managed software engineering and data science teams (ranging from 3-8 scientists and engineers).
  • Helped promote data science within R&D and all BV products.
  • Implement data science and engineering techniques to bring insight to difficult data.
  • Lead efforts to implement Big Data and Data Science practices.
  • Supervised interns in data science and data engineering projects.
  • Specialized in Data Science and Machine Learning.
  • Collaborated with marketing and technical professionals across organizations to translate business requirements into data science questions and actions.
  • Collaborated with several multidisciplinary groups of my peers to participate in Kaggle data science competitions.
  • Led competitor benchmarking study for data science implementation.
  • Ranked numerous data science books using self-implemented Massey's, Colley's and Markov Chain Method (AKA PageRank).
  • Performed data science and engineering for keyword bidding and assorted retail marketing needs, as needed.
  • Used R/Java/Python/C++ to write data science algorithms and managed a data science team to devise value-added projects.
  • Create analytics and data science infrastructure from scratch, including all required tools and applications.
  • Provide analytical and statistical support, design experiments, and build data science products.
  • Studied and presented several tools helpful in data science (e.g., Docker).
  • Performed side projects using data science, analysis, predictive analytics, forecasting, optimization, research, and statistical mythbusting.
  • Tasked with heading up initiative to employ data science and statistical modeling techniques within the eBusiness analytics team.
  • Presented findings to News Corp data science team, product owners, vendors, and local meetups.
  • Studied data science techniques and principles under experienced industry professionals in an immersive 12-week program.

13. Predictive Models
High Demand
Here's how Predictive Models is used in Data Scientist jobs:
  • Developed explanatory/ predictive models using independent variables (manufacturing process variables, raw material attributes) to predict critical parameters.
  • Researched methods to improve statistical inferences of variables across models and developed statistical, mathematical and predictive models.
  • Designed multiple research studies, developed targeted profiles through quantitative methods including predictive models.
  • Designed and Developed online advertising predictive models for JenJo's proprietary trading platform.
  • Skilled with using R to build predictive models that update continuously, mine historical data, and predict future outcomes.
  • Worked on multiple predictive models to predict future electricity and gas bills based on the current usage patterns.
  • Created predictive models to increase the revenue of the company by studying the demographics and usage patterns.
  • Developed predictive models using Logistic regression, Decision Tree, Random Forest and KNN algorithms.
  • Implemented predictive models in python to improve click-through rate for an email marketing campaign.
  • Developed predictive models using regression, C5.0, decision lists, and decision trees.
  • Developed predictive models using Decision Tree, Random Forest, and Naïve Bayes.
  • Engineered new features for predictive models to improve the cross-device attribution graph.
  • Created predictive models using Bayesian and machine learning techniques.
  • Developed analytics and predictive models to quantify vehicle fuel consumption.
  • Project Description: This project is about extracting data from various data sources and performing preprocessing operations and building predictive models.
  • Develop and implement predictive models in R and Python, produce data visualizations and online reports using JavaScript and Python.
  • Build predictive models dealing with very granular data stored in Hadoop/ Spark and other big data platforms.
  • Created predictive models with GE Transportation to identify factors that impact fuel efficiency of locomotives; presented findings at the Analytics Exchange conference.
  • Trained, evaluated, and analyzed predictive models on multiple large administrative datasets using machine learning algorithms and regression techniques.
  • Developed end-user applications and predictive models to better understand the firm's SEO data.
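A decision stump (a one-split decision tree) is the smallest member of the tree/forest family listed above. This sketch fits one on invented data: a single risk score predicting a binary outcome.

```python
def fit_stump(xs, ys):
    """Find the threshold on x that minimises misclassifications of binary y."""
    best = None
    for t in sorted(set(xs)):
        for flip in (False, True):  # try both orientations of the split
            preds = [(x >= t) != flip for x in xs]
            errors = sum(p != y for p, y in zip(preds, ys))
            if best is None or errors < best[0]:
                best = (errors, t, flip)
    _, t, flip = best
    return lambda x: (x >= t) != flip

# e.g. predict default (True) from an invented risk score
xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
ys = [False, False, False, True, True, True]
stump = fit_stump(xs, ys)
print(stump(0.85), stump(0.15))  # True False
```

A random forest is, conceptually, many deeper versions of this learner trained on bootstrapped samples and voting together.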

14. Logistic Regression
High Demand
Here's how Logistic Regression is used in Data Scientist jobs:
  • Developed machine learning models utilizing Logistic Regression and Random Forests to identify human characteristics and behavior to derive striking insights.
  • Designed and developed data wrangling and visualization techniques as well as a classification engine based on Logistic Regression.
  • Developed a multiple logistic regression prediction model to evaluate quality of mortgages.
  • Experience with supply chain management analyzing data with logistic regression model.
  • Image suitability before downloading is determined using logistic regression.
  • Implemented logistic regression algorithm and measured the model accuracy.
  • Performed logistic regression to evaluate probability of beneficiaries to buy new waste management services in the DR.
  • Developed a logistic regression predictive model that estimates the probability of the future default of loan applicants.
  • Worked on statistical models like Regression, Logistic Regression, SVM, Linear Models and Random Forests.
  • Performed logistic regression within each cluster on other predictors like image size, name, and caption.
  • Used R to pull large-scale data and help build logistic regressions to predict customer churn rate.
  • Sampled data and developed the predictive model using logistic regression in SAS.
  • Used Decision tree CART and Logistic Regression to identify the loan defaulters.
  • Constructed, fitted and diagnosed a logistic regression model for cancer data.
  • Used Logistic Regression to obtain the probabilities for non-defaulters and defaulters.
  • Performed analyses using classification trees, principle component analysis, and logistic regression.
  • Used logistic regression (PROC LOGISTIC), clustering (PROC CLUSTER), and multivariate modeling to provide valuable analytical insights.
  • Developed a Logistic Regression based filter that uses merchant transactions data to block false positive merchant alerts.
  • Trained a model for multi-label data; implemented a logistic regression classifier on 34 GB of data using Apache Spark.
  • Implemented a proximal stochastic gradient descent with a line search to fit a regularized logistic regression in Scala.
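A from-scratch sketch of logistic regression fitted by gradient descent on the log-loss. The projects above used SAS, R, or Spark; this toy single-feature dataset is invented.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Minimise mean log-loss for p(y=1|x) = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradient of the mean log-loss w.r.t. w and b.
        grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# e.g. probability of loan default given a single invented risk feature
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(round(sigmoid(w * 3.8 + b), 3))  # high default probability
```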

15. ETL
Average Demand
Here's how ETL is used in Data Scientist jobs:
  • Utilize SAS programming software and ETL techniques to manage and maintain U.S. foreign assistance and trade capacity building databases for USAID.
  • Optimize and automate ETL for Attribution, which eliminates 85% manual work and saves 50% database storage.
  • Lead the transition from a single monolithic ETL process to more modular processes using Spark ML data pipelines.
  • Interfaced with large scale database system through an ETL server for data extraction and preparation.
  • Fixed broken ETL processes, reducing production system downtime from 2 hours daily to zero.
  • Developed an ETL pipeline to predict soil properties across the entire United States.
  • Perform data-extraction, transformation and loading (ETL) development and programming.
  • Coded and deployed this ETL pipeline to be run in batch mode.
  • Evaluate cloud based ETL/ELT tools and perform a pilot on AWS.
  • Ensured each ETL process had a corresponding Luigi job.
  • Developed an ETL pipeline for this project.
  • Created a NetLogo model of traffic flow at a toll booth to analyze the impact of various toll booth configurations.
  • Prepare data model & ETL framework to integrate SEN data sources, Healthcheck and Alerts from equipment.
  • Established ETL (& ELT) using Pentaho; maintained several Tableau dashboards.
  • Performed ETL on health insurance claims data, using Python and Postgres SQL.
  • Created Hive views and exposed to ETL tool through Oozie engine.
  • Developed custom Conda packages of proprietary ETL processes for these deploys.
  • Designed and implemented data ETL pipeline Python/Impala/httpfs/Luigi/Parquet for RTB analytics.
  • Generated a high-level report of purchases for the executives using input from SAP HANA & ETLs.
  • Used Informatica PowerCenter for ETL, IBM Cognos and Tableau for reporting.
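The extract-transform-load pattern behind most of the bullets above can be sketched in a few lines of Python. This is a minimal illustration, not any of the systems described: the order data, the cleaning rules, and the in-memory SQLite "warehouse" are all invented stand-ins.

```python
import sqlite3
from io import StringIO

import pandas as pd

# Extract: read raw records (an in-memory CSV stands in for a real source).
raw = StringIO("order_id,amount,country\n1,10.5,us\n2,,de\n3,7.0,us\n")
df = pd.read_csv(raw)

# Transform: drop rows with missing amounts and normalize the country code.
df = df.dropna(subset=["amount"])
df["country"] = df["country"].str.upper()

# Load: write the cleaned frame into a SQLite table (stand-in for a warehouse).
conn = sqlite3.connect(":memory:")
df.to_sql("orders", conn, index=False)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 17.5
```

Real pipelines add scheduling (Luigi, Oozie) and monitoring around this core, but the three phases stay the same.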

383 ETL Jobs

16. SQL
average Demand
Here's how SQL is used in Data Scientist jobs:
  • Involved in performing extensive Testing by writing T-SQL queries and stored procedures to extract the data from Database.
  • Retrieved and modified data using SQL Server 2008 and BIDS and acquired disease information from different databases.
  • Analyzed large amounts of data using statistical techniques and implemented through SQL queries to derive actionable insights.
  • Generated periodical reports using SQL and transformed large amount of data as per the business requirement.
  • Write complex SQL queries to create views and obtain data for reporting in SSRS.
  • Trained and coached 32 business users Power/Casual in Tableau and SQL visualization techniques.
  • Utilized SQL server's reporting services, SSRS, to support reporting requirements.
  • Generated exploratory analysis using R, SQL and create a prediction model.
  • Pulled and manipulated data from SQL databases for presentation to customers.
  • Develop GraphSQL's Next Generation Graph Query Language.
  • Acquired current data by TOAD client using SQL queries
  • Utilized SQL to access and organize data.
  • Implemented social network app in NoSQL Cassandra.
  • Experience with Amazon EC2, Apache, PHP and MySQL.
  • Transitioned data stored on Mainframe Systems (IBM HOD) to MS SQL Server for centralized storage and processing.
  • Gathered the customer demographics, call center & billing data from the corporate databases using SQL.
  • Mapped driver geolocations to roads for analysis using PostGIS and PostgreSQL.
  • Designed a merging solution for the acquired companies' data in SQL.
  • Worked on R, Tableau, SQL and XLMiner; familiar with @Risk.
  • Build a recommender systems product from scratch through GraphSQL's graph analytics platform.
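Many of the reporting bullets above boil down to an aggregation query. A minimal, self-contained example against an in-memory SQLite database (the call-record table and its rows are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE calls (customer_id INTEGER, minutes REAL);
INSERT INTO calls VALUES (1, 12.0), (1, 3.5), (2, 8.0);
""")

# Aggregate minutes per customer -- the kind of GROUP BY behind many reports.
rows = conn.execute(
    "SELECT customer_id, SUM(minutes) AS total FROM calls "
    "GROUP BY customer_id ORDER BY total DESC"
).fetchall()
print(rows)  # [(1, 15.5), (2, 8.0)]
```

The same query shape runs unchanged on SQL Server, Teradata, or Postgres; only the connection layer differs.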

2,467 SQL Jobs

17. Support Vector Machines
average Demand
Here's how Support Vector Machines is used in Data Scientist jobs:
  • Machine learning, statistics, survival analysis, support vector machines, neural nets, graph theory.
  • Used Support vector machines for classification of data in groups.
  • Gained familiarity with K-means clustering and support vector machines.
  • Utilized Support Vector Machines, Logistic/Linear Regression, Nearest Neighbors, Naive Bayes Classifiers.
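A minimal sketch of SVM classification with scikit-learn; the synthetic two-cluster task and the RBF-kernel choice are illustrative, not drawn from any of the projects above.

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two well-separated synthetic clusters make an easy binary task.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit an SVM with an RBF kernel and check held-out accuracy.
clf = SVC(kernel="rbf").fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
assert accuracy > 0.9
```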

66 Support Vector Machines Jobs

18. Hdfs
average Demand
Here's how Hdfs is used in Data Scientist jobs:
  • Worked on importing data from various sources and performed transformations using Map Reduce, Hive to load data into HDFS.
  • Worked on Linux shell scripts for business process and loading data from different interfaces to HDFS.
  • Extracted data from HDFS and prepared data for exploratory analysis using data munging.
  • Performed various Unix operations to import data to HDFS and run Map-Reduce jobs.
  • Worked with developers to extract data from HDFS to Spark shell for analysis.
  • Interfaced with Data Architecture teams using HDFS distributed systems.
  • Involved in extracting data from source to HDFS.
  • Dumped (ETL) to HDFS.
  • Installed and configured Hadoop, MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and processing.
  • Cleaned and transformed these web data from json file to data frame, and loaded into Hive table in HDFS.
  • Experience with configuration, management of Hadoop, HDFS clusters and MapReduce based algorithm development under UNIX/LINUX environment.
  • Have built and configured HDFS with User-level and Job-level disk storage, User Security using Kerberos, SSL.
  • Downloaded images are stored in HDFS; metadata about the images is stored in SQL Server.
  • Applied Sqoop and Flume for data transfer from HDFS to a relational database.
  • Performed Exploratory Data Analysis using R and Hive on Hadoop HDFS.
  • Loaded unstructured data into Hadoop File System (HDFS).
  • Configured MR/Sqoop jobs to import data from RDBMS into HDFS.
  • Applied Apache HiveQL to analyze weblog data collected and stored in Hadoop HDFS.
  • Worked on Data Ingestion using Sqoop from Oracle to Hdfs.
  • Compared ingestion performance from Spark to MemSQL, Spark to HDFS Parquet, and Spark to Ignite.

70 Hdfs Jobs

19. Natural Language Processing
average Demand
Here's how Natural Language Processing is used in Data Scientist jobs:
  • Worked on Natural Language Processing with NLTK module of python for application development for automated customer response.
  • Use natural language processing and topic clustering analysis with Latent Dirichlet Allocation and Non-negative Matrix Factorization.
  • Developed internal natural language processing software to analyze product sentiment trends from these mentions.
  • Positioned as team expert in Natural Language Processing and Market Segmentation projects.
  • Design algorithms and systems for natural language processing (NLP) based feature extraction from social media postings.
  • Created word cloud graphics using R and natural language processing (NLP) to provide insight.
  • Apply natural language processing to interpret legal rules using Text Mining and Data Mining methods.
  • Performed natural language processing on users' reviews, incorporated with enhance star rating average.
  • Focused on issues around data science and especially natural language processing at scale.
  • Applied some basic natural language processing techniques to process social media data.
  • Machine Learning, Natural Language Processing, Data Mining and Data Visualization.
  • Create speech recognition and natural language processing components of the product.
  • Develop natural language processing solutions (named entity recognition, keyword extraction).
  • Used standard natural language processing techniques, graph theory, and linear algebra to cluster, compare, and analyze films.
  • Conducted Natural Language Processing and machine learning analysis (Bayesian, logistic, SVM, Random Forest, etc.).
  • Worked on a POC project for NLP(Natural Language processing) with Lowes Review Data.
  • Assisted in teaching the Applied Natural Language Processing course at gU.
  • Architected a 2,000-term cyberwarfare ontology and tripled the technology ontology to 9,000 entities to drive natural language processing analysis.
  • Created a measure of clinical distance between patients by applying natural language processing techniques (word2vec) to diagnosis codes.
  • Applied natural language processing to the multi-part diagnosis text for diagnosis classification (such as Benign/Malignant).
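Nearly every pipeline above starts with tokenization. A minimal, dependency-free sketch of that first step (the product reviews are invented):

```python
import re
from collections import Counter

reviews = [
    "Great product, works great",
    "Terrible battery, terrible support",
]

# Tokenize each review into lowercase word tokens -- the usual first NLP step.
def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

# Bag-of-words counts over the whole corpus.
counts = Counter(token for review in reviews for token in tokenize(review))
```

From these counts one can build term-frequency vectors, feed a Naive Bayes sentiment classifier, or compute TF-IDF weights; libraries like NLTK and gensim, mentioned above, provide more careful tokenizers than this regex.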

383 Natural Language Processing Jobs

20. Neural Networks
average Demand
Here's how Neural Networks is used in Data Scientist jobs:
  • Included decision trees, support vector machines, genetic programming, neural networks, distance correlation and mixture models.
  • Used supervised learning techniques such as classifiers and neural networks to identify patterns in these data sets.
  • Develop and modify Neural Networks and machine learning for pattern and image recognition.
  • Developed diagnostic tests for neural networks used in prediction.
  • Created web app that utilized convolutional neural networks to identify actors in user supplied images.
  • Implemented supervised learning algorithms such as Neural networks, SVM, Decision trees and Naïve Bayes for advanced text analytics.
  • Researched neural networks for examining the crime problem in Chicago by looking through public data sets.
  • Introduced Decision Trees, Neural Networks, LASSO & Quantile Regression techniques to the client targeting process.
  • Applied neural networks models (RNN, Feedforward, Auto encoder) on multiple projects.
  • Implemented .NET microservice technology to allow various neural networks to share/co-evolve information faster and more intelligently.
  • Advanced Text analytics using Deep learning techniques such as convolutional neural networks to determine the sentiment of texts.
  • Experienced in Artificial Neural Networks (ANN) and Deep Learning models using the Theano, TensorFlow and Keras packages in Python.
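A toy feedforward network trained with backpropagation, written in plain NumPy so the mechanics are visible. The architecture (one hidden layer of four sigmoid units), the XOR task, and the learning rate are all illustrative choices, not any project above; real work would use Keras or TensorFlow as the bullets mention.

```python
import numpy as np

# XOR: the classic task a single linear model cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))  # input -> 4 hidden sigmoid units
W2 = rng.normal(size=(4, 1))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    hidden = sigmoid(X @ W1)
    return hidden, sigmoid(hidden @ W2)

_, out = forward(X, W1, W2)
loss_before = float(np.mean((out - y) ** 2))

lr = 0.5
for _ in range(5000):
    hidden, out = forward(X, W1, W2)
    # Backpropagate the squared-error gradient through both sigmoid layers.
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * (hidden.T @ d_out)
    W1 -= lr * (X.T @ d_hidden)

_, out = forward(X, W1, W2)
loss_after = float(np.mean((out - y) ** 2))
assert loss_after < loss_before  # training reduced the error
```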

421 Neural Networks Jobs

21. Pandas
average Demand
Here's how Pandas is used in Data Scientist jobs:
  • Performed data modeling operations using Power Bi, Pandas, and SQL.
  • Analyzed NYC Subway Turnstile data for traffic patterns using pandas.
  • Utilized tools including Tableau, Pandas, SQL, Flask, JS, D3, MongoDB, and AWS.
  • Used Python libraries (like Pandas and Scikit-learn) to import, clean and analyze data, and Tableau for visualization.
  • Automated data reporting using Python (Pandas), R, and Redshift (SQL).
  • Predicted audience scores on Rotten Tomatoes using BeautifulSoup and Python (pandas, scikit-learn).
  • Perform EDA using SQL and Pandas.
  • Utilized data science packages such as scikit-learn, pandas and Jupyter to optimize student accessibility.
  • Scraped data acquired using BeautifulSoup and Selenium and analyzed it using the pandas module.
  • Developed combination of Extractive and Abstractive summarizer using python packages (Pandas, Beautiful Soup, etc.)
  • Experienced in Python scikit-learn, Pandas, R, SAS, Teradata, Skytree, and Hadoop.
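The clean-and-aggregate pattern these bullets describe takes only a few lines of pandas. A minimal sketch with invented turnstile-style data (the station names and counts are illustrative):

```python
import pandas as pd

df = pd.DataFrame({
    "station": ["34 St", "34 St", "Times Sq", "Times Sq"],
    "entries": [120, None, 300, 260],
})

# Fill missing counts, then aggregate per station -- a typical clean + summarize step.
df["entries"] = df["entries"].fillna(0)
summary = df.groupby("station")["entries"].sum()
print(summary.to_dict())  # {'34 St': 120.0, 'Times Sq': 560.0}
```

Whether filling missing values with zero (rather than dropping or imputing) is right depends on what a gap in the data actually means.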


22. K-Means
average Demand
Here's how K-Means is used in Data Scientist jobs:
  • Clustered retailers using k-means algorithm and identified different sales drivers for different groups of retailers.
  • Leveraged k-means to optimize media costs and improve advertising budget management system.
  • Implemented Support Vector Machine (SVM), Logistic regression model, K-means clustering for predictive analysis using python Scikit-Learn 0.18.
  • Used R to develop k-means, random forest and decision tree models to classify the members.
  • Created a recommendation system using k-means clustering, NLP and Flask to identify the potential customers.
  • Used R to do K-means clustering over the metric defined by latitude and longitude.
  • Utilized clustering method of K-means clustering to categorized patients into groups via packages in Spark.
  • Used k-means clustering for profiling customers, location, product, season, etc.
  • Applied k-means and hierarchical clustering (using R) on the above data.
  • Worked with GMM and K-means algorithms to perform clustering on sensors measurements.
  • Applied Clustering Algorithms such as K-Means to categorize customers into certain groups.
  • Created clusters for customer type and vehicle type using k-means clustering.
  • Use cluster analysis (k-means) for market segmentation.
  • Segmented the customers based on demographics using K-means Clustering.
  • Implemented public segmentation using unsupervised machine learning, implementing the k-means algorithm in PySpark.
  • Used the Scikit-learn k-means algorithm to cluster news articles for the different state banking holidays together.
  • Use CHAID, Apriori, K-Means, SVM Classification algorithm for prediction of opportunities
  • Used clustering technique K-Means to identify outliers and to classify unlabeled data.
  • Applied clustering algorithms (e.g., hierarchical, K-means) with the help of scikit-learn and SciPy.
  • Used recency, frequency and monetary (RFM) indices to perform customer-segmentation using k-means and k-medoids clustering.
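A minimal k-means sketch with scikit-learn on synthetic "customer" data; the two invented spending groups stand in for the real segmentation tasks above, and the check at the end verifies each group lands in its own cluster.

```python
import numpy as np
from sklearn.cluster import KMeans

# Two artificial customer groups: low spenders and high spenders.
rng = np.random.default_rng(0)
low = rng.normal(loc=[10, 1], scale=1.0, size=(50, 2))
high = rng.normal(loc=[100, 9], scale=1.0, size=(50, 2))
X = np.vstack([low, high])

# Fit k-means with k=2 and read off the cluster assignment per point.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_

# Each synthetic group should fall entirely within a single cluster.
assert len(set(labels[:50])) == 1 and len(set(labels[50:])) == 1
```

In practice k is unknown and is chosen with the elbow method or silhouette scores, and features are scaled first so one dimension does not dominate the distance.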


23. AWS
average Demand
Here's how AWS is used in Data Scientist jobs:
  • Installed, configured and maintained DNS systems using Route53 (AWS) and used JENKINS for continuous integration and continuous delivery.
  • Conduct regular training on how to use AWS, the command line, and python for data science.
  • Developed, tested, and fielded the supporting database and web application on an Amazon AWS LAMP stack.
  • Used AWS EC2 and RDS instances to scale across hundreds of virtual machines.
  • Developed platform requirements, designed prototype by using AWS.
  • Extracted data from various web page interfaces via AWS.
  • Used Amazon AWS for cloud computing.
  • Experience in Amazon EC2; skills used: Python, R, SQL, Java, AWS.
  • Designed the schema, configured and deployed AWS Redshift for optimal storage and fast retrieval of data.
  • Determined new correlations between business categories in the YELP dataset using SAP HANA on AWS Marketplace.
  • Worked on AWS S3 buckets and intra cluster file transfer between PNDA and s3 securely.
  • Deployed on AWS spot instances and persisted data to PostgreSQL on RDS.
  • Perform gap analysis and proposing a scalable architecture on AWS.
  • Created and administered Google and AWS Hybrid Cloud infrastructure.
  • Gained proficiency in Python, SQL, d3, and AWS by working on 5 end to end projects.
  • Configured and administered Hadoop/Zookeeper/EC2 compute clusters on AWS/Cloudera.
  • Create ETL pipelines on AWS, load data into Redshift, create visualizations in Periscope Data using SQL queries.
  • Developed MapReduce/Spark Python modules for machine learning & predictive analytics in Hadoop on AWS.
  • Spark, Scala, Deeplearning4j, Cloudera, EMR, AWS.
  • Cloud systems: AWS, Hadoop, MapReduce, Debian VMs.

425 AWS Jobs

24. Data Quality
average Demand
Here's how Data Quality is used in Data Scientist jobs:
  • Identified patterns, data quality issues, and opportunities and leveraged insights by communicating opportunities with business partners.
  • Established processes and methods for continuous data quality improvement of production software system.
  • Lead and implemented data quality solutions.
  • Accelerated data quality to 90%, slashed reporting time by 87%, and reduced costs by more than 135K.
  • Analyzed, verified, and modified UNIX, SAS, and Python scripts to improve data quality and performance.
  • Identified patterns, data quality issues, and opportunities that led to high recognition from Sales Dept.
  • Develop triggers to identify potential data quality issues cross the AMEX network in Big Data environment.
  • Participated in continuous interaction with Marketing and Finance teams for obtaining the data and data quality.
  • Implemented ETL data streams from end-to-end, delivered data points and checked data quality regularly.
  • Analyzed, verified, and modified existing SQL queries to improve data quality and performance.
  • Checked data distribution, missing rate and other data quality properties for each vendor data.
  • Managed Data quality & integrity using skills in Data Warehousing, Databases & ETL.
  • Provided data quality metrics and feedback to clinical monitors and study site personnel.
  • Engaged clients on data quality issues and helped correct their submissions retrospectively.
  • Investigate new data sources and identify data quality issues and potential value
  • Identified patterns, data quality issues, and opportunities.
  • Developed a data quality checking tool in Python.
  • Provided ad-hoc analysis on data quality, experiment design, and business questions.
  • Performed data quality check, Univariate and Bivariate analysis to validate the hypothesis.
  • Led the in-stream data trend analysis that successfully supported the data quality and integrity of the Ofatumumab database.
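One of the simplest data-quality metrics mentioned above is the per-column missing rate. A minimal sketch with pandas; the claims data and the 25% alert threshold are invented for illustration:

```python
import pandas as pd

df = pd.DataFrame({
    "claim_id": [1, 2, 3, 4],
    "amount": [100.0, None, 250.0, None],
    "state": ["NY", "CA", None, "TX"],
})

# Fraction of missing values per column: isna() gives booleans, mean() averages them.
missing_rate = df.isna().mean()

# Flag columns whose missing rate exceeds an agreed threshold.
issues = missing_rate[missing_rate > 0.25]
print(issues.to_dict())  # {'amount': 0.5}
```

Production checks add rules for duplicates, out-of-range values, and referential integrity, but the flag-against-threshold pattern is the same.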

420 Data Quality Jobs

25. Scikit-Learn
average Demand
Here's how Scikit-Learn is used in Data Scientist jobs:
  • Performed data imputation using Scikit-learn package in Python.
  • Improved fraud prediction performance by using random forest and gradient boosting for feature selection with Python Scikit-learn.
  • Participated in features engineering such as feature creating, feature scaling and One-Hot encoding with Scikit-learn.
  • Classified trade journal and newspaper articles using Scikit-learn's k-nearest neighbor algorithm.
  • Used Python's scikit-learn library to understand details of client telemarketing campaign.
  • Participated in features engineering such as feature intersection generating, feature normalize and label encoding with Scikit-learn preprocessing.
  • Scaled the link prediction experimentation from a specific decision tree classifier to any classifier within Spark ML/MLlib and scikit-learn.
  • Developed in python using gensim and scikit-learn libraries and deployed the application using docker.
  • Build recommendations/predictions platform using Python, Django, Postgres and scikit-learn.
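A common scikit-learn idiom tying several of the bullets above together is a pipeline, so preprocessing is refit inside each cross-validation fold instead of leaking test data into the scaler. The Iris dataset here is a stand-in for real data:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Scaling + model as one estimator; cross_val_score refits both per fold.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(pipe, X, y, cv=5)
assert scores.mean() > 0.9
```

Swapping the final step for an SVC, a random forest, or a gradient-boosting model leaves the rest of the code unchanged, which is why pipelines show up in so many of these projects.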


26. Numpy
average Demand
Here's how Numpy is used in Data Scientist jobs:
  • Developed Python code for data analysis (also using NumPy and SciPy), Curve-fitting.
  • Developed scripts in Python (Pandas, Numpy) for data ingestion, analyzing and data cleaning.
  • Performed Data Cleaning, features scaling, features engineering using pandas and numpy packages in python.
  • Worked on data cleaning and ensured data quality, consistency, integrity using Pandas, Numpy.
  • Experience in Data wrangling tasks on Insurance data using python libraries Numpy and pandas.
  • Performed data processing using Python libraries like Numpy and Pandas.
  • Cleaned data using numpy and pandas.
  • Performed exploratory data analysis like statistical calculation, data cleaning and data visualizations using Numpy, Pandas and Matplotlib.
  • Generated graphical reports using the Python packages NumPy and Matplotlib.
  • Developed machine learning models for health insurance claims data, using R and Python libraries like numpy, pandas and scikit-learn.
  • Implemented the customer choice model described in the paper Restricted Boltzmann Machines Modeling Human Choice using Theano and NumPy.
  • Work on outliers identification with box-plot, K-means clustering using Pandas, NumPy.
  • Used Pandas, numpy, matplotlib, Tableau for data analysis
  • Architected and prototyped item recommendation system using python, scikit-learn, numpy
  • Created analytical reports for the CEO using SQL and Python (pandas, numpy, scipy, scikit-learn).
  • Utilized the Python libraries wxPython, NumPy, Twisted and Matplotlib; also used libraries like Beautiful Soup.
  • Used pandas, numpy, seaborn, scipy, matplotlib, scikit-learn in Python for developing various machine learning algorithms.
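A small example of the vectorized style NumPy enables: z-score outlier detection with no explicit loop over elements. The sensor-style values and the 2-sigma threshold are illustrative choices:

```python
import numpy as np

values = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 25.0])  # one obvious outlier

# Vectorized z-scores: arithmetic applies elementwise across the whole array.
z = (values - values.mean()) / values.std()

# Boolean-mask indexing keeps only the points beyond 2 standard deviations.
outliers = values[np.abs(z) > 2]
print(outliers)  # [25.]
```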


27. Teradata
average Demand
Here's how Teradata is used in Data Scientist jobs:
  • Acted as DBA for the cluster and trained subordinates on how to plan and architect databases and use Teradata Aster Database software.
  • Developed Python programs for manipulating the data reading from various Teradata, update the Content in the database tables.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Created multiple custom SQL queries in Teradata SQL Workbench to prepare the right data sets for Tableau dashboards.
  • Performed data analysis and data profiling using complex SQL on various sources systems including Teradata, SQL Server.
  • Involved in migration projects to migrate data from data warehouses on Oracle/DB2 and migrated those to Teradata.
  • Developed SQL queries to extract data from a Teradata database to provide reports to clients of Precision.
  • Utilized SPSS, SAS, SAS Enterprise, and Teradata to perform analyses on millions of customers.
  • Converted programs from Teradata SQL to SAS to automate the monthly production processes.
  • Developed SAS/SQL and Teradata/SQL queries for automation of reports in SAS and SSRS.
  • Prepared SQL scripts for ODBC and Teradata servers for analysis and modeling.
  • Installed, upgraded, and maintained the Teradata Aster Database software.
  • Used Teradata utilities such as Fast Export, Multi LOAD for handling various tasks.
  • Worked on Teradata SQL queries and Teradata Indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Created interactive dashboards in Tableau using Tableau Desktop pulling the data from Hadoop, SQL Server, and Teradata.
  • Developed End-to-End workflow to integrate the ELT/ETL from Source Systems to Hadoop and Teradata using Informatica BDE.
  • Perform tuning of Redshift database to match with performance of "on premise" Teradata implementation.
  • Build advance analytics Solutions on Big Data Hybrid platform- Hadoop (Hortonworks) Data Lake and Teradata Aster Cluster.
  • Developed "Guide to Using Informatica Power Center in a Teradata Aster nCluster Environment."
  • Managed data using DB2 and Teradata; modeled the conditionalities environment of Bolsa Família.

64 Teradata Jobs

28. Mapreduce
average Demand
Here's how Mapreduce is used in Data Scientist jobs:
  • Developed MapReduce pipeline for feature extraction using Hive.
  • Handled importing data from various data sources, performed transformations using MapReduce, Hive and loaded data into HDFS.
  • Utilized Hive and MapReduce to process and link millions of rows of data from multiple data sources.
  • Implemented Theta Join from structural database for large scale tables through MapReduce programming model utilizing Java.
  • Used Maven extensively for building jar files of MapReduce programs and deployed to Cluster.
  • Experience in writing MapReduce programs with Java API to cleanse Structured and unstructured data.
  • Implemented MapReduce jobs to clean and wrangle customer data based on client-specific rules.
  • Worked on machine learning on large size data using Spark and MapReduce.
  • Developed multiple MapReduce jobs in java for data cleaning and preprocessing.
  • Tested MapReduce jobs with MRUnit and scheduled works in OOZIE.
  • Supported MapReduce Programs running on the Hadoop cluster.
  • Created and ran daily jobs in MapReduce, [ ] Pig, Hive, etc.
  • Used R and MapReduce to conduct data analytics; conducted data modeling in Cassandra.
  • Participated in the design and implementation of the company's data systems migration to a distributed processing environment, with experience in Hadoop Hive/Pig, Redshift and MapReduce.
  • Used Python scripts to update the content in the database and manipulate files. Environment: Hadoop, MapReduce, Python.
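The map-shuffle-reduce phases underlying all the jobs above can be sketched in pure Python with the classic word count. A real framework distributes each phase across a cluster and handles failures; this sketch deliberately omits all of that to show only the data flow:

```python
from collections import defaultdict

lines = ["the quick brown fox", "the lazy dog", "the fox"]

# Map phase: emit (word, 1) pairs from each input line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle phase: group values by key, as the framework does between map and reduce.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: sum the counts for each word.
counts = {key: sum(values) for key, values in groups.items()}
print(counts["the"], counts["fox"])  # 3 2
```

In Hadoop the mapper and reducer are separate programs and the shuffle is done by the framework; in Spark the same logic is a `flatMap` followed by `reduceByKey`.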

342 Mapreduce Jobs

29. Mongodb
average Demand
Here's how Mongodb is used in Data Scientist jobs:
  • Collected unstructured data from MongoDB and completed data aggregation.
  • Created data acquisition and cleaning protocol using the newest technology including MongoDB, Python, R, and AWS
  • Utilized MongoDB to access and organize data (https://github.com/Hbl15/Beauty-Product-Recommendation-Engine; app website: http://beauty3.ninja/).
  • Collected 5 million customer records from MongoDB and MySQL.
  • Worked on NOSQL databases such as MongoDB and Cassandra.
  • Experience with NoSQL databases such as MongoDB.
  • Worked on NOSQL databases like MongoDB.
  • Developed Hadoop to MongoDB integration.
  • Created geocoded fire incidents data warehouse based on MySQL and MongoDB.
  • Deployed on the web using Flask, jQuery, Bootstrap, and Plot.ly from stored data in a MongoDB database.

401 Mongodb Jobs

30. Large Data
low Demand
Here's how Large Data is used in Data Scientist jobs:
  • Used data mining and non-parametric methods to analyze very large data sets (over 5,000 rows) using R.
  • Navigated large data sets from a variety of sources and compiled them into a centralized database.
  • Build automated data conditioning tools for use on large data sets.
  • Developed large data sets from structured and unstructured data.
  • Value added modeling Manage Large Datasets
  • Research machine learning algorithms, implement them by tailoring to particular business needs, and test on large datasets.
  • Advised business partners regarding best practices on how to utilize large datasets to make better data-driven decisions.
  • Managed simultaneous projects, large data sets and strict timelines and communicated results to the management.
  • Developed SAS codes for creating new variables, data cleaning and transformation of large datasets.
  • Established automated processes for transforming and cleaning large datasets using SQL and Python.
  • Managed data science projects to derive meaningful insight from large datasets.
  • Analyze large datasets to provide strategic direction to the company.
  • Carry out various statistical analyses of large data sets.
  • Implemented Cassandra cluster to store large data.
  • Worked on large datasets, transforming information from raw data into meaningful analysis that identifies trends and predicts outcomes.
  • Distributed and multi-threaded development for efficient processing of large data sets End-to-end product development and integration.
  • Developed a data governance framework for the creation and storage of large datasets, including internal data and Mixpanel events.
  • Utilized the sparklyr and H2O packages for high-performance computing, to reduce the time to process large datasets.
  • Created large datasets by combining individual datasets using various inner and outer joins in SAS/SQL and dataset merging techniques of SAS/BASE.
  • Machine Learning: price incrementality, regression models, large data in SQL, R programming.
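A common way to handle a dataset too large for memory, short of a cluster, is to stream it in chunks and keep only a running aggregate. A sketch using pandas' `chunksize`; the small in-memory "file" stands in for a dataset that would not fit in RAM:

```python
from io import StringIO

import pandas as pd

# A small in-memory CSV stands in for a file too large to load at once.
csv = StringIO("id,amount\n" + "\n".join(f"{i},{i % 10}" for i in range(1000)))

# Stream the file 100 rows at a time; only one chunk is ever in memory.
total = 0.0
for chunk in pd.read_csv(csv, chunksize=100):
    total += chunk["amount"].sum()

print(total)  # 4500.0
```

Aggregations, filters, and incremental model updates all fit this pattern; operations that need the whole dataset at once (a global sort, say) are what push teams to Spark or SQL.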

6 Large Data Jobs

31. Decision Trees
low Demand
Here's how Decision Trees is used in Data Scientist jobs:
  • Implemented Decision trees instead of Cluster analysis to identify conditions.
  • Developed audience extension models relying on decision trees, random forest, Support Vector Regression, and other continuous-data techniques.
  • Implemented Classification using Supervised algorithms like Logistic Regression, Decision trees, KNN, Naive Bayes.
  • Performed Market-Basket Analysis and implemented decision trees, random forests and K-fold cross validation.
  • Used Decision trees and Random forests to find employee attrition rate.
  • Used Logistic regression, LDA & random forest decision trees.
  • Developed audience extension models relying on decision trees, random forest, logistic regression, and other categorical-data techniques.
  • Used variety of analytical tools and techniques (regression, logistic, GLM, decision trees, machine learning etc.)
  • Used Regression Decision Trees, Neural network and Time series analysis.
  • Implemented ctree algorithm of Decision trees and Random Forest.
  • Implemented user segmentation using Decision Trees and K-means Clustering, prototyped dynamic visualization of clustering results in R shiny and Plotly.
  • Developed data pre-processing modules and rule extraction engines in R using Random Forest and Decision Trees for an analytics product.
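A minimal decision-tree sketch with scikit-learn on synthetic data; the shallow `max_depth=3` is an illustrative choice that keeps the splits interpretable and limits overfitting, which is the usual reason trees appear in the projects above.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification task standing in for real tabular data.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A shallow tree: each path from root to leaf is a readable rule.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
accuracy = tree.score(X_test, y_test)
assert accuracy > 0.7
```

A random forest, mentioned throughout these bullets, is an ensemble of such trees fit on bootstrap samples; it trades the single tree's interpretability for accuracy and stability.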

214 Decision Trees Jobs

32. API
low Demand
Here's how API is used in Data Scientist jobs:
  • Assisted business owners in developing marketing strategies to capitalize on the statistical analysis results.
  • Lead product design and user experience for data ingest and the python API components of the product.
  • Extracted data features by transforming, merging and reshaping data frames with python modules.
  • Web scraping/crawling to collect additional data from Web pages using different tools.
  • Project includes substantial data wrangling, web scraping and Google Books API interaction.
  • Experience with web scraping and content extraction packages in Python.
  • Performed as specialist on intelligent scraping and machine learning.
  • Conducted post audits of completed capital projects.
  • Worked on creating APIs for scoring modules of various products using the Flask micro-framework, the Adprod tool and EC2 instances.
  • Evaluated and finalized the API using F1-score performance metrics; the shortlisted API was used to set up batch processing of interactions.
  • Conduct Trading Service, Capital Market, and Commercial Loan reporting visualization on a monthly basis using Spotfire.
  • Mined and extracted data from various websites using scraping tools like Talend, Data Miner and more.
  • Implemented a cloud-hosted API framework to integrate R and Python models with the Ruby production backend.
  • Implemented scheme integration via REST API for all applications as micro-services 3.
  • Implemented APIs to reach wider customer base.
  • Used R and Google.api in a geolocation program to identify sites for time sensitive drug clinical trials.
  • Used Django and django-rest-framework to provide API access to reports.
  • Developed methods to enforce scraping etiquette, which supported multithreaded requests while protecting against significant utilization of server resources.
  • Design and implemented RESTful API, email notification, and secure https communication between server and Android.
  • Prototyped a Facebook Chatbot app using Facebook's Messenger API and node.js.

229 API Jobs

33. Linear Regression
low Demand
Here's how Linear Regression is used in Data Scientist jobs:
  • Performed Lasso and Ridge Regularization on Logistic, Linear Regression Model.
  • Applied predictive modeling techniques such as linear regression, logistic regression, cluster analysis, decision trees and time series analysis.
  • Machine learning algorithms such as Linear Regression, Logistic Regression, Bayesian methods and unsupervised methods such as K-Means clustering algorithm.
  • Applied linear regression model to predict the time it would take for failed postings to be resolved and successfully posted.
  • Created a linear regression model that indicated IMDb rankings as an important factor in predicting movie gross.
  • Used MSE, RMSE and Mean Absolute Error to evaluate the Linear Regression model.
  • Direct mail model: applied logistic regression to direct mail response and linear regression to customer spend.
  • Utilized statistical methods in linear regression, sample survey, generalized linear models, etc.
  • Worked with machine learning algorithms like linear regressions (linear, logistic, etc.)
  • Fit Logistic Regression and Linear Regression models on training data, using scikit-learn.
  • Used linear regression, k-means clustering, and decision trees for modeling.
  • Used linear regression to predict ratings of movies based on books.
  • Applied linear regression on data and predicted the sales.
  • Developed Linear Regression to predict Actor Box Office Success.
  • Price perception & elasticity studies with Log linear regressions.
  • Used linear regression, ARMA, ARIMA, k-means, decision trees for modeling.
  • Performed cross-validation-test on linear regression model of data using scikit-learn.
  • Used various statistical analysis techniques (hypothesis tests, linear and multiple linear regression models, clustering) for segmentation and prediction.
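A minimal linear-regression sketch with scikit-learn. The income/spend data is simulated with a known slope of 2.0 built in, so the fitted coefficient should recover it; none of the numbers come from the projects above.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulate: spend ~ 2.0 * income + 5, plus a little Gaussian noise.
rng = np.random.default_rng(0)
income = rng.uniform(20, 100, size=(200, 1))
spend = 2.0 * income[:, 0] + 5 + rng.normal(0, 1, size=200)

# Ordinary least squares recovers the slope used in the simulation.
model = LinearRegression().fit(income, spend)
print(round(model.coef_[0], 1))  # 2.0
```

Evaluation metrics like MSE, RMSE, and MAE, mentioned above, then measure how far the model's predictions fall from held-out observations.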

6 Linear Regression Jobs

34. Data Collection
low Demand
Here's how Data Collection is used in Data Scientist jobs:
  • Build and implement data collection and machine learning analysis strategy for online direct product marketing on mobile devices
  • Develop and implement data collection systems that optimize prediction and relevance accuracy in a recommendation system.
  • Collaborated with software engineer to optimize data collection efficiency and deploy prediction model on website.
  • Manage raw data collection, preparing and validating raw data for statistical analysis.
  • Performed extensive implicit as well as explicit data collection.
  • Develop and initiate more efficient data collection procedures.
  • Developed optimized data collection and qualifying procedures.
  • Participated in all phases of data mining; data collection, data cleaning, developing models, validation and visualization.
  • Understand all phases of the analytic process including data collection, preparation, modeling, evaluation, and deployment.
  • Developed original techniques for this technology, and implemented all necessary code, data collection, and database design.
  • Covered different tasks ranging from data collection, signal processing, to publication writing to meet deadlines.
  • Developed software for data collection, analysis and experiments automation control which saved the lab $20K.
  • Lead research team throughout data collection, experimental design, and results delivery stages.
  • Build the internal data collection platform for 200+ tenants which fuels product/design decisions.
  • Perform data collection, cleaning, imputing, stitching and balancing of data.
  • Advised on strategic data collection and data mining of network traffic flows to relieve operational bottlenecks.
  • Developed data collection specifications for contracted studies.
  • Lead person for data collection and munging, creating database, performing all kinds of analytics, forecasting, and predictions.
  • Research designed on Qualtrics for data collection on student experience.

35. BI (low demand)
Here's how BI is used in Data Scientist jobs:
  • Created dashboard design to maximize visibility of all resource used by different research projects that combines both financial and operational data.
  • Optimized markdown schedules and increased profitability of lower-performing products in cooperation with the merchandising team.
  • Conducted usability tests with quantitative and qualitative data to validate interface designs.
  • Put together 3 Big Data 3-day courses including all material (250p book and 200p slides) and tutorial.
  • Worked with BI team in data investigation, responsible for interpreting data variables, making instructions and data dictionaries.
  • Investigate new data and methodologies that may improve current marketing performance when used in combination with internal models.
  • Helped with identifying strengths and weaknesses of other BI tools for our new Customer BI tool initiative.
  • Worked with the organization s founders to educate them about the capabilities and limitations of predictive modeling.
  • Work on the Corporate Insights team which uses big data techniques to drive efficiency within the organization.
  • Developed a system for early detection of big stories using on-line content and social networks activity.
  • Co-designed algorithms optimizing price/demand elasticity relationships in the product chain, using a hybrid of graph-theoretic and probabilistic modeling approaches.
  • Implemented ROI impact for acquisition campaigns to determine overall campaign effectiveness and ability to compare results across multiple campaigns.
  • Used gathered data to pinpoint and categorize magnetospheric instabilities.
  • Implemented various techniques for addressing bias-variance trade-offs (e.g., cross- validation, regularization, dimensionality reduction, etc.).
  • Trained a team of junior data scientists and data engineers on Big Data techniques, Machine Learning and Analytics.
  • Managed activities on mobile app projects whose core is grounded on business tools for encryptions and password management.
  • Translate big data into actionable insight and communicate the results to customer service teams and product leadership.
  • Utilized various BI, data modeling and reporting tools to track engineering tasks supporting new product development through each project lifecycle.
  • Worked on combining Argus consortia data with publicly available datasets, such as taxi trip data from the TLC, to produce unique insights.
  • Led genomic diagnostic test development; explored and visualized genomic big data to find patterns and build models.

36. Data Warehouse (low demand)
Here's how Data Warehouse is used in Data Scientist jobs:
  • Performed performance improvement of the existing Data warehouse applications to increase efficiency of the existing system.
  • Conducted one-to-one sessions with business users to gather data for Data Warehouse requirements.
  • Translated operational rules into ETL and data warehouse requirements.
  • Performed data analysis on the existing data warehouse.
  • Provided inputs to development team in performing extraction, transformation and load for data marts and data warehouses.
  • Created and optimized processes in the Data Warehouse to import retrieve and analyze data from the CyberLife database.
  • Worked as Big Data Architect and data engineer to ensure business value is achieved in 1.5M data warehouse.
  • Manage EFA data warehouse and develop data acquisition, ETL and data quality control framework and software.
  • Maintained SQL scripts to create and populate tables in data warehouse for daily reporting across departments.
  • Used T-SQL queries to pull the data from disparate systems and Data warehouse in different environments.
  • Managed data warehouse (DW) release deployment and maintenance in an Agile development environment.
  • Assist R enterprise implementation in the enterprise oracle data warehouse platform.
  • Implemented and maintained a data warehouse used by over 130 employees.
  • Build ETL scripts and provided Oracle Data Warehouse tables design.
  • Experience extracting from Data Warehouse using HIVE and Impala.
  • Optimized OLAP system for MSN's Data Warehouse.
  • Re-engineered data warehouse within 8 weeks.
  • Re-engineered healthcare data warehouse, increasing performance 10-fold.
  • Designed the ETL process to extract, transform, and load data from an OLTP Oracle database into a Teradata data warehouse.
  • Write SQL and Python code to build analytical support applications, Data Marts and Data Warehouses.
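
The SQL-plus-Python warehouse pattern in these bullets can be sketched with the standard library's sqlite3 module; the staging table, mart table, and figures below are hypothetical stand-ins for a real warehouse:

```python
import sqlite3

# Minimal ETL sketch: stage raw rows, transform, load into a reporting mart
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1999), (2, 550), (3, 1250)])

# Transform (cents -> dollars) and load into the reporting table in one step
conn.execute("""CREATE TABLE mart_orders AS
                SELECT id, amount_cents / 100.0 AS amount_usd
                FROM raw_orders""")
total = conn.execute("SELECT SUM(amount_usd) FROM mart_orders").fetchone()[0]
print(f"total loaded: ${total:.2f}")
```

A production pipeline would swap the in-memory connection for a real warehouse driver and run the transform incrementally, but the extract/transform/load shape is the same.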

37. Matlab (low demand)
Here's how Matlab is used in Data Scientist jobs:
  • Improved model performance by over 78% to the baseline model using R and MATLAB.
  • Review current technology for storing and processing Big Data including Hadoop, SPARK, MATLAB, and IBM BigInsights.
  • Developed software based analysis technique to detect water toxins in a closed-loop water purification plant (C++/Matlab).
  • Performed quality assurance analysis on spectrometer performance and accuracy. Used: MATLAB, Excel, SQL, Python.
  • Implement the methodology using MATLAB, Python and R * Cross-modality comparison and cross-validation.
  • Used R statistical software, Matlab, and Octave.
  • Designed and implemented an instrumented glove for assessment of spasticity with Python and Matlab.
  • TOOL EVALUATION: Evaluated QlikView, Hyperion, MicroStrategy, Demantra, MATLAB, Minitab, etc.

38. Linux (low demand)
Here's how Linux is used in Data Scientist jobs:
  • Improved scoring algorithm using statistical factor analysis; deployed and maintained the hosting server in a Linux environment.
  • Developed and designed SQL procedures and Linux shell scripts for data export/import and for converting data.
  • Developed a Linux and Windows python application for HR software to measure employee productivity.
  • Work with all programs including Microsoft office suite (Advanced Excel) in Linux/Unix.
  • Supported Apache Tomcat web server on Linux Platform.
  • Used Python/Cron to create a class on our Linux server that refreshed studio facing Tableau dashboards automatically and efficiently.
  • Created prototypes for IT using Python, bash scripts, Linux/cygwin utilities, C/C++ and Java.
  • Created visualizations using R. Environment: Linux, Hadoop, MySQL, R, RStudio.
  • Developed Linux/C++ code to perform text analysis and topic classification of 737 systems logbook data.
  • Participated in installation of SAS/EBI on a Linux platform; worked extensively with the Erwin Data Modeler tool to design data models.
  • Developed Linux shell scripts using NZSQL/NZLOAD utilities to load data from flat files into a Netezza database.
  • Developed Spark/Scala, Python for regular expression (regex) project in the Hadoop/Hive environment with Linux/Windows for big data resources.

39. Machine Learning Techniques (low demand)
Here's how Machine Learning Techniques is used in Data Scientist jobs:
  • Provided statistical support including statistical modeling and machine learning techniques for internal research and provided recommendations based on the results.
  • Identified areas of improvement in existing business by unearthing insights by analyzing vast amount of data using machine learning techniques.
  • Machine learning techniques like generalized linear regression, logistic regression, clustering algorithms, and other supervised classification algorithms.
  • Used supervised machine learning techniques such as Logistics Regression and Decision Tree Classification
  • Implemented 5 machine learning techniques, evaluated their accuracy, and improved their predictive accuracy from 60% to 80%.
  • Applied different Machine Learning techniques for customer insight, target marketing, channel execution, risk management, and business strategy.
  • Used Machine Learning techniques to analyze complex interactions among players to help drive intelligent decision-making regarding player churn rate.
  • Applied data analysis and machine learning techniques through Excel, Python, and R programming.
  • Use of cutting edge data mining, machine learning techniques for building advanced customer solutions.
  • Developed models to identify high value users and their lookalikes using advanced machine learning techniques.
  • Utilized machine learning techniques for predictions & forecasting based on the training data.
  • Perform statistical analysis and apply machine learning techniques to large amount of data for identifying potential areas of enhancement in products.
  • Applied supervised and unsupervised machine learning techniques.
  • Prepared large volumes of user history data and performed ETL with Hadoop and applied above-mentioned machine learning techniques using Mahout.
  • Performed descriptive and predictive data analytics using machine learning techniques in R and/or Python.
  • Implemented regression and machine learning techniques to study and model ticket buyers' behavior.
  • (Research contract) Researched and implemented machine learning techniques to predict phrase usage in mathematical publications.
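
A minimal sketch of the supervised techniques named repeatedly above, logistic regression and a decision tree classifier, using a public scikit-learn dataset as a stand-in for the proprietary customer data these bullets describe:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Public toy dataset; real projects would substitute their own features/labels
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

logit = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

print("logistic accuracy:", round(logit.score(X_te, y_te), 3))
print("tree accuracy:    ", round(tree.score(X_te, y_te), 3))
```

Comparing a linear model against a tree on a held-out split, as here, is the usual first step before moving to the ensembles and cross-validation mentioned elsewhere in this list.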

40. Informatica (low demand)
Here's how Informatica is used in Data Scientist jobs:
  • Created high level ETL design document and assisted ETL developers in the detail design and development of ETL maps using Informatica.
  • Used ETL methodology for supporting data extraction, transformations and loading processing, in a complex MDM using Informatica.
  • Hands-on development and maintenance using Oracle SQL, SQL*Loader, PL/SQL, and Informatica PowerCenter 9.1.
  • Focused on integration overlap and Informatica's newer commitment to MDM following its acquisition of Identity Systems.
  • Design, coding, and unit testing of ETL packages for source marts and subject marts using Informatica ETL processes against an Oracle database.
  • Implemented complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.
  • Loaded Excel files into Vertica via ksh and Informatica; manipulated and analyzed data in the Vertica database.

41. A/B (low demand)
Here's how A/B is used in Data Scientist jobs:
  • Coordinated the execution of A/B tests to measure the effectiveness of personalized recommendation system.
  • Build infrastructures to simulate data, design and implement A/B tests, and report effects of treatments.
  • Designed and analyzed A/B tests on email coupon templates, improving click through rate by 30%.
  • Leverage modeling insights to design and conduct A/B test to evaluate performance impact on the design features.
  • Utilized A/B/N tests recorded in NoSQL search logs to evaluate new features in search engine results.
  • Conduct experiment design, A/B testing, time series analysis, survey design, etc.
  • Designed A/B testing frameworks to test efficacy of products and interventions designed to help students.
  • Designed A/B tests, created and interpreted post-campaign analytic reports for addressable campaigns for clients.
  • Perform A/B Testing using different variants on website and compare web traffic and conversions.
  • Designed a schema to analyze A/B tests, and created automated reporting and alerting.
  • Used A/B test and Hypothesis test to check the accuracy of the model.
  • Experienced with User Engagement Modeling, Data Pipelines, and A/B testing.
  • Executed A/B and multivariate tests to optimize web analyst performance.
  • Design and report on A/B split tests.
  • Analyzed historical performance and administered A/B tests to examine viewer interactions.
  • Provided in-depth analysis on new data sets, complex web analytics, and key support and feedback on A/B campaign testing.
  • Calculated RMSE, F-score, precision, and recall, and used A/B testing to evaluate recommender performance.
  • Participated in implementation and using of A/B testing functionality in the core part of Experian web projects.
  • Identified feature importance and segmentation (per program) for an A/B-tested email campaign in R.
  • Designed and analyzed A/B and multi-armed tests to optimize click-through and conversion rates.
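
One standard way to analyze the A/B tests described above is a two-proportion z-test on conversion counts; this stdlib-only sketch uses invented numbers for an email-coupon test:

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail via erf
    return z, p_value

# Hypothetical test: variant A converts 3.0%, variant B 3.9%, 10,000 sends each
z, p = two_proportion_z(300, 10_000, 390, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented counts the lift is significant at conventional thresholds; a real analysis would also fix the sample size in advance to avoid the peeking problems that plague iterative A/B testing.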

42. Sentiment Analysis (low demand)
Here's how Sentiment Analysis is used in Data Scientist jobs:
  • Developed sentiment analysis and text classification techniques for analyzing large volumes for financial news, with application to financial markets.
  • Applied statistical models to understand student behavior and performance, and applied sentiment analysis to understand challenges faced by students
  • Designed, programmed, and implemented into production a novel ensemble technique for sentiment analysis on unstructured social media.
  • Designed a natural language processing pipeline for sentiment analysis and topic extraction of social media postings.
  • Developed and implemented in R a text classification for sentiment analysis of twitter data.
  • Perform sentiment analysis and gathering insights from large volumes of unstructured data.
  • Designed and developed Natural Language Processing models for sentiment analysis.
  • Major engineer developing algorithm for text mining and sentiment analysis.
  • Designed and developed NLP models for sentiment analysis.
  • Created sentiment analysis dashboard for Human Resources.
  • Structured twitter post data by topic and affect using a sentiment analysis engine to test compensatory control theories in social psychology.
  • Ensured that the model had a low false-positive rate; performed text classification and sentiment analysis on unstructured and semi-structured data.
  • Assisted a senior colleague in developing a Sentiment Analysis application in Spark using its Python API.
  • Used NLP to construct sentiment analysis and social networks of characters in Harry Potter books.
  • Project for natural language processing (NLP) and sentiment analysis using Python.
  • Perform sentiment analysis and identify patterns in comments from lost account surveys.
  • Conduct user research through social media sentiment analysis and data mining.
  • Research on sentiment analysis of social media text.
  • Conducted Sentiment Analysis using Python-NLTK and R.
  • Perform sentiment analysis, clustering & associations and relate it to KPIs.
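
Lexicon-based scoring is the simplest approach behind tools like NLTK's VADER, mentioned above. This toy sketch uses tiny invented word lists and invented sample tweets purely to show the idea:

```python
# Minimal lexicon-based polarity scorer (illustrative word lists, not VADER's)
POSITIVE = {"great", "love", "excellent", "good", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "awful"}

def polarity(text):
    """Score in [-1, 1]: positive-word hits minus negative-word hits, per token."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return score / max(len(tokens), 1)

tweets = ["I love this product, it is great", "terrible support, bad experience"]
for t in tweets:
    print(f"{polarity(t):+.2f}  {t}")
```

Real systems add negation handling, intensifiers, and punctuation cues, which is exactly what distinguishes VADER from a bare lexicon lookup like this.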

43. Hbase (low demand)
Here's how Hbase is used in Data Scientist jobs:
  • Involved in extracting and cleansing the data and defining quality process for the builds to load data into HBASE.
  • Verified the data in HBASE environment.
  • Worked with SQL and NoSQL databases such as Hbase and visualized the insights gathered using BI tools such as Tableau.
  • Created HBase tables to store variable data formats coming from different portfolios.
  • Import the data from different sources like HDFS/Hbase into Spark RDD.
  • Hive, Pig, HBase and Spark.
  • Worked on loading the data from MySQL to HBase where necessary using Sqoop.
  • Used Sqoop commands to load Hive and HBase from Oracle, automated with Unix shell scripts and Oozie.

44. XML (low demand)
Here's how XML is used in Data Scientist jobs:
  • Explored and Extracted data from source XML in HDFS, used ETL for preparing data for exploratory analysis using data munging.
  • Involved in designing and implementing the data extraction (XML DATA stream) procedures.
  • Designed an automation process that parsing XML financial data to Database.
  • Modify the DB2 table and API to be able to save an incomplete bond as an XML back to AIX DB2.
  • Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
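
Parsing an XML feed into rows, as in the extraction bullets above, can be sketched with the standard library's xml.etree.ElementTree; the trade elements and fields here are invented, not from any real financial feed:

```python
import xml.etree.ElementTree as ET

# Hypothetical financial-data stream; element and attribute names are illustrative
doc = """<trades>
  <trade id="1"><symbol>ABC</symbol><price>101.5</price></trade>
  <trade id="2"><symbol>XYZ</symbol><price>99.2</price></trade>
</trades>"""

root = ET.fromstring(doc)
rows = [(t.get("id"), t.findtext("symbol"), float(t.findtext("price")))
        for t in root.iter("trade")]
print(rows)  # [('1', 'ABC', 101.5), ('2', 'XYZ', 99.2)]
```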

45. Nltk (low demand)
Here's how Nltk is used in Data Scientist jobs:
  • Created a service in Python for Entity extracting using NLTK.
  • Analyzed tweets using nltk and tf-idf to understand common topics in tweets.

46. Business Requirements (low demand)
Here's how Business Requirements is used in Data Scientist jobs:
  • Gathered and understood business requirements and transformed them into data warehousing solutions for ISO Insurance Customer data.
  • Evaluated business requirements and prepared detailed specifications that follow project guidelines required to develop written programs.
  • Gathered, analyzed and translated business requirements into relevant analytic approaches and shared for peer review.
  • Work closely with business team to understand business requirements and conduct data preparation regarding the requirements.
  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Experience working in Data Requirement analysis for transforming data according to business requirements.
  • Gather complex business requirements and translates into technical requirements.
  • Evaluated big data solutions relative to business requirements.
  • Developed reports per business requirements, including summary, tabular, and Excel reports.
  • Communicate with different Stakeholders, Business Groups, and field User Groups to elicit and to analyze business requirements.
  • Worked with project team to understand the problem and business requirements.
  • Analyzed business requirements, translated it into SAS and SQL.
  • Gathered, analyzed, and documented the business requirements.
  • Used Hadoop/Hive to code complex logic for various business requirements.

47. Json (low demand)
Here's how Json is used in Data Scientist jobs:
  • Worked on different data formats such as JSON, XML and performed machine learning algorithms in R and Python.
  • Parsed JSON formatted twitter data and uploaded to database.
  • Used MongoDB to accept JSON pings and output new Fraud predictions on a web-app interface.
  • Scraped and retrieved web data as JSON using Scrapy, presented with Pandas library.
  • Updated elasticsearch-tableau connector to handle nested JSON objects and published to GitHub.
  • Used JavaScript and JSON to update a portion of a webpage.
  • Placed data into JSON files using Python to test Django websites.
  • Used MongoDB as a back end to store the text data as JSON documents.
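
Parsing and round-tripping JSON, as in the bullets above, is a few lines with the standard library; the tweet fields here are invented:

```python
import json

# Hypothetical tweet payload; field names are illustrative only
raw = '{"user": "analyst42", "text": "loving the new dashboard", "retweets": 7}'
tweet = json.loads(raw)               # JSON string -> Python dict
print(tweet["user"], tweet["retweets"])

# Round-trip back to a JSON string for storage or an API response
serialized = json.dumps(tweet, sort_keys=True)
print(serialized)
```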

48. PCA (low demand)
Here's how PCA is used in Data Scientist jobs:
  • Applied Principal Component Analysis (PCA) for data reduction and to create Key Performance Indicators (KPI).
  • Joined PCA scores with factory equipment data to identify process tools responsible for yield issues.
  • Generalized by feature engineering, grid-searching, and PCA.
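
The dimensionality reduction these bullets describe can be sketched with scikit-learn's PCA on correlated synthetic data; the "measurements" below are invented stand-ins for the equipment and yield data above:

```python
import numpy as np
from sklearn.decomposition import PCA

# Five highly correlated columns: one latent factor plus small noise
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base + rng.normal(scale=0.1, size=(100, 1)) for _ in range(5)])

pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)   # projections onto the top two components
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
```

Because all five columns track one latent factor, the first component captures nearly all the variance; that concentration is what makes PCA scores useful as compact KPIs.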

49. Spss (low demand)
Here's how Spss is used in Data Scientist jobs:
  • Developed HVAC classification algorithms using SPSS for decision support.
  • Create custom visualization and data exploration solutions in R and SPSS to query databases and display data to subject matter experts.
  • Trained in data science basics and applied those software applications to collecting and managing patient data in Excel/SPSS.
  • Perform research and analysis using SAS, SPSS and Excel and develop client presentations for data analysis projects.
  • Introduced, designed and integrated R modeling into existing SPSS data mining process.
  • Augmented process for R model deployment & scoring on existing SPSS scoring engine.
  • Develop statistical models using software such as R, SPSS, and STATA.
  • Used SPSS and SQL Server to process data and handle exceptions.
  • Predicted annual expense to within 0.3% using SPSS.
  • Refined time-series data and validated mathematical models using analytical tools like R and SPSS to reduce forecasting errors.
  • Data entry was performed in MS Excel and recoded in IBM SPSS for further analysis.
  • Generated ad-hoc or management specific reports using SSRS, SPSS, and Excel.
  • Utilized SPSS and Minitab software to randomize, analyze and interpretation of data.
  • Operated models and correlated model results, utilizing Python, R, RStudio, QlikView, SPSS, and PostgreSQL.

50. Mahout (low demand)
Here's how Mahout is used in Data Scientist jobs:
  • Used Mahout and collaborative filtering to build predictive models, which were used to optimize ad campaign performance.
  • Implemented end-to-end systems for data analytics and data automation, integrated with custom visualization tools, using R, Mahout, Hadoop, and MongoDB.
  • Designed and conducted two successful trainings on Mahout (platform for machine learning on big data).
  • Machine learning: user churn modeling (Mahout proof of concept).

20 Most Common Skills For A Data Scientist

R 11.4%
Data Analysis 9.9%
Pl/Sql 9.5%
Python 9.5%
Analytics 9.3%
Algorithms 7.1%
Hadoop 5.0%
Big Data 3.9%
Web Application 3.7%
SAS 3.4%
Data Visualization 3.4%
Data Science 3.3%
Predictive Models 3.2%
Logistic Regression 3.1%
ETL 2.9%
SQL 2.4%
Support Vector Machines 2.3%
Hdfs 2.2%
Natural Language Processing 2.2%
Neural Networks 2.1%

Typical Skill-Sets Required For A Data Scientist

Rank Skill
1 R 8.1%
2 Data Analysis 7.1%
3 Pl/Sql 6.8%
4 Python 6.8%
5 Analytics 6.7%
6 Algorithms 5.1%
7 Hadoop 3.6%
8 Big Data 2.8%
9 Web Application 2.7%
10 SAS 2.4%
11 Data Visualization 2.4%
12 Data Science 2.4%
13 Predictive Models 2.3%
14 Logistic Regression 2.2%
15 ETL 2.1%
16 SQL 1.7%
17 Support Vector Machines 1.6%
18 Hdfs 1.6%
19 Natural Language Processing 1.6%
20 Neural Networks 1.5%
21 Pandas 1.4%
22 K-Means 1.4%
23 AWS 1.3%
24 Data Quality 1.3%
25 Scikit-Learn 1.2%
26 Numpy 1.2%
27 Teradata 1.1%
28 Mapreduce 1.1%
29 Mongodb 1.0%
30 Large Data 1.0%
31 Decision Trees 1.0%
32 API 1.0%
33 Linear Regression 1.0%
34 Data Collection 1.0%
35 BI 1.0%
36 Data Warehouse 0.9%
37 Matlab 0.8%
38 Linux 0.8%
39 Machine Learning Techniques 0.8%
40 Informatica 0.8%
41 A/B 0.8%
42 Sentiment Analysis 0.8%
43 Hbase 0.8%
44 XML 0.8%
45 Nltk 0.7%
46 Business Requirements 0.7%
47 Json 0.7%
48 PCA 0.7%
49 Spss 0.7%
50 Mahout 0.6%
