Workflows

Showing 285 results.
Type: RapidMiner

Workflow RCOMM 2012 Sudoku Challenge: 2 - Drop impo... (1)

Process from the http://rcomm2012.org live data mining challenge. The task was to partially solve a Sudoku puzzle by solving three subtasks. Processes 1-3 in this pack are the actual solutions to the tasks, process 0 loads the required data into your repository and process 4 is the bonus that solves the entire puzzle. IMPORTANT: Save this process as "RCOMM 2012 Sudoku Challenge: 2 - Drop impossible values" since it is included from process 3. This process eliminates impossible rows from the...

Created: 2012-09-04

Workflow RCOMM 2012 Sudoku Challenge: 1 - Generate ... (1)

Process from the http://rcomm2012.org live data mining challenge. The task was to partially solve a Sudoku puzzle by solving three subtasks. Processes 1-3 in this pack are the actual solutions to the tasks, process 0 loads the required data into your repository and process 4 is the bonus that solves the entire puzzle. This process loads the numbers 1..9 and uses two Cartesian Product operators to compute the set of all combinations x, y, v. Finally we compute the index of the 3x3 sub-field a...

Created: 2012-09-04
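The generation step described above, taking the Cartesian product of the digits 1..9 with itself to enumerate all (x, y, v) combinations and then deriving each cell's 3x3 sub-field index, can be sketched in Python. The row-major 0..8 sub-field numbering is an assumption, not taken from the workflow:

```python
from itertools import product

# All (x, y, v) combinations of the digits 1..9, mirroring the two
# Cartesian Product operators in the workflow.
digits = range(1, 10)
combinations = [{"x": x, "y": y, "v": v} for x, y, v in product(digits, digits, digits)]

# Index of the 3x3 sub-field that cell (x, y) belongs to. The 0..8
# row-major numbering here is an assumption; the workflow may number differently.
def subfield(x, y):
    return ((y - 1) // 3) * 3 + (x - 1) // 3

for c in combinations:
    c["subfield"] = subfield(c["x"], c["y"])

print(len(combinations))  # 729 candidate (cell, value) triples
```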

Workflow RCOMM 2012 Sudoku Challenge: 0 - Get Data (1)

Process from the http://rcomm2012.org live data mining challenge. The task was to partially solve a Sudoku puzzle by solving three subtasks. Processes 1-3 in this pack are the solutions to the actual tasks, process 0 loads the required data into your repository and process 4 is the bonus that solves the entire puzzle. This process loads the two required data sets from rapid-i.com and stores them in your repository. The first data set contains the numbers 1 to 9 and the second contains the pr...

Created: 2012-09-04


Workflow Online update recommendation web service V2 (1)

After a certain number of recommendations have been served to specific users, the system must update the recommendations in the item recommendation table. This is accomplished by calling the online update recommendation web service, which updates the recommendation model in the RapidAnalytics repository and refreshes the recommendations for those users in the item recommendation table.

Created: 2012-06-20 | Last updated: 2012-06-20


Workflow Offline update recommendation web service V2 (1)

Periodically, a full re-training on the whole training set is performed via the offline update recommendation web service.

Created: 2012-06-20 | Last updated: 2012-06-20


Workflow TextualAttributeExtraction for Recommender... (1)

This workflow takes content data from the VideoLectures.Net Recommender System Challenge and extracts word vectors for each lecture. Latent semantic analysis with Singular Value Decomposition is performed on the binary item-word matrix. The last step is the binomialization of the dataset.

Created: 2012-06-03
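The LSA step, a truncated SVD of the binary item-word matrix, can be sketched with NumPy. The toy matrix and the choice of k = 2 latent dimensions are illustrative assumptions:

```python
import numpy as np

# Toy binary item-word matrix: rows are lectures, columns are words
# (1 if the word occurs in the lecture's text). Values are illustrative.
M = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Latent semantic analysis: a truncated SVD keeps the k strongest latent
# dimensions and represents each item in that reduced space.
k = 2
U, s, Vt = np.linalg.svd(M, full_matrices=False)
item_vectors = U[:, :k] * s[:k]  # one k-dimensional vector per lecture

print(item_vectors.shape)  # (4, 2)
```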

Workflow Evaluating semantic kernel with k-NN class... (1)

This workflow uses a k-NN classifier to evaluate the quality of the EL++ Convolution Kernel [1]. As a dataset, one of the examples from the DL-Learner project [2] is used. After preparing the knowledge base with "Build Knowledge Base", the item-to-item distance matrix is computed with "Calculate Gram/Distance Matrix". This matrix is then used as input to a 10-fold cross-validation with k-NN as the classifier, and the average result is delivered. [1] L. Józefowski, A. Lawrynowicz, J...

Created: 2012-05-30 | Last updated: 2012-06-07
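The evaluation idea, k-NN classification driven purely by a precomputed item-to-item distance matrix and scored by cross-validation, can be sketched in Python. The toy Euclidean distances, the fold scheme, and k = 3 are assumptions standing in for the kernel-induced matrix and the workflow's cross-validation operator:

```python
import numpy as np

# Toy setup: two well-separated groups of points give a precomputed
# item-to-item distance matrix D and labels. In the workflow, D comes
# from "Calculate Gram/Distance Matrix" over the knowledge base.
rng = np.random.default_rng(0)
points = rng.normal(size=(20, 2))
points[10:] += 6.0
labels = np.array([0] * 10 + [1] * 10)
D = np.linalg.norm(points[:, None] - points[None, :], axis=-1)

def knn_cv_accuracy(D, labels, k=3, folds=10):
    """k-NN classification from a precomputed distance matrix,
    scored with simple (unshuffled) cross-validation."""
    n = len(labels)
    idx = np.arange(n)
    correct = 0
    for f in range(folds):
        test = idx[f::folds]                 # every folds-th item held out
        train = np.setdiff1d(idx, test)
        for t in test:
            nearest = train[np.argsort(D[t, train])[:k]]
            correct += np.bincount(labels[nearest]).argmax() == labels[t]
    return correct / n

print(knn_cv_accuracy(D, labels))
```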

Workflow Semantic clustering with k-Medoids and ALC... (1)

This workflow loads data from a configuration file for DL-Learner (http://dl-learner.org) and uses the ALCN Semantic Kernel [1] to cluster the data with the k-Medoids algorithm. [1] N. Fanizzi, C. d’Amato, F. Esposito. Learning with Kernels in Description Logics. ILP 2008

Created: 2012-05-30 | Last updated: 2012-06-07
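The clustering step can be sketched as a minimal PAM-style k-Medoids over a precomputed distance matrix; the toy Euclidean distances and the fixed initial medoids are assumptions standing in for the ALCN-kernel-induced distances in the workflow:

```python
import numpy as np

def k_medoids(D, medoids, iters=10):
    """PAM-style k-Medoids over a precomputed distance matrix D,
    starting from the given initial medoid indices."""
    medoids = np.asarray(medoids)
    for _ in range(iters):
        assign = np.argmin(D[:, medoids], axis=1)   # nearest-medoid assignment
        new = medoids.copy()
        for c in range(len(medoids)):
            members = np.flatnonzero(assign == c)
            # swap in the member with the smallest total within-cluster distance
            new[c] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, assign

# Toy Euclidean distances stand in for the kernel-induced distances.
pts = np.array([[0, 0], [0.2, 0], [0.1, 0.3], [5, 5], [5.2, 5], [5.1, 5.3]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
medoids, assign = k_medoids(D, medoids=[0, 5])
print(assign)  # [0 0 0 1 1 1]
```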

Workflow Clustering data from DBpedia using AHC (1)

This workflow uses the Agglomerative Hierarchical Clustering algorithm to build a hierarchy of clusters on data downloaded from DBpedia, the semantic version of Wikipedia. RDF data are downloaded from a SPARQL endpoint and merged with the DBpedia ontology by "Build Knowledge Base". The set of items to cluster is selected with "SPARQL Selector", and the items are then clustered by "Agglomerative Hierarchical Clustering" with the distance measure induced by the Bloehdorn Kernel [1]. ...

Created: 2012-05-30 | Last updated: 2012-06-07
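The clustering idea can be sketched as agglomerative clustering over a precomputed distance matrix; the toy points and the single-linkage merge rule are assumptions, since the listing does not specify the workflow's linkage mode or the Bloehdorn Kernel distances:

```python
import numpy as np

# Minimal single-linkage agglomerative clustering over a precomputed
# distance matrix: repeatedly merge the two closest clusters until the
# desired number of clusters remains.
def agglomerate(D, n_clusters):
    clusters = [[i] for i in range(len(D))]
    while len(clusters) > n_clusters:
        best = (0, 1, float("inf"))
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(D[i, j] for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, _ = best
        clusters[a] += clusters.pop(b)   # merge the closest pair
    return clusters

pts = np.array([[0.0, 0], [0.1, 0], [5, 5], [5.1, 5]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
print(agglomerate(D, 2))  # [[0, 1], [2, 3]]
```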

Workflow Hybrid recommendation system (1)

A hybrid recommendation system combining a linear regression recommender, created with RapidMiner core operators, with multiple collaborative-filtering and attribute-based operators from the Recommender extension.

Created: 2012-05-17

Credits: Matej Mihelčić, Matko Bošnjak
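The combination idea, blending the predictions of two recommenders into one score, can be sketched as a simple weighted hybrid; the score functions and the 0.5/0.5 weighting below are illustrative assumptions, not the workflow's actual operators:

```python
# A weighted hybrid: blend the scores of two recommenders.
def regression_score(user, item):
    # stand-in for the linear-regression recommender's predicted rating
    return 3.0 + 0.1 * user - 0.05 * item

def cf_score(user, item):
    # stand-in for the collaborative-filtering recommender's prediction
    return 2.5 + 0.2 * item

def hybrid_score(user, item, w=0.5):
    return w * regression_score(user, item) + (1 - w) * cf_score(user, item)

print(round(hybrid_score(1, 2), 2))  # 2.95
```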
