Workflows

Workflow SVD user-based collaborative filtering rec... (1)

This workflow takes a user-item matrix A as input. It then computes the reduced (rank-k) SVD decomposition A_k by keeping only the k largest singular values and the corresponding singular vectors. The workflow calculates recommendations and predictions for a particular user %{id} from matrix A: row %{id} is taken from the original matrix A and replaced with the %{id} row of the A_k matrix, and predictions for user %{id} are made based on the other users in A_k. Note: This workflow uses the R-script operator with R library ...

Created: 2011-05-09 | Last updated: 2011-05-09

Credits: Ninoaf, Matko Bošnjak

Attributions: User-based collaborative filtering recommender system template (workflow); Datasets for the pack: RCOMM2011 recommender systems workflow templates (blob)
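
The rank-k reconstruction described above can be sketched in a few lines of NumPy; the rating matrix, the value of k, and the user index below are illustrative placeholders, not the workflow's actual R code.

    import numpy as np

    # Illustrative user-item rating matrix A (users x items); 0 means "unrated".
    A = np.array([
        [5, 3, 0, 1],
        [4, 0, 0, 1],
        [1, 1, 0, 5],
        [1, 0, 0, 4],
        [0, 1, 5, 4],
    ], dtype=float)

    k = 2        # number of singular values/vectors kept
    user = 0     # the %{id} user predictions are made for

    # Reduced SVD: keep only the k largest singular values and vectors.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The %{id} row of A_k is the smoothed version of the user's ratings;
    # its values for unrated items serve as predictions/recommendations.
    predictions = A_k[user]
    unrated = np.where(A[user] == 0)[0]
    ranked = unrated[np.argsort(-predictions[unrated])]
    print(predictions, ranked)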

Workflow Example Process using the RapidMiner Linke... (1)

This example process uses the SPARQL Data Importer operator provided by the RapidMiner LOD Extension to retrieve additional data about books, including the author, ISBN, country, abstract, number of pages, and language, from DBpedia (www.dbpedia.org).

Created: 2014-12-04
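
Outside RapidMiner, the same import step can be approximated with the SPARQLWrapper package; the query and the dbo: properties below are illustrative, not necessarily the exact ones used in the process.

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery("""
        SELECT ?book ?author ?isbn ?pages ?language WHERE {
            ?book a dbo:Book ;
                  dbo:author ?author .
            OPTIONAL { ?book dbo:isbn ?isbn }
            OPTIONAL { ?book dbo:numberOfPages ?pages }
            OPTIONAL { ?book dbo:language ?language }
        } LIMIT 20
    """)
    sparql.setReturnFormat(JSON)

    # Each binding row carries the extra book attributes pulled from DBpedia.
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["book"]["value"], row.get("isbn", {}).get("value", ""))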

Workflow Using Graph Kernels for Feature Generation... (1)

This example shows how to use graph kernels for feature generation. In this example we use the Root RDF Walk Count Kernel and the Fast RDF WL Sub Tree Kernel. The input data for the process can be found here. More information about the process can be found here.

Created: 2015-05-04 | Last updated: 2015-05-04
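
As a rough, purely illustrative analogue of walk-count feature generation (not the extension's Root RDF Walk Count or Fast RDF WL Sub Tree kernels), one can count outgoing walks from an entity in an RDF graph with rdflib; the file name and root URI are assumptions.

    from collections import Counter
    import rdflib

    g = rdflib.Graph()
    g.parse("books.ttl", format="turtle")      # illustrative input file

    def walk_counts(graph, root, depth=2):
        """Count outgoing walks of length <= depth starting at root."""
        counts = Counter()
        frontier = [(root, ())]
        for _ in range(depth):
            nxt = []
            for node, walk in frontier:
                for _, p, o in graph.triples((node, None, None)):
                    new_walk = walk + (str(p), str(o))
                    counts[new_walk] += 1
                    nxt.append((o, new_walk))
            frontier = nxt
        return counts

    root = rdflib.URIRef("http://example.org/book/1")   # illustrative entity
    features = walk_counts(g, root)                     # sparse feature vector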

Workflow Finding wrong links in Linked Open Data wi... (1)

This simple workflow shows how to detect wrong links in Linked Open Data. It uses operators from the Linked Open Data and the Anomaly Detection extensions. The process first reads a list of links from the EventMedia endpoint, linking to DBpedia, then creates feature vectors for each of those links. Finally, an outlier detection operator is employed to find suspicious links.

Created: 2014-03-14
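
The "feature vectors plus outlier detection" idea can be sketched with scikit-learn; the feature matrix here is a random placeholder for the vectors built from the EventMedia/DBpedia link pairs, and LocalOutlierFactor stands in for the Anomaly Detection operator.

    import numpy as np
    from sklearn.neighbors import LocalOutlierFactor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))      # placeholder feature vectors, one per link
    X[:5] += 6                          # a few deliberately odd links

    lof = LocalOutlierFactor(n_neighbors=20)
    labels = lof.fit_predict(X)         # -1 marks outlying, i.e. suspicious, links
    print("suspicious link indices:", np.where(labels == -1)[0])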

Workflow Semantic meta-mining workflow that perform... (1)

Performs cross-validation on a data set composed of meta-data of baseline RapidMiner workflows, expressed in RDF with the DMOP ontology's terminology for representing processes. Includes discovery of a set of semantic features ('workflow patterns') by the Fr-ONT-Qu algorithm. Through a propositionalisation approach, those features may be used in an arbitrary (propositional) RapidMiner classification operator.

Created: 2012-03-05 | Last updated: 2012-03-05
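
The propositionalisation step boils down to turning each discovered pattern into a Boolean feature and handing the resulting table to an ordinary classifier under cross-validation; the sketch below fakes the pattern-matching with random data and is not the Fr-ONT-Qu algorithm itself.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n_workflows, n_patterns = 120, 30
    X = rng.integers(0, 2, size=(n_workflows, n_patterns))  # pattern indicator features
    y = rng.integers(0, 2, size=n_workflows)                # illustrative workflow labels

    scores = cross_val_score(DecisionTreeClassifier(), X, y, cv=10)
    print("10-fold accuracy:", scores.mean())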

Workflow 1. Getting Started: Learn and Store a Model (1)

This getting started process shows the first step of learning and storing a model. After a model is learned, you can load (Retrieve operator) the model and apply it to a test data set (see 2. Getting Started: Retrieve and Apply Model). The process is NOT concerned with evaluation of the model. This process will not immediately run in RapidMiner because you have to adjust the repository path in the Retrieve operator. Tags: Rapidminer, model, learn, learning, training, train, store, first step

Created: 2011-01-17 | Last updated: 2011-01-17
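
The same learn-and-store / retrieve-and-apply pattern, sketched outside RapidMiner with scikit-learn and joblib; the data set, learner, and file path are illustrative.

    from joblib import dump, load
    from sklearn.datasets import load_iris
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)

    # Step 1: learn and store the model (this process).
    model = GaussianNB().fit(X, y)
    dump(model, "model.joblib")

    # Step 2: retrieve the model and apply it (the follow-up process).
    restored = load("model.joblib")
    print(restored.predict(X[:5]))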

Workflow Hybrid Recommender System Using Linked Op... (1)

This process uses the RapidMiner Linked Open Data extension and the Recommender extension to build a hybrid Linked Open Data enabled recommender system for books. The input data for the process can be found here. More information about the process can be found here.

Created: 2014-05-15 | Last updated: 2014-05-15
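
One common way to read "hybrid" here is blending a collaborative score with a content score computed from LOD-derived item features; the sketch below uses random placeholder matrices and a simple weighted blend, which may differ from how the extension actually combines its recommenders.

    import numpy as np

    rng = np.random.default_rng(0)
    ratings = rng.integers(0, 6, size=(50, 40)).astype(float)        # users x books
    item_features = rng.integers(0, 2, size=(40, 15)).astype(float)  # LOD book features

    def hybrid_scores(user, alpha=0.5):
        # Collaborative part: similarity-weighted ratings of the other users.
        sims = ratings @ ratings[user] / (
            np.linalg.norm(ratings, axis=1) * np.linalg.norm(ratings[user]) + 1e-9)
        collab = sims @ ratings / (np.abs(sims).sum() + 1e-9)
        # Content part: match items against the profile of the user's liked books.
        profile = item_features.T @ (ratings[user] >= 4)
        content = item_features @ profile
        # Blend the two normalised scores.
        return (alpha * collab / (collab.max() + 1e-9)
                + (1 - alpha) * content / (content.max() + 1e-9))

    print(np.argsort(-hybrid_scores(0))[:5])   # top-5 book indices for user 0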

Workflow Mining Semantic Web data using FastMap - E... (1)

This workflow describes how to learn from Semantic Web data. The input to the workflow is a feature vector built from an RDF resource. The loaded example set is then divided into training and test parts. These sub-example sets are used by the FastMap operators (which encapsulate the FastMap data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve classifica...

Created: 2011-06-25 | Last updated: 2011-06-25
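
FastMap itself is compact enough to sketch: it projects objects into a low-dimensional Euclidean space one axis at a time, using only pairwise distances. The data and distance function below are illustrative; the RapidMiner operators encapsulate a comparable transformation.

    import numpy as np

    def fastmap(D, k):
        """Embed objects with distance matrix D into k dimensions."""
        D2 = D.astype(float) ** 2
        X = np.zeros((len(D), k))
        for dim in range(k):
            # Pivot heuristic: the object farthest from a seed, then its farthest.
            a = int(np.argmax(D2[0]))
            b = int(np.argmax(D2[a]))
            if D2[a, b] == 0:
                break
            # Project every object onto the line through the two pivots.
            X[:, dim] = (D2[a] + D2[a, b] - D2[b]) / (2 * np.sqrt(D2[a, b]))
            # Residual squared distances orthogonal to that line.
            diff = X[:, dim][:, None] - X[:, dim][None, :]
            D2 = np.maximum(D2 - diff ** 2, 0)
        return X

    rng = np.random.default_rng(0)
    points = rng.normal(size=(30, 8))                      # stand-in example set
    D = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    embedded = fastmap(D, k=2)                             # 2-D FastMap coordinates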

Workflow Mining Semantic Web data using FastMap - R... (1)

This workflow explains how an example set can be extracted from an RDF resource using the provided SPARQL query. This example set is then divided into training and test parts. These sub-example sets are used by the FastMap operators (which encapsulate the FastMap data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve classification performance. The transfo...

Created: 2011-06-25 | Last updated: 2011-06-25
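
The front half of this workflow (SPARQL query to example set, then a train/test split) might look as follows in Python, using rdflib on a local RDF file and pandas; the file name, namespace, query, and label attribute are all assumptions for illustration.

    import pandas as pd
    import rdflib
    from sklearn.model_selection import train_test_split

    g = rdflib.Graph()
    g.parse("examples.rdf")                     # illustrative RDF resource

    rows = g.query("""
        SELECT ?item ?feature1 ?feature2 ?label WHERE {
            ?item ex:feature1 ?feature1 ;
                  ex:feature2 ?feature2 ;
                  ex:label    ?label .
        }
    """, initNs={"ex": rdflib.Namespace("http://example.org/")})

    # Turn the SPARQL result into a flat example set and split it.
    examples = pd.DataFrame(
        [(str(i), float(f1), float(f2), str(l)) for i, f1, f2, l in rows],
        columns=["item", "feature1", "feature2", "label"],
    )
    train, test = train_test_split(examples, test_size=0.3, random_state=0)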

Workflow Mining Semantic Web data using Corresponde... (1)

This workflow explains how an example set can be extracted from an RDF resource using the provided SPARQL query. This example set is then divided into training and test parts. These sub-example sets are used by the Correspondence Analysis operators (which encapsulate the Correspondence Analysis data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve clas...

Created: 2011-06-25 | Last updated: 2011-06-25
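
Correspondence Analysis reduces to an SVD of the standardised residuals of a contingency table, which a short NumPy sketch can show; the cross-tabulation below is made up, and the operators in the workflow apply a comparable transformation per feature.

    import numpy as np

    N = np.array([[20,  5, 10],
                  [ 3, 25,  7],
                  [ 8,  4, 30]], dtype=float)   # illustrative cross-tabulation

    P = N / N.sum()                             # correspondence matrix
    r = P.sum(axis=1)                           # row masses
    c = P.sum(axis=0)                           # column masses

    # Standardised residuals and their SVD.
    S = np.diag(1 / np.sqrt(r)) @ (P - np.outer(r, c)) @ np.diag(1 / np.sqrt(c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)

    # Principal row coordinates in the new, lower-dimensional space.
    row_coords = np.diag(1 / np.sqrt(r)) @ U * s
    print(row_coords[:, :2])                    # first two CA dimensions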
