Workflows

Showing 294 results.

Workflow RapidMiner process created by the MLWizard... (1)

This process was created by the MLWizard Extension. Start RapidMiner Studio, download the extension from the RapidMiner Marketplace, open the Tools menu and click "Automatic System Construction". A wizard opens and suggests the models that best fit your data. You can choose the one with the highest accuracy, and the wizard creates the RapidMiner process for you.

Created: 2014-12-04


Workflow RapidMiner: Use a Macro at Detect Outliers... (1)

A reply to @paavopdf's question on Twitter: "Anyone having an idea how I can use a macro at "Detect Outliers" for the no. outliers in @RapidMiner? #rapidminer #DataScience"

Created: 2016-10-03

Workflow WebCrawl-RapidMiner (1)

WebCrawl-RapidMiner

Created: 2013-08-06

Workflow Testing Rapidminer interface (1)

This process starts with loading the data. After finishing the input operator, a typical learning step is performed. Here, an implementation of a decision tree learner is used which can also handle numerical values (similar to the well-known C4.5 algorithm). Each operator may demand some input and delivers some output. These in- and outp...

Created: 2011-02-08

Workflow Image Mining with RapidMiner (1)

This is an image mining process using the image mining Web service provided by NHRF within e-Lico. It first uploads a set of images found in a directory, then preprocesses the images and visualizes the result. Furthermore, references to the uploaded images are stored in the local RapidMiner repository so they can later be used for further processing without uploading images a second time.

Created: 2010-04-28 | Last updated: 2012-01-16

Workflow Execution time measurement for Rapidminer ... (1)

This is a workflow that measures execution times of various RapidMiner operators on random polynomial classification datasets of various sizes in the range of 4 to 2000 examples and attributes.

Created: 2012-05-11 | Last updated: 2012-05-11

Workflow Example Process RapidMiner Multimedia Mini... (1)

This example process can be used with the Multimedia Extension (http://www.burgsys.com/) in RapidMiner Studio. It shows how to create a QR code with the extension, and how to make image transformations.

Created: 2014-12-04 | Last updated: 2014-12-04


Workflow MythMiner - Recommendation system for Myth... (1)

This process creates recommendations from the MythTV PVR database. Programs which have been recorded are assumed to be interesting; those not recorded, not interesting. The model is then applied to future program data. Interesting programs are saved in an HTML file which can be mailed to the user. This process has a few dependencies; please see the homepage: http://tud.at/programm/mythminer/

Created: 2011-01-30

Workflow A simple process that demonstrates how to ... (1)

This simple RapidMiner process demonstrates how to use the Open File operator introduced in RapidMiner 5.2. In this example we use the operator to consume a data feed from the web.

Created: 2012-09-27

Workflow Example process of the Text and Web Mining... (1)

This example process crawls the web (RapidMiner forum) for entries, extracts the information with the Process Documents operator and applies Clustering on the results. The process shows the interaction between the Web Mining Extension and the Text Mining Extension from RapidMiner.

Created: 2014-12-04


Workflow SVD user-based collaborative filtering rec... (1)

This workflow takes a user-item matrix A as input. It then calculates the reduced SVD decomposition A_k by taking only the k greatest singular values and the corresponding singular vectors. The workflow calculates recommendations and predictions for a particular user %{id} from matrix A: the row %{id} is taken from the original matrix A and replaced with the %{id} row of the A_k matrix. Predictions for user %{id} are based on the other users in A_k. Note: this workflow uses the R-script operator with R library ...

Created: 2011-05-09 | Last updated: 2011-05-09

Credits: Ninoaf, Matko Bošnjak

Attributions: User-based collaborative filtering recommender system template; Datasets for the pack: RCOMM2011 recommender systems workflow templates
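
The reduced-SVD prediction idea described above can be sketched with NumPy. The toy matrix, the value of k, and the user index are illustrative stand-ins, not the workflow's actual R-script code:

```python
import numpy as np

# Hypothetical toy user-item rating matrix A (rows = users, columns = items);
# zeros stand for unrated items.
A = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 0.0],
    [1.0, 0.0, 5.0, 4.0],
    [0.0, 1.0, 4.0, 5.0],
])

k = 2  # keep only the k greatest singular values
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # reduced decomposition A_k

user = 0  # plays the role of the %{id} macro
predictions = A_k[user]                       # predicted scores for every item
unrated = np.where(A[user] == 0)[0]
recommended = unrated[np.argsort(-predictions[unrated])]
```

The rank-k reconstruction fills in plausible scores for unrated items, which is exactly what the workflow's replaced %{id} row provides.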

Workflow Example Process using the RapidMiner Linke... (1)

This example process uses the SPARQL Data Importer operator provided by the RapidMiner LOD Extension to retrieve additional data about books, including the author, ISBN, country, abstract, number of pages and language, from DBpedia (www.dbpedia.org).

Created: 2014-12-04

Workflow Using Graph Kernels for Feature Generation... (1)

This example shows how to use graph kernels for feature generation. In this example we use the Root RDF Walk Count Kernel and the Fast RDF WL Sub Tree Kernel. The input data for the process can be found here. More information about the process can be found here.

Created: 2015-05-04 | Last updated: 2015-05-04

Workflow Finding wrong links in Linked Open Data wi... (1)

This simple workflow shows how to detect wrong links in Linked Open Data. It uses operators from the Linked Open Data and the Anomaly Detection extensions. The process first reads a list of links from the EventMedia endpoint, linking to DBpedia, then creates feature vectors for each of those links. Finally, an outlier detection operator is employed to find suspicious links.

Created: 2014-03-14


Workflow Semantic meta-mining workflow that perform... (1)

Performs a cross-validation on a data set composed of meta-data of baseline RapidMiner workflows, expressed in RDF with the DMOP ontology's terminology for representing processes. Includes discovery of a set of semantic features (patterns) by the Fr-ONT-Qu algorithm (‘workflow patterns’). Through a propositionalisation approach, these features can be used in an arbitrary (propositional) RapidMiner classification operator.

Created: 2012-03-05 | Last updated: 2012-03-05

Workflow 1. Getting Started: Learn and Store a Model (1)

This getting started process shows the first step of learning and storing a model. After a model is learned, you can load (Retrieve operator) the model and apply it to a test data set (see 2. Getting Started: Retrieve and Apply Model). The process is NOT concerned with evaluation of the model. This process will not immediately run in RapidMiner because you have to adjust the repository path in the Retrieve operator. Tags: Rapidminer, model, learn, learning, training, train, store, first step

Created: 2011-01-17 | Last updated: 2011-01-17

Workflow Hybrid Recommender System Using Linked Op... (1)

This process uses the RapidMiner Linked Open Data extension and the Recommender extension to build a hybrid Linked Open Data enabled recommender system for books. The input data for the process can be found here. More information about the process can be found here.

Created: 2014-05-15 | Last updated: 2014-05-15

Workflow Mining Semantic Web data using FastMap - E... (1)

This workflow describes how to learn from the Semantic Web's data. The input to the workflow is a feature vector developed from an RDF resource. The loaded example set is then divided into training and test parts. These sub-example sets are used by the FastMap operators (which encapsulate the FastMap data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve classifica...

Created: 2011-06-25 | Last updated: 2011-06-25

Workflow Mining Semantic Web data using FastMap - R... (1)

This workflow explains how an example set can be extracted from an RDF resource using the provided SPARQL query. This example set is then divided into training and test parts. These sub-example sets are used by the FastMap operators (which encapsulate the FastMap data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve classification performance. The transfo...

Created: 2011-06-25 | Last updated: 2011-06-25

Workflow Mining Semantic Web data using Corresponde... (1)

This workflow explains how an example set can be extracted from an RDF resource using the provided SPARQL query. This example set is then divided into training and test parts. These sub-example sets are used by the Correspondence Analysis operators (which encapsulate the Correspondence Analysis data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve clas...

Created: 2011-06-25 | Last updated: 2011-06-25

Workflow Transaction Analysis Demo from RM 5 Intro Day (1)

This is the demo process presented at the RapidMiner 5 Intro Day. It combines customer segmentation with direct mailing. It loads some transaction data, then aggregates and pivots the data so it can be used by a clustering operator to perform a customer segmentation. Then, additional data is joined with the clustered data: first, response/no-response data is joined, and then some additional information about the users is added. Finally, customers are classified into response/no-response classes. The dat...

Created: 2010-04-30 | Last updated: 2010-05-05

Workflow Mining Semantic Web data using Corresponde... (1)

This workflow describes how to learn from the Semantic Web's data using the data transformation algorithm 'Correspondence Analysis'. The input to the workflow is a feature vector developed from an RDF resource. The loaded example set is divided into training and test parts. These sub-example sets are used by the Correspondence Analysis operators (which encapsulate the Correspondence Analysis data transformation technique), which process one feature at a time and transform the data into a different...

Created: 2011-06-25 | Last updated: 2011-06-25

Workflow KNN-FeatureSelection-INCAE (1)

This process adapts one of the templates available in RapidMiner 5 to include some preprocessing.

Created: 2011-10-28

Workflow Hybrid recommendation system (1)

This is a hybrid recommendation system combining a linear regression recommender, created using RapidMiner core operators, with the Recommender extension's collaborative filtering and attribute-based operators.

Created: 2012-05-17

Credits: Matej Mihelčić, Matko Bošnjak

Workflow Example Process using the WHIBO Extension (1)

This example simply demonstrates the Generic decision tree operator of the WHIBO extension for RapidMiner Studio, which allows the creation and use of an individual decision tree algorithm.

Created: 2014-12-04

Workflow Plot round results of a backward elimination (1)

This process performs a backward elimination and logs the performance and deviation results of each round. This way, you can use RapidMiner's visualizations to assess the performance gain.

Created: 2010-05-26


Workflow Meta-mining workflow that performs crossva... (1)

Performs a cross-validation on a data set composed of baseline RapidMiner workflows, each described with the dataset characteristics and the learning algorithm used by the given workflow.

Created: 2012-03-05 | Last updated: 2012-03-05

Workflow Execute Program on Windows 7 (1)

This simple process demonstrates how to execute a program on Windows 7 even if the program path contains spaces. The process will start Internet Explorer if the path exists. Tags: Rapidminer, Execute Program, Windows 7

Created: 2010-06-09

Workflow CHART based Feature Weightage (1)

Features can be assigned weightage through the decision tree model. In this regard, RapidMiner's Auto Model comes in quite handy. Divide the original data into training and testing datasets before applying the workflow to it.

Created: 2020-06-30 | Last updated: 2020-06-30

Credits: Imran Ali Syed

Workflow Gradient Boosting Trees based Feature Weig... (1)

Features can be assigned weightage through the gradient boosting trees model. In this regard, RapidMiner's Auto Model comes in quite handy. Divide the original data into training and testing datasets before applying the workflow to it.

Created: 2020-06-30 | Last updated: 2020-06-30

Credits: Imran Ali Syed

Workflow Random Forest based Feature Weightage (1)

Features can be assigned weightage through the random forest model. In this regard, RapidMiner's Auto Model comes in quite handy. Divide the original data into training and testing datasets before applying the workflow to it.

Created: 2020-06-30 | Last updated: 2020-06-30

Credits: Imran Ali Syed
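
The three feature-weightage entries above share the same idea: a tree-based model exposes per-feature importance scores that can serve as weights. Auto Model itself is a GUI tool, but a minimal scikit-learn sketch of the underlying step looks like this (the dataset and parameters are illustrative assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
# Divide the original data into training and testing sets first,
# as the descriptions above recommend.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
weights = forest.feature_importances_  # one weight per feature, summing to 1
```

Swapping in `GradientBoostingClassifier` or `DecisionTreeClassifier` gives the gradient-boosting and single-tree variants of the same weighting scheme.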

Workflow RCOMM Challenge 2: Broken Iris (1)

At the RComm 2010 (www.rcomm2010.org), an unusual competition was held. Titled "Who Wants to Be a Data Miner", three challenges were issued to the participants of the conference. In all challenges, participants had to design RapidMiner processes as quickly as possible. This is the winning process of Challenge 2: "Broken Iris" by Nico Piatkowski. This was the task: You are given a decision tree model (M) designed on the well-known Iris data set and unlabelled data (U) on which the model is t...

Created: 2010-09-17


Workflow X-Validation with One-Class SVM (1)

With this example process, I intend to show how one might validate predictions made with a one-class classifier. RapidMiner includes the One-Class SVM (part of libsvm) as introduced by Schoelkopf. I'm very interested in feedback concerning problems with or errors in this experiment. Note that the data set (Sonar) is just a toy data set chosen for demonstration; learning a one-class classifier on it won't give you any good results! Description: The basic idea is to partition the data set...

Created: 2010-10-20 | Last updated: 2010-10-20
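
As a rough scikit-learn analogue of the validation idea, one can train a One-Class SVM on the positive class only and then check its predictions on held-out anomalies. The synthetic data and parameters below are assumptions for illustration, not the original Sonar setup:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(200, 2))     # the single "known" class, used for training
outliers = rng.uniform(-6, 6, size=(20, 2))  # held-out points for validation

# nu bounds the fraction of training points treated as outliers.
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(normal)
pred_normal = clf.predict(normal)    # +1 = inlier, -1 = outlier
pred_out = clf.predict(outliers)
```

Counting how many held-out points are flagged -1 gives a crude validation of the one-class decision boundary, mirroring the partition-based scheme the description outlines.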


Workflow Import of the repository of RapidMiner wor... (1)

No description

Created: 2012-03-05 | Last updated: 2012-03-05

Workflow Stacking (1)

RapidMiner supports meta learning by embedding one or several basic learners as children into a parent meta learning operator. Here, we use three base learners inside the Stacking operator: decision tree induction, linear regression, and a nearest neighbours classifier. Finally, a Naive Bayes learner is used as the stacking learner, which uses the predictions of the preceding three learners to make a combined prediction.

Created: 2010-04-29
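
The same stacking layout can be sketched in scikit-learn. One caveat: sklearn's `StackingClassifier` requires classifier base learners, so logistic regression stands in for the linear regression learner of the original process, and the dataset is an illustrative stand-in:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Three base learners; a Naive Bayes model combines their predictions.
stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=3)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("logreg", LogisticRegression(max_iter=500)),  # stand-in for linear regression
    ],
    final_estimator=GaussianNB(),
)
score = stack.fit(X, y).score(X, y)
```

Internally the meta learner is trained on cross-validated predictions of the base learners, which is the same child-in-parent nesting the RapidMiner operator expresses visually.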

Workflow Item-based collaborative filtering recomme... (1)

The workflow for item-based collaborative filtering receives a user-item matrix as its input, and the same context-defined macros as the user-based recommender template, namely %{id}, %{recommendation_no}, and %{number_of_neighbors}. Although this process is in theory very similar to the user-based technique, it differs in several processing steps since we are dealing with an item-user matrix, the transposed user-item example set. The first step of the workflow, after declaring zero values miss...

Created: 2011-05-05 | Last updated: 2011-05-09

Credits: Matko Bošnjak, Ninoaf

Attributions: Datasets for the pack: RCOMM2011 recommender systems workflow templates

Workflow CamelCases (1)

This process splits up CamelCase words.

Created: 2010-06-02
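
The splitting itself reduces to a short regular expression; a minimal stand-alone sketch (function name and pattern are my own, not taken from the process):

```python
import re

def split_camel_case(token):
    """Split a CamelCase token into its component words."""
    # A word is an uppercase letter followed by lowercase letters/digits,
    # or a leading run of lowercase letters/digits.
    return re.findall(r"[A-Z][a-z0-9]*|[a-z0-9]+", token)
```

For example, `split_camel_case("CamelCases")` yields the two words "Camel" and "Cases".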

Workflow Tag Clustering (TaCl) (1)

This is a sample process for a tag clustering. See http://www-ai.cs.uni-dortmund.de/SOFTWARE/TaCl/index.html

Created: 2011-11-17 | Last updated: 2011-11-17

Workflow User-based collaborative filtering recomme... (1)

The workflow for user-based collaborative filtering takes only one example set as input: a user-item matrix, where the attributes denote item IDs and the rows denote users. If a user i has rated an item j with a score s, the matrix will have the value s written in the i-th row and j-th column. In the context of the process we define the ID of the user %{id}, the desired number of recommendations %{recommendation_no}, and the number of neighbors used in ca...

Created: 2011-05-05 | Last updated: 2011-05-09

Credits: Matko Bošnjak, Ninoaf

Attributions: Datasets for the pack: RCOMM2011 recommender systems workflow templates
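
The neighborhood scoring this template describes can be sketched with NumPy: compute cosine similarity between users, pick the most similar neighbors, and rank the target user's unrated items by similarity-weighted neighbor ratings. The toy matrix and parameter values are illustrative assumptions:

```python
import numpy as np

# Hypothetical toy user-item matrix: rows = users, columns = item ratings.
A = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 0.0],
    [1.0, 0.0, 5.0, 4.0],
    [0.0, 1.0, 4.0, 5.0],
])

def recommend(A, user, n_neighbors=2, n_rec=2):
    """Score a user's unrated items by similarity-weighted neighbor ratings."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    sims = (A @ A.T) / (norms @ norms.T)   # cosine similarity between users
    np.fill_diagonal(sims, -1.0)           # never pick the user as their own neighbor
    neighbors = np.argsort(-sims[user])[:n_neighbors]
    scores = sims[user, neighbors] @ A[neighbors]
    unrated = np.where(A[user] == 0)[0]
    return unrated[np.argsort(-scores[unrated])][:n_rec]
```

Here `user`, `n_neighbors`, and `n_rec` play the roles of the %{id}, %{number_of_neighbors}, and %{recommendation_no} macros.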


Workflow LSI content based recommender system template (1)

This workflow performs LSI text-mining content-based recommendation. We use SVD to capture latent semantics between items and words and to obtain a low-dimensional representation of items. Latent Semantic Indexing (LSI) takes the k greatest singular values and the left and right singular vectors to obtain the matrix A_k = U_k * S_k * V_k^T. Items are represented as word-vectors in the original space, where each row in matrix A represents the word-vector of a particular item. Matrix U_k, on the other hand ...

Created: 2011-05-06 | Last updated: 2011-05-09

Credits: Ninoaf, Matko Bošnjak

Attributions: Content based recommender system template; Datasets for the pack: RCOMM2011 recommender systems workflow templates
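
The latent representation U_k * S_k from the description can be demonstrated on a tiny item-by-word count matrix (data entirely hypothetical): items that share vocabulary end up close together in the k-dimensional space even after the reduction:

```python
import numpy as np

# Hypothetical item-by-word count matrix A: rows = items, columns = words.
A = np.array([
    [2.0, 1.0, 0.0, 0.0],   # item about topic 1
    [1.0, 2.0, 0.0, 0.0],   # another item about topic 1
    [0.0, 0.0, 1.0, 2.0],   # item about topic 2
])

k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
items_k = U[:, :k] * s[:k]   # low-dimensional item representation U_k * S_k

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
```

Comparing `cosine(items_k[0], items_k[1])` against `cosine(items_k[0], items_k[2])` shows the two topic-1 items are far more similar in latent space, which is what the content-based recommendation step exploits.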

Workflow Correct Attribute Type to Binominal (1)

This process will first generate some artificial data to show a commonly known problem: some attributes have only two values, but are not correctly stored as binominal; instead, RapidMiner recognizes them as nominal. In order to use them with some special operators, we have to change this to binominal. We can achieve this by using the Nominal to Binominal operator with the parameter transform_binominals switched off. Please take a look at the data before and after the Nominal to Binominal o...

Created: 2010-05-14
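
The conversion the operator performs can be sketched in plain Python for a single column: a two-valued nominal attribute becomes a boolean one. The function name and the value-ordering convention are my own assumptions, not RapidMiner's exact semantics:

```python
def nominal_to_binominal(values):
    """Convert a two-valued nominal column to booleans (Nominal-to-Binominal sketch).

    The first distinct value (in sorted order) maps to False, the second to True.
    Raises ValueError if the column does not have exactly two distinct values.
    """
    distinct = sorted(set(values))
    if len(distinct) != 2:
        raise ValueError(f"expected exactly 2 distinct values, got {len(distinct)}")
    return [v == distinct[1] for v in values]
```

For example, a "yes"/"no" column becomes True/False, which is what binominal-only operators expect.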

Workflow Experimentation through repository access (1)

This workflow reads train/test datasets from a specified RapidMiner repository and tests a selected operator on those datasets. Only datasets specified with a proper regular expression are considered. Train and test data filenames must correspond, e.g. (train1, test1). Information about the training and testing data and the performance measures of the selected operator is stored as an Excel file. Note: train/test file names should not be contained in the repository path. E.g. training/train is not a good path,...

Created: 2012-01-31 | Last updated: 2012-02-01

Credits: Matej Mihelčić, Matko Bošnjak, tomS

Workflow Recommender workflow (1)

This is the main online update experimentation workflow. It consists of three Execute Process operators: the first executes the model training workflow, the second executes the online updates workflow for multiple query update sets, and the last executes the performance testing and comparison workflow. The final performance results are saved in an Excel file.

Created: 2012-01-29

Credits: Matej Mihelčić

Workflow Model saving workflow (1)

This workflow trains and saves a model for a selected item recommendation operator.

Created: 2012-01-29 | Last updated: 2012-01-30

Credits: Matej Mihelčić

Workflow Model testing workflow (1)

This workflow measures the performance of three models: a model learned on the train data and upgraded using online model updates, a model learned on the train data plus all query update sets, and a model learned on the train data only.

Created: 2012-01-29

Credits: Matej Mihelčić

Workflow Data iteration workflow (1)

This is a data iteration workflow used to iterate through query update sets.

Created: 2012-01-29

Credits: Matej Mihelčić, Matko Bošnjak

Workflow Model update workflow (1)

This is a model update workflow called from the data iteration workflow on every given query set. In the Loop operator, the model and the current training set are retrieved from the repository. A model update is performed on the given query set, creating a new model. The model and the updated train set are saved in the repository.

Created: 2012-01-29 | Last updated: 2012-01-29

Credits: Matej Mihelčić, Matko Bošnjak

Workflow recommender workflow (RP) (1)

This is the main online update experimentation workflow. It consists of three Execute Process operators: the first executes the model training workflow, the second executes the online updates workflow for multiple query update sets, and the last executes the performance testing and comparison workflow. The final performance results are saved in an Excel file.

Created: 2012-01-29

Credits: Matej Mihelčić

Workflow Model update workflow (RP) (1)

This is a model update workflow called from the data iteration workflow on every given query set. In the Loop operator, the model and the current training set are retrieved from the repository. A model update is performed on the given query set, creating a new model. The model and the updated train set are saved in the repository.

Created: 2012-01-29 | Last updated: 2012-01-30

Credits: Matej Mihelčić

Workflow Analyzing Data from a Linked Open Data SPA... (1)

This process reads a list of countries, their GDP and energy consumption from the Eurostat Linked Open Data SPARQL Endpoint (http://wifo5-03.informatik.uni-mannheim.de/eurostat/) and analyzes whether there is a correlation between GDP and energy consumption.

Created: 2013-09-11 | Last updated: 2013-09-11
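
The correlation check at the heart of this process reduces to a Pearson coefficient over the two columns. A stdlib-only sketch (the numbers are made up; the real process pulls them from the SPARQL endpoint):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-country GDP and energy-consumption figures.
gdp = [1.0, 2.5, 3.0, 4.2, 5.1]
energy = [0.9, 2.2, 3.1, 4.0, 5.3]
r = pearson(gdp, energy)  # close to +1 indicates a strong positive correlation
```

A value of r near +1 or -1 suggests a linear relationship; near 0, none.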
