Search results for "rapidminer"

Showing 341 results.

Pack Who Wants to be a Data Miner?


Created: 2011-11-02 17:54:07 | Last updated: 2013-09-09 16:22:11

One of the most fun events at the annual RapidMiner Community Meeting and Conference (RCOMM) is the live data mining process design competition "Who Wants to be a Data Miner?". In this competition, participants must design RapidMiner processes for a given goal within a few minutes. The tasks are related to data mining and data analysis, but are rather uncommon. In fact, most of the challenges ask for things RapidMiner was never supposed to do. This pack contains solutions for these...

12 items in this pack

Comments: 0 | Viewed: 260 times | Downloaded: 142 times


Workflow Example process of the Text and Web Mining... (1)

This example process crawls the web (RapidMiner forum) for entries, extracts the information with the Process Documents operator and applies Clustering on the results. The process shows the interaction between the Web Mining Extension and the Text Mining Extension from RapidMiner.

Created: 2014-12-04


Workflow SVD user-based collaborative filtering rec... (1)

This workflow takes a user-item matrix A as input. It then calculates the reduced SVD decomposition A_k by keeping only the k greatest singular values and the corresponding singular vectors. The workflow calculates recommendations and predictions for a particular user %{id} from matrix A. The row %{id} is taken from the original matrix A and replaced with the %{id} row of the A_k matrix. Predictions for user %{id} are made based on the other users in A_k. Note: This workflow uses R-script operator with R library ...
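The truncated-SVD step this description refers to can be sketched outside RapidMiner. The NumPy sketch below uses a toy rating matrix and a hard-coded user index as stand-ins for A and the %{id} macro; it illustrates the technique, not the workflow itself:

```python
import numpy as np

# Toy user-item rating matrix A (users x items); zeros mean "unrated".
# Values are invented for illustration.
A = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [0.0, 1.0, 5.0, 4.0],
])

k = 2  # keep only the k greatest singular values

# Full SVD, then truncate to rank k: A_k = U_k * diag(s_k) * Vt_k
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

user = 1  # stand-in for the %{id} parameter in the workflow

# The user's row in A_k serves as the prediction vector;
# recommend the unrated items with the highest reconstructed scores.
predictions = A_k[user]
unrated = np.where(A[user] == 0)[0]
recommended = unrated[np.argsort(predictions[unrated])[::-1]]
print(recommended)
```

Because A_k is the best rank-k approximation of A, the reconstructed entries in positions the user left unrated act as predicted affinities.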

Created: 2011-05-09 | Last updated: 2011-05-09

Credits: Ninoaf, Matko Bošnjak

Attributions: Workflow "User-based collaborative filtering recommender system template" | Blob "Datasets for the pack: RCOMM2011 recommender systems workflow templates"

Workflow Example Process using the RapidMiner Linke... (1)

This example process uses the SPARQL Data Importer operator provided by the RapidMiner LOD Extension to retrieve additional data about books, including the author, ISBN, country, abstract, number of pages, and language, from DBpedia (www.dbpedia.org).
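The listing does not include the actual query, but a SPARQL request of the kind the SPARQL Data Importer would send to DBpedia for this book metadata might look like the following. The dbo: property names are assumptions based on the public DBpedia ontology, not taken from the workflow; executing the query would additionally need an HTTP client, which is omitted here:

```python
# Endpoint and query are illustrative; the dbo: property names
# (dbo:author, dbo:isbn, dbo:country, dbo:abstract, dbo:numberOfPages,
# dbo:language) are assumptions based on the DBpedia ontology.
ENDPOINT = "https://dbpedia.org/sparql"

QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?book ?author ?isbn ?country ?abstract ?pages ?language WHERE {
  ?book a dbo:Book ;
        dbo:author ?author .
  OPTIONAL { ?book dbo:isbn ?isbn }
  OPTIONAL { ?book dbo:country ?country }
  OPTIONAL { ?book dbo:abstract ?abstract }
  OPTIONAL { ?book dbo:numberOfPages ?pages }
  OPTIONAL { ?book dbo:language ?language }
}
LIMIT 10
"""

print(QUERY.strip())
```

The OPTIONAL clauses mirror the fact that not every book resource on DBpedia carries all of these properties.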

Created: 2014-12-04

Workflow Using Graph Kernels for Feature Generation... (1)

This example shows how to use graph kernels for feature generation. In this example we use the Root RDF Walk Count Kernel and the Fast RDF WL Sub Tree Kernel. The input data for the process can be found here. More information about the process can be found here.

Created: 2015-05-04 | Last updated: 2015-05-04

Workflow Finding wrong links in Linked Open Data wi... (1)

This simple workflow shows how to detect wrong links in Linked Open Data. It uses operators from the Linked Open Data and the Anomaly Detection extensions. The process first reads a list of links from the EventMedia endpoint, linking to DBpedia, then creates feature vectors for each of those links. Finally, an outlier detection operator is employed to find suspicious links.
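The pipeline this description outlines (a feature vector per link, then outlier scoring) can be illustrated with a simple k-nearest-neighbour outlier score. The feature values below are invented toy data, and the actual workflow uses the Anomaly Detection extension's operators rather than this hand-rolled score:

```python
import math

# Toy feature vectors for links (e.g. similarity features between the
# two linked resources); values and dimensionality are illustrative.
links = {
    "link_a": (0.90, 0.80),
    "link_b": (0.85, 0.90),
    "link_c": (0.88, 0.82),
    "link_d": (0.10, 0.20),  # deliberately far from the rest: suspicious
}

def knn_outlier_score(name, k=2):
    """Mean distance to the k nearest other links; higher = more suspicious."""
    x = links[name]
    dists = sorted(
        math.dist(x, y) for other, y in links.items() if other != name
    )
    return sum(dists[:k]) / k

scores = {name: knn_outlier_score(name) for name in links}
most_suspicious = max(scores, key=scores.get)
print(most_suspicious)  # → link_d
```

Links whose feature vectors sit far from the bulk of the data get high scores and surface as candidate wrong links, which is the idea behind using anomaly detection here.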

Created: 2014-03-14


Workflow Semantic meta-mining workflow that perform... (1)

Performs a cross-validation on a data set composed of meta-data of baseline RapidMiner workflows, expressed in RDF with the DMOP ontology's terminology for representing processes. Includes discovery of a set of semantic features (patterns) by the Fr-ONT-Qu algorithm (‘workflow patterns’). Through a propositionalisation approach, those features may be used in an arbitrary (propositional) RapidMiner classification operator.

Created: 2012-03-05 | Last updated: 2012-03-05


Blob RapidMiner Workspace with Processes and Results from...

Created: 2011-07-14 13:21:27 | Last updated: 2011-07-14 13:21:28

Credits: http://viswanathgs.myopenid.com/

License: Creative Commons Attribution-Share Alike 3.0 Unported License

Operators used in experiments: Fast Correlation Based Filter (FCBF), Shrunken Centroids - Prediction Analysis for Microarrays (PAM), Backward Elimination via Hilbert-Schmidt Independence Criterion (BAHSIC), Dense Relevant Attribute Group Selector (DRAGS), Consensus Group Stable Feature Selector (CGS)

Datasets: Colon, SRBCT, SONAR

File type: ZIP archive

Comments: 0 | Viewed: 102 times | Downloaded: 64 times


Workflow 1. Getting Started: Learn and Store a Model (1)

This getting started process shows the first step of learning and storing a model. After a model is learned, you can load (Retrieve operator) the model and apply it to a test data set (see 2. Getting Started: Retrieve and Apply Model). The process is NOT concerned with evaluation of the model. This process will not immediately run in RapidMiner because you have to adjust the repository path in the Retrieve operator.

Tags: Rapidminer, model, learn, learning, training, train, store, first step
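The store-then-retrieve pattern this process demonstrates has a direct analogue in plain Python: train a model, persist it, and load it later in a separate step before applying it to test data. The threshold "model" below is an illustrative toy stand-in, not RapidMiner's model format:

```python
import os
import pickle
import tempfile

# Toy "model": classify a value as positive if it exceeds the training mean.
train = [1.0, 2.0, 3.0, 4.0]
model = {"threshold": sum(train) / len(train)}  # threshold = 2.5

# Store the model (analogous to RapidMiner's Store operator).
path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(model, f)

# Later, in a separate process: Retrieve the model and apply it
# to a test data set (analogous to the Retrieve operator).
with open(path, "rb") as f:
    loaded = pickle.load(f)

test = [1.5, 3.5]
predictions = ["positive" if x > loaded["threshold"] else "negative"
               for x in test]
print(predictions)  # → ['negative', 'positive']
```

Separating training/storing from retrieving/applying, as both the RapidMiner process and this sketch do, is what lets a model trained once be reused across many scoring runs.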

Created: 2011-01-17 | Last updated: 2011-01-17

Workflow Hybrid Recommender System Using Linked Op... (1)

This process uses the RapidMiner Linked Open Data extension and the Recommender extension to build a hybrid Linked Open Data-enabled recommender system for books. The input data for the process can be found here. More information about the process can be found here.

Created: 2014-05-15 | Last updated: 2014-05-15
