Revision as of 14:28, 7 August 2019
Introduction
KNIME Analytics Platform is the open source software for creating data science applications and services.
KNIME stands for KoNstanz Information MinEr.
Installed version
The currently installed version is available on the environment hpc-env/6.4:
KNIME/3.6.2
KNIME Module
If you want to find out more about KNIME on the HPC Cluster, you can use the command
module spider KNIME
This will show you basic information, e.g. a short description and the currently installed version.
To load the desired version of the module, use a command such as
module load KNIME
Always remember: this command is case sensitive!
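A typical session might look like the following sketch. Loading the hpc-env/6.4 environment first is an assumption based on the "Installed version" note above; environment and module names may differ on your cluster:

```shell
# Load the environment that provides the KNIME module
# (assumption: hpc-env/6.4, as listed under "Installed version")
module load hpc-env/6.4

# Load the KNIME module -- remember, this is case sensitive
module load KNIME

# Verify which modules are currently loaded
module list
```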
Using KNIME on the HPC Cluster
You have two options for using KNIME on the HPC cluster: 1) you can start KNIME within a job script and execute a prepared workflow, or 2) you can use the SLURM Cluster Execution from your local workstation to offload selected nodes of your workflow to the cluster. Both options are described briefly below.
Using KNIME with a job script
This approach is straightforward once you have prepared a workflow for execution on the cluster. That means you need to copy all required files to a directory on the cluster (the workflowDir). After that, write a job script that calls KNIME and runs your workflow. A minimal example is
#!/bin/bash
knime -nosplash -application org.knime.product.KNIME_BATCH_APPLICATION -workflowDir="$HOME/knime-workspace/Example Workflows/Basic Examples/Simple Reporting Example"
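In practice, the job script is submitted through the SLURM scheduler, so it will usually also request resources and load the KNIME module. The sketch below illustrates this; the #SBATCH values and the job name are placeholders, not site-specific recommendations:

```shell
#!/bin/bash
#SBATCH --job-name=knime-workflow
#SBATCH --ntasks=1
#SBATCH --time=01:00:00          # placeholder walltime, adjust to your workflow
#SBATCH --mem=4G                 # placeholder memory request

# Load the KNIME module so the knime binary is on the PATH
module load KNIME

# Run the prepared workflow in headless batch mode
knime -nosplash \
      -application org.knime.product.KNIME_BATCH_APPLICATION \
      -workflowDir="$HOME/knime-workspace/Example Workflows/Basic Examples/Simple Reporting Example"
```

The script would then be submitted with <tt>sbatch jobscript.sh</tt>.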
Documentation
To find out more about KNIME Analytics Platform, you can take a look at this overview: https://www.knime.com/knime-software/knime-analytics-platform
The full documentation and more learning material can be found at https://www.knime.com/resources