This cookbook fetches a Hadoop distribution and installs it under a local user account, applying some very basic configuration. Java must already be installed, and the cookbook assumes that zsh is available on the system for the user setup.
Key | Type | Description | Default |
---|---|---|---|
`['hadoop']['version']` | String | The Hadoop version. | `2.4.0` |
`['hadoop']['data-directory']` | String | The root directory for the Hadoop data. | `/hadoop` |
`['hadoop']['user']['name']` | String | Username for the Hadoop user. | `hadoop` |
`['hadoop']['user']['shell']` | String | The shell for the Hadoop user. | Platform dependent: `/usr/local/bin/zsh` or `/usr/bin/zsh`. |
`['hadoop']['core-site']` | Hash | Settings for `core-site.xml`. | |
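These attributes can be overridden in a role, environment, or node object. The sketch below is an illustrative assumption: the `fs.defaultFS` entry is a standard Hadoop `core-site.xml` property, but which keys the cookbook's `core-site` hash actually renders depends on its template.

```json
{
  "hadoop": {
    "version": "2.4.0",
    "data-directory": "/srv/hadoop",
    "core-site": {
      "fs.defaultFS": "hdfs://localhost:9000"
    }
  }
}
```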
Just include `fetch-hadoop` in your node's `run_list`:

```json
{
  "name": "my_node",
  "run_list": [
    "recipe[fetch-hadoop]"
  ]
}
```
You can include the recipe `fetch-hadoop::format-namenode` to format the name node directly after the installation.
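Formatting the name node roughly corresponds to running Hadoop's own format command as the Hadoop user; this is a sketch of the equivalent manual step, and the recipe's actual invocation may differ:

```shell
# Assumption: run as the hadoop user from the Hadoop install directory.
# Formats (initializes) the HDFS name node metadata directory.
bin/hdfs namenode -format
```

Note that formatting erases existing HDFS metadata, so the recipe is only intended for use directly after installation.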
If the DFS service is running, you can use `fetch-hadoop::dfs-create-user-dir` to create the following directories:

- `/user`
- `/user/#{node['hadoop']['user']['name']}`
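The equivalent manual steps would be roughly the following, assuming the distribution's `bin` directory is on the `PATH` and the default user name `hadoop` is in use:

```shell
# Assumption: DFS is running and the hadoop binaries are on the PATH.
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/hadoop
```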
- Fork the repository on Github
- Create a named feature branch (like `add_component_x`)
- Write your change
- Write tests for your change (if applicable)
- Run the tests, ensuring they all pass
- Submit a Pull Request using Github
- Freely distributable and licensed under the Apache 2.0 license.
- Copyright 2014 Wegtam UG