We face similar requirements and wonder whether you were able to resolve yours.
If so, could you point us to the steps you took to configure the Hive container to use the HDFS instance running inside this Docker image?
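For anyone else landing here, a minimal sketch of one way to wire a separate Hive container to the HDFS in this image. The network name, the Hive image name, and the namenode port 9000 are all assumptions (check `fs.defaultFS` in the image's core-site.xml), not a recipe confirmed by the maintainers:

```bash
# Sketch only: hadoop-net and my-hive-image are hypothetical names,
# and port 9000 assumes the value in this image's core-site.xml.
docker network create hadoop-net

# Start this image (whatever tag/ID you use, e.g. c29b621ba74a in this
# thread) with a fixed name so the Hive container can resolve it.
docker run -d --name hadoop --hostname hadoop --network hadoop-net \
  c29b621ba74a /etc/bootstrap.sh -d

# In the Hive container, point core-site.xml at that namenode:
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://hadoop:9000</value>
#   </property>
docker run -it --network hadoop-net my-hive-image bash
```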
Hi, I have the same problem using a volume for my HDFS input/output.
I want to create a directory with $HADOOP_PREFIX/bin/hadoop fs -mkdir mytest, put files into mytest/input, run something on them like wordcount, and have the input and output data persist across docker run invocations.
How is this possible?
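For reference, the workflow in question looks roughly like this. The examples jar path follows the standard Hadoop layout and the version glob is an assumption, so adjust to your image:

```bash
# Standard MapReduce wordcount round trip (jar path assumed from the
# usual $HADOOP_PREFIX/share/hadoop/mapreduce layout).
$HADOOP_PREFIX/bin/hadoop fs -mkdir -p mytest/input
$HADOOP_PREFIX/bin/hadoop fs -put $HADOOP_PREFIX/LICENSE.txt mytest/input/
$HADOOP_PREFIX/bin/hadoop jar \
  $HADOOP_PREFIX/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \
  wordcount mytest/input mytest/output
$HADOOP_PREFIX/bin/hadoop fs -cat 'mytest/output/part-r-*'
```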
Here is what I have done so far (see the namenode note after these steps):

1. Added this to hdfs-site.xml:

```xml
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///home/app/hdfs/datanode</value>
  <description>DataNode directory</description>
</property>
```

2. Created a Docker volume named 'myvol'.

3. Ran the image with -v:

```
docker run -v myvol:/home/app -it c29b621ba74a /etc/bootstrap.sh -bash
```
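One likely gap, offered as an assumption rather than a confirmed fix: the datanode directory alone is not enough, because the namenode keeps its metadata elsewhere (by default under hadoop.tmp.dir, which is not on the volume). A companion setting like the following, with the path also placed under /home/app, would keep both halves of HDFS on the mounted volume:

```xml
<!-- Hypothetical companion setting: keep namenode metadata on the
     volume too, so a fresh container can reopen the existing
     filesystem instead of starting with an empty namespace. -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///home/app/hdfs/namenode</value>
  <description>NameNode metadata directory</description>
</property>
```

It is also worth checking whether the image's bootstrap.sh runs `hdfs namenode -format` on startup; an unconditional reformat would discard the persisted namespace on every run.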
But the /home/app directory contains only the files I created with vi and a folder named 'hdfs'; the HDFS input/output data is not persisted across runs.
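To narrow down where the data is actually going, one debugging step is to ask HDFS inside the running container which directories the daemons really loaded (these are stock Hadoop commands; what they reveal is situation-specific):

```bash
# Print the effective data directories as the daemons see them.
$HADOOP_PREFIX/bin/hdfs getconf -confKey dfs.datanode.data.dir
$HADOOP_PREFIX/bin/hdfs getconf -confKey dfs.namenode.name.dir

# If these point somewhere outside /home/app (e.g. under /tmp), the
# edited hdfs-site.xml is not the one the daemons loaded, and the
# volume never sees the block files.
ls -R /home/app/hdfs
```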
I'm new to the Hadoop stack, so forgive me if I'm missing something obvious.
I have two requirements I'm trying to work out with this Docker image.
Can anyone help?