No service type 'SPARK' available for cluster with version 'CDH 6.3.0' #81

bvkudupa opened this issue Dec 4, 2019 · 4 comments
bvkudupa commented Dec 4, 2019

Hi,
I am able to install HDFS, YARN, and ZooKeeper, but I am facing an issue while installing Spark.
When I run

```
python deploycloudera.py -ihivepass -fCloudera@123 -nCloudera@123 -rCloudera@123
```

```
Traceback (most recent call last):
  File "deploycloudera.py", line 1079, in <module>
    main()
  File "deploycloudera.py", line 1028, in main
    spark_service = deploy_spark(CLUSTER, SPARK_SERVICE_NAME, SPARK_SERVICE_CONFIG, SPARK_MASTER_HOST, SPARK_MASTER_CONFIG, SPARK_WORKER_HOSTS, SPARK_WORKER_CONFIG, SPARK_GW_HOSTS, SPARK_GW_CONFIG)
  File "deploycloudera.py", line 689, in deploy_spark
    spark_service = cluster.create_service(spark_service_name, "SPARK")
  File "/usr/lib/python2.7/site-packages/cm_api/endpoints/clusters.py", line 161, in create_service
    service_type, self.name)
  File "/usr/lib/python2.7/site-packages/cm_api/endpoints/services.py", line 44, in create_service
    ApiService, True, data=[apiservice])[0]
  File "/usr/lib/python2.7/site-packages/cm_api/endpoints/types.py", line 137, in call
    ret = method(path, data=data, params=params)
  File "/usr/lib/python2.7/site-packages/cm_api/resource.py", line 148, in post
    self._make_headers(contenttype))
  File "/usr/lib/python2.7/site-packages/cm_api/resource.py", line 73, in invoke
    headers=headers)
  File "/usr/lib/python2.7/site-packages/cm_api/http_client.py", line 193, in execute
    raise self._exc_class(ex)
cm_api.api_client.ApiException: No service type 'SPARK' available for cluster with version 'CDH 6.3.0'. (error 400).
```

What is the service type for Spark in CDH 6.3.0?

funes79 commented Dec 4, 2019

SPARK2 or SPARK2_ON_YARN
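
If you want to double-check which strings your cluster actually accepts, cm_api should be able to list them (if I remember right, the cluster object has a get_service_types() helper). Untested sketch; the host, credentials, and cluster name are placeholders:

```python
# Untested sketch: print the service types this cluster supports, so the exact
# string to pass to create_service() can be confirmed.
from cm_api.api_client import ApiResource

api = ApiResource("cm-host.example.com", username="admin", password="admin")  # placeholders
cluster = api.get_cluster("cluster")  # use the cluster name shown in Cloudera Manager

# get_service_types() should return the valid service type strings for this
# cluster's CDH version (e.g. HDFS, YARN, SPARK_ON_YARN, ...).
for service_type in cluster.get_service_types():
    print(service_type)
```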

bvkudupa commented Dec 4, 2019

With SPARK2 I get the same exception: No service type 'SPARK2' available for cluster with version 'CDH 6.3.0'. (error 400)
When I use SPARK2_ON_YARN, I get the error below:
cm_api.api_client.ApiException: Unknown configuration attribute 'hdfs_service' for service (type: 'SPARK2_ON_YARN', name: 'SPARK2_ON_YARN'). (error 400)

Please find the relevant code snippet below:
```python
### Spark ###
SPARK_SERVICE_NAME = "SPARK2_ON_YARN"
SPARK_SERVICE_CONFIG = {
    'hdfs_service': HDFS_SERVICE_NAME,
}
SPARK_MASTER_HOST = SPARK_HOST
SPARK_MASTER_CONFIG = {
    'master_max_heapsize': 67108864,
}
SPARK_WORKER_HOSTS = list(CLUSTER_HOSTS)
SPARK_WORKER_CONFIG = {
    'executor_total_max_heapsize': 67108864,
    'worker_max_heapsize': 67108864,
}
SPARK_GW_HOSTS = list(CLUSTER_HOSTS)
SPARK_GW_CONFIG = { }


# Deploys spark - master, workers, gateways
def deploy_spark(cluster, spark_service_name, spark_service_config, spark_master_host, spark_master_config, spark_worker_hosts, spark_worker_config, spark_gw_hosts, spark_gw_config):
    spark_service = cluster.create_service(spark_service_name, "SPARK2_ON_YARN")
    spark_service.update_config(spark_service_config)

    sm = spark_service.get_role_config_group("{0}-SPARK_MASTER-BASE".format(spark_service_name))
    sm.update_config(spark_master_config)
    spark_service.create_role("{0}-sm".format(spark_service_name), "SPARK_MASTER", spark_master_host)

    sw = spark_service.get_role_config_group("{0}-SPARK_WORKER-BASE".format(spark_service_name))
    sw.update_config(spark_worker_config)

    worker = 0
    for host in spark_worker_hosts:
        worker += 1
        spark_service.create_role("{0}-sw-".format(spark_service_name) + str(worker), "SPARK_WORKER", host)

    gw = spark_service.get_role_config_group("{0}-GATEWAY-BASE".format(spark_service_name))
    gw.update_config(spark_gw_config)

    gateway = 0
    for host in spark_gw_hosts:
        gateway += 1
        spark_service.create_role("{0}-gw-".format(spark_service_name) + str(gateway), "GATEWAY", host)

    return spark_service
```

Please let me know if there are any other places I need to change. Thanks :)
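
One direction that might be worth trying (untested): since SPARK2_ON_YARN runs on YARN rather than talking to HDFS directly, the service-level config may expect a 'yarn_service' dependency instead of 'hdfs_service'. A minimal sketch, with YARN_SERVICE_NAME as a placeholder for however the YARN service is named in the script:

```python
# Untested sketch: swap the HDFS dependency for a YARN one.
# 'yarn_service' is an assumption -- dump the service config with
# spark_service.get_config(view="full") to confirm the exact attribute name.
YARN_SERVICE_NAME = "YARN"  # placeholder; use the name given to the YARN service

SPARK_SERVICE_NAME = "SPARK2_ON_YARN"
SPARK_SERVICE_CONFIG = {
    'yarn_service': YARN_SERVICE_NAME,
}
```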


smareti commented Mar 11, 2020

I am also getting the same kind of issue, shown below. I am trying to install on CDH 5.14.4.

```
Traceback (most recent call last):
  File "cm-api-deploy.py", line 1001, in <module>
    deploy.gen_service_types_and_objects()
  File "cm-api-deploy.py", line 339, in gen_service_types_and_objects
    self.service_types_and_names[svc_type], svc_type) # Service name & type
  File "/usr/lib/python2.6/site-packages/cm_api/endpoints/clusters.py", line 161, in create_service
    service_type, self.name)
  File "/usr/lib/python2.6/site-packages/cm_api/endpoints/services.py", line 44, in create_service
    ApiService, True, data=[apiservice])[0]
  File "/usr/lib/python2.6/site-packages/cm_api/endpoints/types.py", line 137, in call
    ret = method(path, data=data, params=params)
  File "/usr/lib/python2.6/site-packages/cm_api/resource.py", line 148, in post
    self._make_headers(contenttype))
  File "/usr/lib/python2.6/site-packages/cm_api/resource.py", line 73, in invoke
    headers=headers)
  File "/usr/lib/python2.6/site-packages/cm_api/http_client.py", line 193, in execute
    raise self._exc_class(ex)
cm_api.api_client.ApiException: No service type 'SPARK2_ON_YARN' available for cluster with version 'CDH 5.0.0'. (error 400)
```


smareti commented Mar 11, 2020

I installed another parcel to get the Spark 2 service. Here is the example parcel: SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957-el6.parcel
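
For reference, a rough sketch of driving that parcel through download, distribute, and activate with cm_api before creating the service. The host, credentials, cluster name, and the product/version strings (guessed from the parcel filename above) are placeholders; on CDH 5 the SPARK2 service type also needs the matching Spark 2 CSD jar on the CM server.

```python
# Untested sketch: download, distribute, and activate the SPARK2 parcel via cm_api.
# The parcel repo URL still has to be configured in Cloudera Manager's parcel
# settings beforehand; product/version are inferred from the parcel filename.
import time
from cm_api.api_client import ApiResource

PRODUCT = "SPARK2"
VERSION = "2.2.0.cloudera2-1.cdh5.12.0.p0.232957"

api = ApiResource("cm-host.example.com", username="admin", password="admin")  # placeholders
cluster = api.get_cluster("cluster")

def wait_for_stage(target_stage):
    # Poll until the parcel reports the requested stage
    # (DOWNLOADED, DISTRIBUTED, ACTIVATED, ...).
    while cluster.get_parcel(PRODUCT, VERSION).stage != target_stage:
        time.sleep(10)

cluster.get_parcel(PRODUCT, VERSION).start_download()
wait_for_stage("DOWNLOADED")

cluster.get_parcel(PRODUCT, VERSION).start_distribution()
wait_for_stage("DISTRIBUTED")

cluster.get_parcel(PRODUCT, VERSION).activate()
wait_for_stage("ACTIVATED")
```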
