Be able to deploy into VPC #134

Closed · greggilbert opened this issue Apr 22, 2015 · 6 comments

@greggilbert

I've got a Storm setup that I'd like to deploy within a VPC on AWS. As is pretty standard in VPC setups, there's a bastion server that you SSH into first, and then get into the instances. I can't find a way to get sparse submit to work with this setup, though. It'd be nice if the deploy script could work with this.

Sidenote: one idea I had was to lein uberjar everything, then manually scp that up to the servers. When I do that, though, and run bin/storm jar 0.0.1-STANDALONE.jar MAINCLASS, I'm not sure what to put for MAINCLASS. My project.clj is defproject events "0.0.1-SNAPSHOT", but events doesn't work. Any thoughts on that? Is the only option to run it through lein?

I think it'd make sense to update the documentation to cover this sort of outlying scenario.

@floer32 commented Apr 22, 2015

Sounds like it'll be frustrating to work around, but 👍 very important.

@dan-blanchard

The simplest thing to do:

  1. Create a jar using lein uberjar.
  2. scp that jar and your topology Clojure file (I'm assuming this is called events.clj) to a server on AWS that you can submit from.
  3. SSH into the same server and run:

         bin/storm jar events-0.0.1-SNAPSHOT-STANDALONE.jar streamparse.commands.submit_topology events.clj
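The three steps above can be sketched as a short script. The host name is hypothetical, and the `run` helper only prints each command (a dry run), so swap in real execution once the pieces look right for your setup:

```shell
# Manual deploy sketch, shown as a dry run.
JAR=events-0.0.1-SNAPSHOT-STANDALONE.jar
SUBMIT_HOST=submit-box.example.com   # assumed: any AWS host you can submit from

run() { echo "+ $*"; }               # print only; change body to "$@" to actually execute

run lein uberjar                     # builds target/$JAR
run scp "target/$JAR" events.clj "$SUBMIT_HOST:~/"
run ssh "$SUBMIT_HOST" "bin/storm jar $JAR streamparse.commands.submit_topology events.clj"
```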

Please let us know if that doesn't work.

@dan-blanchard

Ah, I forgot one thing. You'll also need to have virtualenvs set up on all of your machines, containing all of the Python prerequisites you need. They need to live at the path specified in your config.json file.
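For reference, the virtualenv path lives in the per-environment section of config.json. A minimal sketch, with hypothetical host names and paths (key names follow a typical streamparse project; check them against your own file):

```json
{
    "envs": {
        "prod": {
            "user": "storm",
            "nimbus": "nimbus.internal.example.com",
            "workers": ["worker1.internal.example.com"],
            "log_path": "/var/log/storm/streamparse",
            "virtualenv_root": "/data/virtualenvs"
        }
    }
}
```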

Once #99 is addressed, this step will no longer be necessary.

@codywilbourn

You could also try a bit of ssh magic. This is all hypothetical because I don't have a test setup to try it on.

Add env.use_ssh_config = True to your fabfile.py. See #54 for an example snippet.

Update your ~/.ssh/config to use the bastion host for your commands.

Host *.internal.example.com
    ProxyCommand ssh bastion.example.com exec nc %h %p

If you don't have a common subdomain, you'll have to list each host individually.

Host host1.example.com
    ProxyCommand ssh bastion.example.com exec nc %h %p
...

Set up your streamparse config to use all of the hosts as usual (without the bastion host).

@dan-blanchard

Thanks @codywilbourn. I knew I should have just asked you. 😄

@dan-blanchard dan-blanchard added this to the v2.0 milestone Apr 27, 2015
@floer32 commented Apr 27, 2015

Yeah +1 for SSH config as a way to manage this. We've had great success with this approach.
