I wrote a real-time web app that consists of the following:
Rails to serve the web pages (listens on port 80)
Node.js to handle the real-time logic (listens on port 8888)
On a particular page served by my Rails app, the JS uses socket.io to establish a connection to my Node.js instance to allow real-time HTTP push.
This means deploying them as separate Heroku apps (one Rails, one Node.js). Is having multiple apps on Heroku a good solution? Will this affect performance, and will there be any extra costs?
I do the same with a Python app and a Node app. The two downsides I can see, which are not really big cons as far as I am currently concerned, are:
If you want to prevent dyno idling, you have to run at least 2 dynos per app, which means a little more cost. You can circumvent that by "pinging" (at the HTTP level) each of your apps at least once an hour, say from Pingdom or any other service that offers that for free. One app could even ping the other.
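As a minimal sketch of that workaround, here is what "one app pinging the other" could look like in Python, run from an hourly scheduler. The app URLs are placeholders, and the function name `ping` is my own; this is illustrative, not the author's actual code.

```python
import urllib.request

# Hypothetical URLs for the two single-dyno apps we want to keep awake.
APP_URLS = [
    "https://my-rails-app.herokuapp.com/",
    "https://my-node-app.herokuapp.com/",
]

def ping(url, timeout=10):
    """Hit the app's root URL; return the HTTP status, or None if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except OSError:
        return None

if __name__ == "__main__":
    # Invoke this script once an hour (e.g. from cron or a scheduler add-on).
    for url in APP_URLS:
        print(url, ping(url))
```

Any response at all is enough to reset the idle timer; the status code is only printed so you can spot a broken app in the logs.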
The communication between your apps isn't as direct as you might want it to be. That is: it does not go through a local interface. I don't share state between my apps (no Redis, no DB, no AMQP), and the communication between them is RESTful. So my Python app has to go through the Heroku HTTP router to reach my node app (that is: it has to call my-node-app.herokuapp.com). This adds just a little bit of latency (which I don't really care about) and just a little bit of complication to my code: my Python and node apps now hold a shared secret in order to authenticate my Python app to my node app. This prevents "the general public" from contacting my node app directly.
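One common way to implement that shared-secret check is an HMAC signature sent in a request header. The sketch below assumes both apps are configured with the same `SHARED_SECRET` (e.g. via an environment variable) and a header name like `X-Signature`; the scheme and names are my illustration, not the author's exact code.

```python
import hashlib
import hmac

# Assumed to hold the same value on both apps (set it via config, not source).
SHARED_SECRET = b"change-me"

def sign(body: bytes) -> str:
    """Signature the calling app attaches to the request (e.g. X-Signature)."""
    return hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    """Check the receiving app performs before trusting the request."""
    return hmac.compare_digest(sign(body), signature)
```

`hmac.compare_digest` does a constant-time comparison, which avoids leaking information about the expected signature through response timing.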
Overall, as I said, I think that's a fairly decent solution.
This recipe can be found in its original form on Stack Overflow.