How to Share Worker Among Two Different Applications on Heroku


If both apps are using the same Redis URL and same namespace, you can spin up one worker with that same Redis config and it will be shared by both.

Note that your Sidekiq process will boot one app or the other. The code for your workers must live in that app; the other app won't be able to reference the worker classes, but it can still push jobs using:

Sidekiq::Client.push('class' => 'SomeWorker', 'args' => [1,2,3])

Note that 'class' is a String here, so SomeWorker only needs to be defined in the app that the Sidekiq process boots, not in the app pushing the job.

multiple worker/web processes on a single heroku app

All processes must have unique names. Additionally, the name worker carries no special meaning. The only process type with a special name is web, as stated in the Heroku docs:

The web process type is special as it’s the only process type that
will receive HTTP traffic from Heroku’s routers. Other process types
can be named arbitrarily. -- (https://devcenter.heroku.com/articles/procfile)

So you want a Procfile like so:

capture: node capture.js
process: node process.js
purge: node purge.js
api: node api.js
web: node web.js

You can then scale each process separately:

$ heroku ps:scale purge=4

How to generate multi workers and bind them to non-web Java applications on Heroku

Yes, you can have as many worker processes as you'd like. They just need to have different names in your Procfile. In the article you cited, the single worker is just called worker, but your Procfile could look something like this with two different workers called updater and mailer:

web: java -jar web-app.jar $PORT
updater: sh worker/target/bin/updater
mailer: sh worker/target/bin/mailer

If you're using the appassembler-maven-plugin shown in the article, you'll also need to add another <program>...</program> element for each of your workers so that the necessary start scripts are generated.
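As a rough sketch of that plugin configuration (the main class names and package here are assumptions, not from the article), the pom.xml section might look like:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>appassembler-maven-plugin</artifactId>
  <configuration>
    <programs>
      <!-- One <program> per worker; <name> controls the generated script name -->
      <program>
        <mainClass>com.example.Updater</mainClass>
        <name>updater</name>
      </program>
      <program>
        <mainClass>com.example.Mailer</mainClass>
        <name>mailer</name>
      </program>
    </programs>
  </configuration>
</plugin>
```

Each `<program>` entry then produces a start script under `target/bin/`, matching the paths used in the Procfile above.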

Share database between 2 apps in Heroku

UPDATED

Originally, this answer stated that although this was possible with a few tricks, it was strongly discouraged. This was based on advice on the Heroku developer support website. However, recently Heroku issued a communication specifically describing how to achieve this, and have watered down their advice on the developer site. The complete text of this section of their e-mail is included below:

Did you know that Heroku apps can
share a common database? For example, you can put analytics functions
in a separate application from your user-facing code.

Simply set the DATABASE_URL config var for several apps to the same
value. First, get the DATABASE_URL for your existing app:

$ heroku config --app sushi | grep DATABASE_URL
DATABASE_URL => postgres://lswlmfdsfos:5FSLVUSLLT123@ec2-123-456-78-90.compute-1.amazonaws.com/ldfoiusfsf

Then, set the DATABASE_URL for new apps to this value:

$ heroku config:add DATABASE_URL=postgres://lswlmfdsfos:5FSLVUSLLT123@ec2-123-456-78-90.compute-1.amazonaws.com/ldfoiusfsf --app sushi-analytics
Adding config vars: DATABASE_URL => postgres://lswlm...m/ldfoiusfsf
Restarting app... done, v74.

That's it: now both apps will share one database.

Just as a point of reference, Heroku's original advice was to create an API and use it to access the data remotely. My personal view is that for many situations this is good advice (i.e., better than simply connecting multiple applications to the same database), although I can see situations where it would be more trouble than it's worth.

UPDATE

As per the comments on this answer, it's worth noting that Heroku does reserve the right to change database URLs as required. If this occurs, it will cause your secondary connections to fail, and you'll need to update the URLs accordingly.

How to configure two processes in a heroku dyno to communicate with each other

The Procfile defines the process types. They will be run in separate containers.

If you want to run multiple processes in one dyno, your Procfile should have a single entry whose process starts your services as child processes.

There are several methods and buildpacks to achieve this.
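As a minimal sketch of the parent-process approach (not one of the established buildpacks, and the service script names are hypothetical), the single Procfile entry could run a small Python supervisor like this:

```python
import subprocess


def run_all(commands):
    """Start each command as a child process, then wait for all of them.

    Returns the highest exit code, so the dyno exits non-zero (and Heroku
    restarts it) if any child fails.
    """
    procs = [subprocess.Popen(cmd, shell=True) for cmd in commands]
    try:
        return max(p.wait() for p in procs)
    except KeyboardInterrupt:
        for p in procs:
            p.terminate()
        raise


# Procfile entry would be something like:
#   web: python run_all.py
# with hypothetical services:
#   run_all(["python api_server.py", "python background_poller.py"])
```

The trade-off is that Heroku only supervises the parent; crash detection and restarts for the children become your responsibility, which is exactly what the dedicated buildpacks handle for you.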

Deploy two web apps from one repo (heroku)

The best solution, architecturally, is to have one git repository for each deployable application.

Keeping several apps in the same source repository is a pattern from legacy frameworks, deployed the old way, in which manual tasks could handle this kind of requirement.

If your objection is "but I have code which is required by all my bots", the best approach is to extract that shared code into a library, e.g. using Maven for Java or npm for Node.js.

Anyway, your requirement is interesting:

Solution 1

If you have several entrypoints, one per bot (e.g. bot1.py, bot2.py, bot3.py), you could use an environment variable to parameterize the file name in the Procfile:

worker: python $BOT_NAME.py

Solution 2

Ensure you have just one entrypoint, such as start_bot.py, in which an environment variable is used to determine which bot has to be started.
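A minimal sketch of that single entrypoint (the bot names and the idea that each bot exposes a start function are assumptions for illustration):

```python
import os


def run_bot1():
    # Placeholder for the real bot1 startup logic.
    return "bot1 started"


def run_bot2():
    # Placeholder for the real bot2 startup logic.
    return "bot2 started"


# Map BOT_NAME values to entrypoints; set per app with e.g.:
#   heroku config:set BOT_NAME=bot2 --app my-bot2-app
BOTS = {"bot1": run_bot1, "bot2": run_bot2}


def start_bot():
    name = os.environ.get("BOT_NAME", "bot1")
    if name not in BOTS:
        raise SystemExit(f"Unknown BOT_NAME: {name!r}")
    return BOTS[name]()
```

With this layout the Procfile is the same for every app (`worker: python start_bot.py`) and only the BOT_NAME config var differs between them.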

Shared message queues between apps with Heroku

I've been having a lot of success using AMQP on Heroku. There's a Heroku add-on available (CloudAMQP) and good Ruby support for AMQP.

In terms of provisioning services like this on Heroku in general, I've been provisioning the add-on in the main Heroku application and sharing the resulting environment config with the other applications.

Is it possible to get worker number on heroku with multiple web concurrency?

If all you need is the index of the current dyno, then use the DYNO environment variable.

If you need to also know the quantity of dynos configured for your app, then use the Platform API to retrieve the formation.
The URL you would need to query is:
https://api.heroku.com/apps/your_app_name/formation/web
The field you would need is "quantity".

Note that you would need to pass an authentication token in the authorization header.
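Sketched in Python (the app name and API token are placeholders; the versioned Accept header is required by the Platform API):

```python
import json
import os
import urllib.request


def dyno_index():
    """Heroku sets DYNO to values like 'web.3'; the index is the part after the dot."""
    return int(os.environ["DYNO"].split(".")[1])


def web_dyno_quantity(app_name, api_token):
    # app_name and api_token are placeholders for your own values.
    req = urllib.request.Request(
        f"https://api.heroku.com/apps/{app_name}/formation/web",
        headers={
            "Accept": "application/vnd.heroku+json; version=3",
            "Authorization": f"Bearer {api_token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["quantity"]
```

Together these give you both the current dyno's index and the total number of web dynos, which is enough to, say, partition work deterministically across dynos.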


