Cron and virtualenv
You should be able to do this by using the python
in your virtual environment:
/home/my/virtual/bin/python /home/my/project/manage.py command arg
EDIT: If your Django project isn't on the PYTHONPATH, you'll need to switch to the right directory first:
cd /home/my/project && /home/my/virtual/bin/python ...
You can also try to log the failure from cron:
cd /home/my/project && /home/my/virtual/bin/python /home/my/project/manage.py command arg > /tmp/cronlog.txt 2>&1
Another thing to try is to point the shebang line at the very top of your manage.py script at the virtualenv's python:
#!/home/my/virtual/bin/python
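Putting that together, a crontab entry might look like this (the schedule, paths, and command name are illustrative placeholders, matching the ones above):

```shell
# Run the management command every day at 02:30, using the venv's python
# directly so no activation step is needed. Paths are placeholders.
30 2 * * * cd /home/my/project && /home/my/virtual/bin/python manage.py command arg >> /tmp/cronlog.txt 2>&1
```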
How to set virtualenv for a crontab?
If you're using "workon" you're actually using virtualenvwrapper, which is another layer of abstraction that sits on top of virtualenv. A plain virtualenv can be activated by cd'ing to its root directory and running:
source bin/activate
workon is a command provided by virtualenvwrapper, not virtualenv, and it does some additional stuff that is not required for plain virtualenv. All you really need to do is source the bin/activate file in your virtualenv's root directory to "activate" it.
You can set up your crontab to invoke a bash script which does this:
#! /bin/bash
cd my/virtual/env/root/dir
source bin/activate
# virtualenv is now active, which means your PATH has been modified.
# Don't try to run python from /usr/bin/python, just run "python" and
# let the PATH figure out which version to run (based on what your
# virtualenv has configured).
python myScript.py
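The crontab entry then only needs to point at that wrapper script (the path and schedule below are made up for illustration); just remember to make the script executable first:

```shell
# One-time setup: make the wrapper executable.
#   chmod +x /home/me/cron_wrapper.sh
# Then schedule it, e.g. every hour on the hour, logging output:
0 * * * * /home/me/cron_wrapper.sh >> /tmp/cron_wrapper.log 2>&1
```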
Cron activate virtualenv and run multiple python scripts from shell script
Remember that & means to run the entire preceding command asynchronously, and that includes anything joined with &&. Commands that run asynchronously run in separate processes.
To take a simplified example of your problem, let's say we asynchronously change directories, run pwd, and then asynchronously run pwd again:
#!/bin/sh
cd / && \
pwd \
& pwd
On my computer, this outputs:
/home/nick
/
The cd / was meant to affect both pwd calls, but it only affected the first one, because the second one runs in a different process. (They also printed out of order in this case, the second one first.)
So, how can you write this script in a more robust fashion?
First, I would turn on strict error handling with set -e. This exits as soon as any (non-asynchronous) command returns a non-zero exit code. Second, I would avoid the use of &&, because strict error handling already deals with this. Third, I would use wait at the end to make sure the script doesn't exit until all of the sub-scripts have exited.
#!/bin/sh
set -e
cd /
pwd &
pwd &
wait
The general idea is that you turn on strict error handling, do all of your setup in a synchronous fashion, then launch your four scripts asynchronously, and wait for all to finish.
To apply this to your program:
#!/bin/sh
set -e
cd /home/ubuntu/virtualenvironment/scripts
# "source" is a bash-ism; under #!/bin/sh use the portable "." command.
. /home/ubuntu/venv3.8/bin/activate
python script1.py &
python script2.py &
python script3.py &
python script4.py &
wait
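You can check the launch-in-parallel-then-wait pattern in isolation with harmless placeholder commands standing in for the python scripts (the echo lines and temp files here are invented purely for the demonstration):

```shell
#!/bin/sh
set -e
out1=$(mktemp)
out2=$(mktemp)
# Two placeholder "scripts" run in parallel, like the python calls above.
echo "script1 done" > "$out1" &
echo "script2 done" > "$out2" &
# wait blocks until both background jobs have exited.
wait
result=$(cat "$out1" "$out2")
echo "$result"
rm -f "$out1" "$out2"
```

Because wait runs before the files are read, both lines are guaranteed to be present, regardless of which background job finished first.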