Is 'Eval' Supposed to Be Nasty?

Is 'eval' supposed to be nasty?

If you are eval'ing a string submitted by, or modifiable by, the user, this is tantamount to allowing arbitrary code execution. Imagine if the string contained an OS call to rm -rf / or similar. That said, in situations where you know the strings are appropriately constrained, or your Ruby interpreter is sandboxed appropriately, or ideally both, eval can be extraordinarily powerful.

The problem is analogous to SQL injection, if you're familiar with that, and the solution is similar to the solution to the injection problem (parameterized queries). That is, if the statements you would like to eval are known to be of a very specific form, and only small pieces (a few variables, a math expression, or similar) need to come from the user, you can take in those small pieces, sanitize them if necessary, then evaluate the safe template statement with the user input plugged in at the appropriate places.
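For example, here is a minimal sketch of that idea in Ruby. The method name, the "+ 10 then double" template, and the validation rule are all invented for illustration; the point is that the statement's shape is fixed and only a vetted number from the user is interpolated into it.

# Hypothetical helper: the user supplies only a number, which is validated
# before being plugged into a fixed template.
def eval_scaled(user_value)
  raise ArgumentError, "not a number: #{user_value}" unless user_value =~ /\A\d+\z/

  eval("(#{user_value} + 10) * 2")   # the template is fixed; only the vetted value varies
end

eval_scaled("5")                      # => 30
eval_scaled("5; system('rm -rf /')")  # raises ArgumentError instead of executing anything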

Why is using the JavaScript eval function a bad idea?

  1. Improper use of eval opens up your code for injection attacks.

  2. Debugging can be more challenging (no line numbers, etc.).

  3. eval'd code executes more slowly (no opportunity to compile/cache eval'd code).

Edit: As @Jeff Walden points out in the comments, #3 is less true today than it was in 2008. However, while some caching of compiled scripts may happen, it is limited to scripts that are eval'd repeatedly with no modification. A more likely scenario is that you are eval'ing scripts that undergo slight modification each time, and as such cannot be cached. Let's just say that SOME eval'd code executes more slowly.

When is `eval` in Ruby justified?

The only case I know of (other than "I have this string and I want to execute it") is dynamically dealing with local and global variables. Ruby has methods to get the names of local and global variables, but it lacks methods to get or set their values based on these names. The only way to do that, AFAIK, is with eval.
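Roughly, that looks like the sketch below (the exact behavior depends on the binding in play, so treat it as illustrative):

count = 1
name  = "count"

local_variables       # => [:count, :name] (order may vary)
eval(name)            # => 1, reading the local variable by name
eval("#{name} = 2")   # reassign the existing local through its name
count                 # => 2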

Any other use is almost certainly wrong. I'm no guru and can't state categorically that there are no others, but in every other case I've seen where somebody said "you need eval for this," I've found a solution that didn't need it.

Note that I'm talking about string eval here, by the way. Ruby also has instance_eval, which can take either a string or a block to execute in the context of the receiver. The block form of this method is fast, safe and very useful.
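To illustrate the block form (the class and instance variable here are invented for the example):

class Greeter
  def initialize(name)
    @name = name
  end
end

g = Greeter.new("world")

# Block form: runs in the context of the receiver; no string parsing, no injection risk.
g.instance_eval { @name }    # => "world"

# String form works too, but carries the usual eval caveats.
g.instance_eval("@name")     # => "world"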

Why should exec() and eval() be avoided?

There are often clearer, more direct ways to get the same effect. If you build a complex string and pass it to exec, the code is difficult to follow, and difficult to test.

Example: I wrote code that read in string keys and values and set corresponding fields in an object. It looked like this:

for key, val in values:
    fieldName = valueToFieldName[key]
    fieldType = fieldNameToType[fieldName]
    if fieldType is int:
        s = 'object.%s = int(%s)' % (fieldName, val)
    # Many clauses like this...

    exec(s)

That code isn't too terrible for simple cases, but as new types cropped up it got more and more complex. When there were bugs they always triggered on the call to exec, so stack traces didn't help me find them. Eventually I switched to a slightly longer, less clever version that set each field explicitly.

The first rule of code clarity is that each line of your code should be easy to understand by looking only at the lines near it. This is why goto and global variables are discouraged. exec and eval make it easy to break this rule badly.

Why should eval be avoided in Bash, and what should I use instead?

There's more to this problem than meets the eye. We'll start with the obvious: eval has the potential to execute "dirty" data. Dirty data is any data that has not been rewritten as safe-for-use-in-situation-XYZ; in our case, it's any string that has not been formatted so as to be safe for evaluation.

Sanitizing data appears easy at first glance. Assuming we're throwing around a list of options, bash already provides a great way to sanitize individual elements, and another way to sanitize the entire array as a single string:

function println
{
    # Send each element as a separate argument, starting with the second element.
    # Arguments to printf:
    #   1 -> "$1\n"
    #   2 -> "$2"
    #   3 -> "$3"
    #   4 -> "$4"
    #   etc.

    printf "$1\n" "${@:2}"
}

function error
{
    # Send the first element as one argument, and the rest of the elements as a combined argument.
    # Arguments to println:
    #   1 -> '\e[31mError (%d): %s\e[m'
    #   2 -> "$1"
    #   3 -> "${*:2}"

    println '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
    exit "$1"
}

# This...
error 1234 Something went wrong.
# And this...
error 1234 'Something went wrong.'
# Result in the same output (as long as $IFS has not been modified).

Now say we want to add an option to redirect output as an argument to println. We could, of course, just redirect the output of println on each call, but for the sake of example, we're not going to do that. We'll need to use eval, since variables can't be used to redirect output.

function println
{
    eval printf "$2\n" "${@:3}" $1
}

function error
{
    println '>&2' '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
    exit $1
}

error 1234 Something went wrong.

Looks good, right? The problem is that eval causes the command line to be parsed twice (this is true in any shell). On the first pass of parsing, one layer of quoting is removed, and with that quoting removed, some variable content gets executed.

We can fix this by letting the variable expansion take place within the eval. All we have to do is single-quote everything, leaving the double-quotes where they are. One exception: we have to expand the redirection prior to eval, so that has to stay outside of the quotes:

function println
{
    eval 'printf "$2\n" "${@:3}"' $1
}

function error
{
    println '>&2' '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
    exit $1
}

error 1234 Something went wrong.

This should work. It's also safe as long as $1 in println is never dirty.

Now hold on just a moment: I use that same unquoted syntax with sudo all of the time! Why does it work there and not here? Why did we have to single-quote everything? sudo is a bit more modern: it knows to enclose each argument it receives in quotes, though that is an over-simplification. eval simply concatenates everything.

Unfortunately, there is no drop-in replacement for eval that treats arguments like sudo does, as eval is a shell built-in; this is important, as it takes on the environment and scope of the surrounding code when it executes, rather than creating a new stack and scope like a function does.

eval Alternatives

Specific use cases often have viable alternatives to eval. Here's a handy list. command represents what you would normally send to eval; substitute in whatever you please.

No-op

A simple colon is a no-op in bash:

:

Create a sub-shell

( command )   # Standard notation

Execute output of a command

Never rely on an external command. You should always be in control of the return value. Put these on their own lines:

$(command)   # Preferred
`command`    # Old: should be avoided, and often considered deprecated

# Nesting:
$(command1 "$(command2)")
`command "\`command\`"`   # Careful: \ only escapes $ and \ with old style, and
                          # the special case \` results in nesting.

Redirection based on variable

In calling code, map &3 (or anything higher than &2) to your target:

exec 3<&0         # Redirect from stdin
exec 3>&1         # Redirect to stdout
exec 3>&2         # Redirect to stderr
exec 3> /dev/null # Don't save output anywhere
exec 3> file.txt  # Redirect to file
exec 3> "$var"    # Redirect to file stored in $var--only works for files!
exec 3<&0 4>&1    # Input and output!

If it were a one-time call, you wouldn't have to redirect the entire shell:

func arg1 arg2 3>&2

Within the function being called, redirect to &3:

command <&3       # Redirect stdin
command >&3       # Redirect stdout
command 2>&3      # Redirect stderr
command &>&3      # Redirect stdout and stderr
command 2>&1 >&3  # idem, but for older bash versions
command >&3 2>&1  # Redirect stdout to &3, and stderr to stdout: order matters
command <&3 >&4   # Input and output!

Variable indirection

Scenario:

VAR='1 2 3'
REF=VAR

Bad:

eval "echo \"\$$REF\""

Why? If REF contains a double quote, this will break and open the code to exploits. It's possible to sanitize REF, but it's a waste of time when you have this:

echo "${!REF}"

That's right, bash has variable indirection built-in as of version 2. It gets a bit trickier than eval if you want to do something more complex:

# Add to scenario:
VAR_2='4 5 6'

# We could use:
local ref="${REF}_2"
echo "${!ref}"

# Versus the bash < 2 method, which might be simpler to those accustomed to eval:
eval "echo \"\$${REF}_2\""

Regardless, the new method is more intuitive, though it might not seem that way to experienced programmers who are used to eval.

Associative arrays

Associative arrays are implemented intrinsically in bash 4. One caveat: they must be created using declare.

declare -A VAR   # Local
declare -gA VAR # Global

# Use spaces between the parentheses and the contents; I've heard reports of subtle bugs
# on some versions when they are omitted, having to do with spaces in keys.
declare -A VAR=( ['']='a' [0]='1' ['duck']='quack' )

VAR+=( ['alpha']='beta' [2]=3 ) # Combine arrays

VAR['cow']='moo' # Set a single element
unset VAR['cow'] # Unset a single element

unset VAR     # Unset an entire array
unset VAR[@]  # Unset an entire array
unset VAR[*]  # Unset each element with a key corresponding to a file in the
              # current directory; if * doesn't expand, unset the entire array

local KEYS=( "${!VAR[@]}" ) # Get all of the keys in VAR

In older versions of bash, you can use variable indirection:

VAR=( )  # This will store our keys.

# Store a value with a simple key.
# You will need to declare it in a global scope to make it global prior to bash 4.
# In bash 4, use the -g option.
declare "VAR_$key"="$value"
VAR+=( "$key" )
# Or, if your version is lacking +=
VAR=( "${VAR[@]}" "$key" )

# Recover a simple value.
local var_key="VAR_$key" # The name of the variable that holds the value
local var_value="${!var_key}" # The actual value--requires bash 2
# For < bash 2, eval is required for this method. Safe as long as $key is not dirty.
local var_value="$(eval "echo -n \"\$$var_key\"")"

# If you don't need to enumerate the indices quickly, and you're on bash 2+, this
# can be cut down to one line per operation:
declare "VAR_$key"="$value" # Store
var_key="VAR_$key"; echo -n "${!var_key}"          # Retrieve

# If you're using more complex values, you'll need to hash your keys:
function mkkey
{
    local key="`mkpasswd -5R0 "$1" 00000000`"
    echo -n "${key##*$}"
}

local var_key="VAR_`mkkey "$key"`"
# ...

Using eval function in Ruby to call other functions

In this case you may safely use send instead of eval, like in this example:

def processQuestion(question)
  return send("get#{question.command}", question)
end

Just be aware that send may be as dangerous as eval if you do not sanitize your input (question.command in this case).

If possible, apply white-list filtering before calling send (or eval); otherwise someone could pass a command that does something you do not want it to do.
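A minimal sketch of such a white-list (the allowed command names, and the getStatus/getBalance methods they refer to, are invented for this example):

ALLOWED = %w[getStatus getBalance].freeze   # hypothetical command names

def processQuestion(question)
  method_name = "get#{question.command}"
  unless ALLOWED.include?(method_name)
    raise ArgumentError, "unknown command: #{question.command}"
  end

  send(method_name, question)
end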

eval certain regex from file to replace chars in string

While you can write something to parse that file, it rapidly gets complicated because you have to parse regular expressions. Consider /\/foo\\/.

There are a number of incomplete solutions. You can split on whitespace, but this will fail on /foo bar/.

re, replace = line.split(/\s+/, 2)

You can use a regex. Here's a first stab.

match = "/3/ 4".match(%r{^/(.*)/\s+(.+)})

This fails on an escaped /; we need something more complex.

match = '/3\// 4'.match(%r{\A / ((?:[^/]|\\/)*) / \s+ (.+)}x)

I'm going to guess it was not your teacher's intent to have you parsing regexes. For the purposes of the assignment, splitting on whitespace is probably fine. You should clarify with your teacher.


This is a poor data format. It is non-standard, difficult to parse, and has limitations on the replacement. Even a tab-delimited file would be better.

There's little reason to use a non-standard format these days. The simplest thing is to use a standard data format for the file. YAML or JSON are the most obvious choices. For such simple data, I'd suggest JSON.

[
  { "re": "e", "replace": "3" },
  { "re": "l", "replace": "1" }
]

Parsing the file is trivial; use the built-in JSON library.

require 'json'
specs = JSON.parse(File.read("test.json"))

And then you can use them as a list of hashes.

specs.each do |spec|
  # No eval necessary.
  re = Regexp.new(spec["re"])

  # `gsub!` replaces in place; `result` is the string being transformed.
  result.gsub!(re, spec["replace"])
end

The data file is also extensible; for example, you could later add regex options.

[
  { "re": "e", "replace": "3" },
  { "re": "l", "replace": "1", "options": ["IGNORECASE"] }
]
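One way that options field could be consumed, assuming the option names map onto Regexp constants such as IGNORECASE (this extension is a sketch, not part of the original answer):

specs.each do |spec|
  # Turn ["IGNORECASE", ...] into a combined Regexp options bitmask.
  opts = (spec["options"] || []).map { |name| Regexp.const_get(name) }.reduce(0, :|)
  re   = Regexp.new(spec["re"], opts)
  result.gsub!(re, spec["replace"])
end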

While the teacher may have specified a poor format, pushing back on bad requirements is good practice for being a developer.

convert string to ActiveSupport::Duration

Ohhooo, I got the answer on my own. I just needed to add eval, like this:

Date.today + eval('1.month')

It works perfectly fine.

Is there any other way? What are the pros and cons of doing it this way?

This could lead to a remote code execution exploit, e.g. random people doing whatever they want on your servers.
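A safer alternative is to skip eval entirely and white-list the duration units yourself. Here is a minimal sketch, assuming the incoming strings always look like "1.month" and ActiveSupport is loaded; the helper name and unit list are made up for illustration.

require 'active_support/all'

ALLOWED_UNITS = %w[second seconds minute minutes hour hours
                   day days week weeks month months year years].freeze

def parse_duration(str)
  amount, unit = str.split('.', 2)
  raise ArgumentError, "bad duration: #{str}" unless ALLOWED_UNITS.include?(unit)

  Integer(amount).send(unit)   # e.g. 1.month, with no eval involved
end

Date.today + parse_duration('1.month')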


