Rails 3 + Daemons Gem: Exception When Querying Model

There are some intricacies with regard to file descriptors when the process is forked (or spawned, as it's called on Windows).

Try reinstantiating the logger after you call Daemons.run_proc('aeon_server'), for example with Rails.logger = ActiveSupport::BufferedLogger.new('/path/to/log')
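The underlying issue is that file descriptors opened before daemonizing, including the logger's, are not safe to use in the child process. A minimal stdlib-only sketch of the pattern, using Process.fork in place of the daemons gem (the log path and messages here are illustrative):

```ruby
require "logger"

LOG_PATH = "/tmp/forked_worker.log"  # illustrative path

pid = Process.fork do
  # Inside the child: open a fresh logger so we get our own
  # file descriptor instead of one inherited from the parent.
  logger = Logger.new(LOG_PATH)
  logger.info("child #{Process.pid} has its own log handle")
  logger.close
end

Process.wait(pid)
puts File.read(LOG_PATH).include?("own log handle")  # => true
```

With the daemons gem the same idea applies: do the `Logger.new` (or `Rails.logger =`) inside the proc passed to `Daemons.run_proc`, after the fork has happened.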

Rails 3.1 + Daemons gem won't let me access my database

To properly preload the Rails environment (and its dependencies) for the script, start it as follows:

bundle exec my_script_ctl start

How to create a queue from a daemon in a Rails app without getting IOError: closed stream

This problem is related to the question above; you can read more about it here. In my case, I just reassigned Rails' logger and Qu's logger to the same file at the beginning of the daemon's cycle and closed it at the end, and everything works out just fine:

client.track(*hashtags) do |status|
  parser_logger = ActiveSupport::BufferedLogger.new(File.join(Rails.root, "log", "qu.log"))
  Rails.logger = parser_logger
  Qu.configure do |c|
    c.connection = Mongo::Connection.new.db("appname_qu")
    c.logger = parser_logger
  end
  job = Qu.enqueue TweetProcessor, status
  Rails.logger.close
end
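One weakness of opening and closing the logger by hand is that it stays open if the enqueue raises. A slightly safer variant of the same pattern wraps the body in a method with `ensure`; this sketch uses only the stdlib Logger (the path is illustrative, and the Qu/Rails wiring is indicated in a comment):

```ruby
require "logger"

# Open a per-cycle logger, yield it, and close it even on error.
def with_cycle_logger(path)
  logger = Logger.new(path)
  yield logger
ensure
  logger&.close  # runs whether the block returned or raised
end

with_cycle_logger("/tmp/qu_cycle.log") do |logger|
  # In the real daemon: Rails.logger = logger and
  # Qu.configure { |c| c.logger = logger } would go here.
  logger.info("processing status")
end
```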

Stripe Exception double rendering using stripe gem

The retry logic is intended for when your application doesn't know Stripe's response, primarily during network issues like timeouts. In this case your server received a response from Stripe, and one of two things could be happening: either you're retrying the same event that is currently in progress, or the event in progress is actually different from the one you're retrying, but due to some issue in your application stack you chose the same idempotency key for two API requests.

For further details, see https://stripe.com/docs/api/idempotent_requests
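The practical rule is: mint one idempotency key per logical operation and reuse that same key on every retry of that operation; only a genuinely new request gets a new key. A minimal sketch of that bookkeeping (the cache and operation IDs are illustrative; the actual request would go through the stripe gem, passing the key as the idempotency key option):

```ruby
require "securerandom"

# One key per logical operation, reused across retries of that operation.
KEYS = Hash.new { |cache, operation_id| cache[operation_id] = SecureRandom.uuid }

def idempotency_key_for(operation_id)
  KEYS[operation_id]
end

first_try = idempotency_key_for("charge-order-42")
retry_key = idempotency_key_for("charge-order-42")  # retry: same key
other_key = idempotency_key_for("charge-order-43")  # new operation: new key

puts first_try == retry_key  # => true
puts first_try == other_key  # => false
```

In a real application the cache would live somewhere durable (e.g. a column on the order row), so a retry after a process restart still reuses the original key.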

Encoding error with Rails 2.3 on Ruby 1.9.3

I finally figured out what my issue was. While my databases were encoded with utf8, the app with the original mysql gem was injecting latin1 text into the utf8 tables.

What threw me off was that the output from the mysql command-line client looked correct. It is important to verify that your terminal, the database fields, and the MySQL client are all running in utf8.

MySQL's client runs in latin1 by default. You can discover what it is running in by issuing this query:

show variables like 'char%';

If set up properly for utf8, you should see:

+--------------------------+----------------------------+
| Variable_name            | Value                      |
+--------------------------+----------------------------+
| character_set_client     | utf8                       |
| character_set_connection | utf8                       |
| character_set_database   | utf8                       |
| character_set_filesystem | binary                     |
| character_set_results    | utf8                       |
| character_set_server     | utf8                       |
| character_set_system     | utf8                       |
| character_sets_dir       | /usr/share/mysql/charsets/ |
+--------------------------+----------------------------+

If these don't look correct, make sure the following is set in the [client] section of your my.cnf config file:

default-character-set = utf8

And add the following to the [mysqld] section:

# use utf8 by default
character-set-server=utf8
collation-server=utf8_general_ci

Make sure to restart the mysql daemon before relaunching the client and then verify.

NOTE: This doesn't change the charset or collation of existing databases, just ensures that any new databases created will default into utf8 and that the client will display in utf8.

After I did this I saw characters in the mysql client that matched what I was getting from the mysql2 gem. I was also able to verify that this content was latin1 by temporarily switching to "encoding: latin1" in my database.yml.
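You can reproduce the symptom in plain Ruby: take UTF-8 bytes, misread them as latin1, and re-encode, which is exactly what happens when latin1-tagged text lands in a utf8 column. A small illustration:

```ruby
original = "café"                                    # proper UTF-8, "é" is two bytes
raw      = original.dup.force_encoding("ISO-8859-1") # misread those bytes as latin1
mojibake = raw.encode("UTF-8")                       # re-encode: classic double encoding

puts mojibake                                # => "cafÃ©"
puts mojibake.bytesize > original.bytesize   # => true (each misread byte grows)
```

Seeing "Ã©" where you expect "é" is the tell-tale sign of this particular mix-up.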

One extremely handy query for finding problem rows is to compare LENGTH() with CHAR_LENGTH(), which differ only when a value contains multi-byte characters:

SELECT id, name FROM items WHERE LENGTH(name) != CHAR_LENGTH(name);

There are a lot of scripts out there to convert latin1 contents to utf8, but what worked best for me was dumping all of the databases as latin1 and stuffing the contents back in as utf8:

mysqldump -u root -p --opt --default-character-set=latin1 --skip-set-charset  DBNAME > DBNAME.sql

mysql -u root -p --default-character-set=utf8 DBNAME < DBNAME.sql

I backed up my primary db first, then dumped into a test database and verified like crazy before rolling over to the corrected DB.

My understanding is that MySQL's translation can leave some things to be desired with certain more complex characters but since most of my multibyte chars are fairly common things (accent marks, quotes, etc), this worked great for me.

Some resources that proved invaluable in sorting all of this out:

  • Derek Sivers' guide on transforming MySQL data from latin1 to utf8
  • Blue Box article on MySQL character set hell
  • Simple table conversion instructions on Stack Overflow

MongoMapper, MongoDB and EventMachine

It is tough to say what is really happening here, but I have a suspicion that I'd like you to investigate please.

Ensure you are establishing your Mongo connection in a POST-daemonize config, instead of a PRE-daemonize config. DaemonKit forks after the PRE-initializers, so the mongo connection needs to be re-established at that point. This goes for IO of any kind: after forking, it all needs to be re-opened.

The same logic holds true for ActiveRecord in a daemonized project. If this was ActiveRecord you could do something like this:

config/pre-daemonize/database.rb

ActiveRecord::Base.establish_connection(:foo => 'bar')

config/post-daemonize/database.rb

ActiveRecord::Base.verify_active_connections!

I'm not a mongomapper user, so I don't know how to translate the above example.

My best advice is to "ping" mongo in the PRE-daemonization phase and error out if it fails, so the daemon doesn't even start, and then properly set up the connection in a post-daemonize configuration.
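A generic way to do that pre-daemonize "ping" is a plain TCP connect with a timeout, aborting if it fails. This sketch uses only the stdlib Socket; the host, port, and error message are illustrative, and a real check against MongoDB would use the driver's own ping command:

```ruby
require "socket"

# True if something is accepting TCP connections at host:port.
def reachable?(host, port, timeout: 2)
  Socket.tcp(host, port, connect_timeout: timeout) { true }
rescue SystemCallError
  false  # refused, unreachable, or timed out
end

# Pre-daemonize: fail fast so the daemon never starts half-configured.
# abort("mongod is not reachable") unless reachable?("localhost", 27017)
```

The actual connection object is still created post-daemonize; this check only guarantees you fail loudly before forking rather than silently afterwards.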


