What Rails deployment tools do you use for DB management?

Posted on June 8, 2007. Filed under: deployment, mysql, Rails, Ruby, Ruby on Rails |

When it comes to deployment in Rails, the one and only tool that comes to mind is Capistrano. No doubt it is the tool that helps with deploying source code updates to your production server(s). By the way, don’t you just love the Capistrano logo 😉 Anyway, back to what I was trying to talk about – what about tools for DB management? I’m sure some people have come across a situation where they have a huge set of tables of non-user data – say, information on all the hotels in the US, if you’re building a hotel search engine. How do you do the DB migration for that data?

The only way – and the way I’m doing it right now with our MySQL db – is using

mysqldump with several options ..

then scp-ing the dump onto the production DB server, and finally

mysql -u root -p your_db < backup.sql
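If it helps, here is a rough Ruby sketch of that three-step dance. The db name, host, user, and the `--opt` flag are my own assumptions for illustration, not a prescription – adjust to taste:

```ruby
# Build the shell commands for the dump -> copy -> import cycle.
# All names (your_db, db.example.com, deploy user) are hypothetical.
def dump_commands(db, host, user = "root")
  dump   = "mysqldump -u #{user} -p --opt #{db} > backup.sql"
  copy   = "scp backup.sql deploy@#{host}:~/backup.sql"
  import = "mysql -u #{user} -p #{db} < backup.sql"
  [dump, copy, import]
end

dump_commands("your_db", "db.example.com").each { |cmd| puts cmd }
```

Wrapping the commands in a tiny helper like this is mostly useful so the same sequence can later be called from a rake task or a Capistrano recipe instead of being retyped by hand.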

I’m sure if you are a Linux shell-script geek, you’ve already written a script to do the backups of your DB in an orderly manner. But not everyone is that geeky. So as Capistrano gets better with 2.0, maybe it’s time to think about incorporating better DB data and backup management into Capistrano. What do you think? Would you buy that? Or perhaps there is already a good tool out there for doing this – if so, please drop a note. Thanks!
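For what it’s worth, a Capistrano 2.0 recipe for this could be sketched roughly like the one below. The task name, paths, db name, and the `db_password` variable are all assumptions on my part – this is not something Capistrano ships with:

```ruby
# Hypothetical Capistrano 2.0 recipe: dump the DB on the db server,
# then pull the file back locally for safekeeping.
namespace :db do
  desc "Back up the production database (sketch, not battle-tested)"
  task :backup, :roles => :db do
    run "mysqldump -u root -p#{db_password} --opt your_db > /tmp/backup.sql"
    download "/tmp/backup.sql", "backups/backup-#{Time.now.to_i}.sql"
  end
end
```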



2 Responses to “What Rails deployment tools do you use for DB management?”


Why not use Rails built-in database migrations?

I’m assuming that you are referring to loading the data through YAML fixtures.
There’s a good tutorial on how to do it that way.

But if you have a table with hundreds of thousands of records – I’m now dealing with a table with 500,000 records – I would imagine that loading fixtures would take much longer than a straight mysql import. True?
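On the speed question: fixtures insert one row at a time, which is exactly what hurts at 500,000 rows. One middle ground – my own sketch, not something from the comment above – is to batch rows into multi-row INSERT statements before handing them to MySQL. The table and column names here are hypothetical:

```ruby
# Turn an array of rows into multi-row INSERT statements,
# batch_size rows per statement, for faster bulk loading.
def bulk_insert_sql(table, columns, rows, batch_size = 1000)
  rows.each_slice(batch_size).map do |slice|
    values = slice.map { |r| "(" + r.map { |v| "'#{v}'" }.join(", ") + ")" }
    "INSERT INTO #{table} (#{columns.join(', ')}) VALUES #{values.join(', ')};"
  end
end

sql = bulk_insert_sql("hotels", ["name", "city"],
                      [["Hilton", "NYC"], ["Motel 6", "LA"]], 1)
sql.each { |s| puts s }
```

(A real version would need proper value escaping rather than naive quoting, but even this shape cuts the per-row round-trip overhead dramatically compared with fixtures.)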
