Laravel Homestead, database provisioning

I have started working with Homestead for a Laravel project that we are just starting. We have been using Vagrant for a couple of years, and in that time we decided to use Chef for provisioning. Why? Because Chef uses Ruby and Vagrant uses Ruby, so we thought it would be easier to just learn Ruby than yet another language to write the provisioning scripts.

Since Laravel already has this well-known Vagrant machine, instead of reinventing the wheel we decided to just use Homestead rather than create our own Vagrant machines with our own recipes. We are thinking of doing the same with Drupal VM for future Drupal projects.

One of the things we have in our current recipes is a step that provisions the application's database in the Vagrant machine, looking for a fresh backup in our backups repository.

We have a process that runs twice a day and creates backups of all our databases, in every environment. Our Chef recipes look for the latest dump, download it into the Vagrant machine, uncompress it, and import it into the local database. That gives us a good mechanism for having fresh data in our working environments.
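For reference, the twice-daily job can be sketched roughly like this. Everything here is an assumption for illustration (`BACKUP_ROOT`, the database names); the real job would run the `mysqldump` instead of echoing it:

```shell
#!/usr/bin/env bash
# Sketch of a twice-daily backup job; BACKUP_ROOT and the database names are assumptions.
set -euo pipefail

BACKUP_ROOT="${BACKUP_ROOT:-/tmp/dbs}"   # assumed repository root, served over HTTP at /dbs/
STAMP=$(date +%Y%m%d-%H%M)               # yields e.g. 20170725-1502, the directory style used below
mkdir -p "$BACKUP_ROOT/$STAMP"

for DB in mydb otherdb; do               # example database names
    TARGET="$BACKUP_ROOT/$STAMP/$DB.dump.sql.bz2"
    # the real job would run: mysqldump "$DB" | bzip2 > "$TARGET"
    echo "mysqldump $DB | bzip2 > $TARGET"
done
```

Scheduled twice a day from cron (e.g. `0 2,14 * * *`), this leaves one date-stamped directory per run.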

I haven’t found anything like that for Homestead, so I tried to modify the provisioning to add this step to the machine.

I found that Homestead uses plain Ruby and does some cool stuff extending the Vagrantfile, calling a shell script for each task when it is needed.

My first idea was to modify the Homestead.rb file; however, that file lives in the vendor folder (I’m using one Homestead per project), and if I change it there I’ll get in trouble with Composer.

Plan B was to modify the Vagrantfile directly, so that is what I did, using the code in Homestead.rb as a sample.

The first step was to set up some parameters in the Homestead.yaml file. I added an array of database dumps to download, each with the database it should be imported into. I did something like this:

  db_dump:
    - db: mydb
      server: backups.mycompany.com
      dump: mydb.dump
      auth: myuser:mypass

The database repository is accessible via HTTP and requires Basic auth credentials, which makes it very practical to get the dump with just curl or wget (REST for the win); that is why server and auth are needed. db is the name of the database the dump will be imported into, and dump is the name of the dump file (without extensions).

A sample URL to get the database would be:

  http://myuser:[email protected]/dbs/20170725-1502/mydb.dump.sql.bz2

The idea is to build that URL, download the file into the Vagrant machine, uncompress it, and import the SQL dump.

In order to do that, I added these lines to the Vagrantfile to iterate over the config file and fetch the dumps. Add them after the Homestead.configure call.

    # add this after Homestead.configure(config, settings)
    if settings.has_key?("db_dump")
        settings["db_dump"].each do |db|
            config.vm.provision "shell" do |s|
                s.name = "Importing MySQL Database: " + db["db"]
                s.path = "scripts/import-db.sh" # filename is an example; match your script
                s.args = [db["db"], db["server"], db["dump"], db["auth"]]
            end
        end
    end

For each dump, these lines will call the shell script with the parameters defined in the args array.
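To see how the args line up with the script's positional parameters, here is a quick sketch (the function name is illustrative, not part of Homestead):

```shell
# Sketch: the shell provisioner passes args positionally, so inside the
# script $1..$4 map to db, server, dump, and auth in that order.
import_db() {
    DB=$1; SERVER=$2; DUMPFILE=$3; AUTH=$4
    echo "import $DUMPFILE from $SERVER into $DB as $AUTH"
}

RESULT=$(import_db mydb backups.mycompany.com mydb.dump myuser:mypass)
echo "$RESULT"
```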

Now, in the root of the Laravel project, we need to create a scripts folder and add the script there. This script will do the heavy lifting.

#!/usr/bin/env bash

# move the args to named variables (same order as in the Vagrantfile args array)
DB=$1
SERVER=$2
DUMPFILE=$3
AUTH=$4

echo "Downloading dump $DUMPFILE for $DB";

# this dark line gets the date of the latest backup, isn't a genkidama
DATE=`curl --silent "https://$AUTH@$SERVER/dbs/" | grep -e '-' | sed 's@[<a href="]@@g' | sed 's@/>.*@@g' | tail -1`;

# builds the url
URL="https://$AUTH@$SERVER/dbs/$DATE/$DUMPFILE.sql.bz2";

# gets the file to local
wget -q -O /tmp/seed.sql.bz2 "$URL";

# uncompress the file
bunzip2 -f /tmp/seed.sql.bz2

# imports the dump
echo "Importing data to $DB from $DUMPFILE";
mysql -uhomestead -psecret $DB < /tmp/seed.sql
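That "dark line" can be checked in isolation against a canned directory listing: the first sed strips the markup characters around the date-stamped directory names, the second drops everything from the trailing `/>` on, and `tail -1` keeps the newest entry. The listing below is fabricated for the demo:

```shell
# A fake Apache-style index of /dbs/, like the one the script scrapes with curl.
LISTING='<a href="20170724-0300/">
<a href="20170725-1502/">'

# Same pipeline as in the script, fed from the canned listing instead of curl.
DATE=$(printf '%s\n' "$LISTING" | grep -e '-' | sed 's@[<a href="]@@g' | sed 's@/>.*@@g' | tail -1)
echo "$DATE"
```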

That’s it, now my provisioning is complete!

I don’t think you can use this post exactly as-is; however, I hope it inspires you to implement your own mechanism. If you have other ideas on how to provision the database, I’d love to hear them.

Cya in the next post.
