17 December 2011

After explaining which tools are used to generate this static website, how to install them and how to design the templates, this post explains how to automate generating the website and uploading it to Amazon S3.

The automation steps are quite simple but require the installation of another tool for steps 2 and 3. For this I chose the open source tool s3cmd.

  1. generate the static site by running jekyll
  2. synchronize the _site directory created at the previous step with the website bucket on Amazon S3 using s3cmd
  3. backup the local blog directory on Amazon S3, in two different regions (you never know! A future enhancement will be to back it up to another cloud storage provider such as Azure and/or Google)
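The steps above boil down to four command lines. Here is a minimal Python sketch that builds them as argument lists (e.g. to run with subprocess); the bucket names passed in are placeholders, not the real ones:

```python
def build_commands(site_bucket, backup_bucket_us, backup_bucket_eu):
    """Return the command line for each automation step, as argument lists."""
    sync = ["s3cmd", "sync", "--no-progress", "--delete-removed"]
    return [
        ["jekyll"],                                    # 1. generate _site/
        sync + ["_site/", "s3://%s/" % site_bucket],   # 2. publish the site
        sync + ["--exclude", "_site/", "./",           # 3a. backup (US region)
                "s3://%s/blogs/itisopen/" % backup_bucket_us],
        sync + ["--exclude", "_site/", "./",           # 3b. backup (EU region)
                "s3://%s/backups/blogs/itisopen/" % backup_bucket_eu],
    ]
```

Each list can then be handed to subprocess.call; the batch script below does the same thing directly in cmd.exe.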

Installing s3cmd

To be documented…
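Until this section is written up properly, here is the rough shape of what it involves, as a sketch only, assuming the PortablePython layout used by the script below (the archive name is a placeholder):

```
REM Sketch only: unpack the s3cmd archive into PortablePython's Scripts
REM directory, then let s3cmd create its ini file with your AWS keys.
set PYTHONDIR=d:\_blog\tools\PortablePython
REM ... extract s3cmd-1.x into %PYTHONDIR%\App\Scripts ...
call %PYTHONDIR%\App\python %PYTHONDIR%\App\Scripts\s3cmd --configure
```

The --configure step prompts for the AWS access key and secret key and writes the ini file that the script below checks for.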

And finally the script!

Here is the batch script I use. It could be enhanced but it does the job!

@echo off

REM site generation and upload script
set PYTHONDIR=d:\_blog\tools\PortablePython
set BLOGROOT=d:\_blog\itisopen

set S3CMD_INI=%AppData%\s3cmd.ini
if not exist "%S3CMD_INI%" (
  echo ERROR: s3cmd ini file not found. Please run 's3cmd --configure' first
  exit /b 1
)

echo Generating site with jekyll
call jekyll

echo Synchronizing website
call %PYTHONDIR%\App\python %PYTHONDIR%\App\Scripts\s3cmd sync --no-progress --delete-removed _site/ s3://www.itisopen.net/

echo Backing up blog on S3 (USA)
call %PYTHONDIR%\App\python %PYTHONDIR%\App\Scripts\s3cmd sync --no-progress --delete-removed --exclude _site/ ./ s3://<bucket US region>/blogs/itisopen/

echo Backing up blog on S3 (EU)
call %PYTHONDIR%\App\python %PYTHONDIR%\App\Scripts\s3cmd sync --no-progress --delete-removed --exclude _site/ ./ s3://<bucket EU region>/backups/blogs/itisopen/
