Upload Behat screenshot to Imgur on scenario step failure

Being able to save a screenshot from Behat via one of its drivers is very useful for finding out what went wrong, especially on headless browsers such as PhantomJS. Previously I saved these to the filesystem as per a handy UCSF guide, which I updated myself for Behat 3.

However, if you do not have easy access to the filesystem where Behat is being run, such as on your Continuous Integration server (especially container-based solutions like Travis, where the filesystem is destroyed after each build), it can be handy to upload the screenshot somewhere and output the URL in your build logs. This is a quick solution for uploading to Imgur.

You will need to create an Imgur account and register your application to obtain a client ID.

Please note this is a quick and dirty example: ideally it would use Guzzle to talk to the Imgur API and take its configuration externally, but since people may want to use different image hosting services or methods I have left it fairly basic for now.
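As a sketch, the failed-step hook from the screenshot post can be extended to POST the image to Imgur's v3 API instead of writing it to disk. YOUR_CLIENT_ID is a placeholder for the client ID you registered, and the class name and base class are assumptions about your context setup:

```php
use Behat\Behat\Hook\Scope\AfterStepScope;
use Behat\Testwork\Tester\Result\TestResult;

class FeatureContext extends \Behat\MinkExtension\Context\MinkContext
{
    /**
     * @AfterStep
     */
    public function uploadScreenshotOnFailure(AfterStepScope $scope)
    {
        // Only act on failed steps
        if ($scope->getTestResult()->getResultCode() !== TestResult::FAILED) {
            return;
        }
        // Imgur's v3 API accepts a base64-encoded image in the "image" field,
        // authorised with your registered application's client ID
        $ch = curl_init('https://api.imgur.com/3/image');
        curl_setopt_array($ch, array(
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => array('Authorization: Client-ID YOUR_CLIENT_ID'),
            CURLOPT_POSTFIELDS     => array(
                'image' => base64_encode($this->getSession()->getScreenshot()),
            ),
        ));
        $response = json_decode(curl_exec($ch), true);
        curl_close($ch);
        // The hosted image URL ends up in your build log
        echo 'Screenshot uploaded: ' . $response['data']['link'] . PHP_EOL;
    }
}
```

Echoing the link means it appears in the build output next to the failed step, which is all you need on a throwaway CI container.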

Save a screenshot in Behat 3 on a failed scenario step

If you are using Behat with a Selenium server running locally, it is firing up real browser instances via WebDriver, so it is easy to see what may be going wrong. However, a headless browser such as PhantomJS makes this impossible, unless you take screenshots when an error has occurred (a failed step in your scenario).

I used a handy UCSF guide for taking screenshots with Behat 2 and updated it for Behat 3.
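A minimal Behat 3 version of that hook might look like the following. This is a sketch: the class name, base class, and output path are assumptions about your setup:

```php
use Behat\Behat\Hook\Scope\AfterStepScope;
use Behat\Mink\Driver\Selenium2Driver;
use Behat\Testwork\Tester\Result\TestResult;

class FeatureContext extends \Behat\MinkExtension\Context\MinkContext
{
    /**
     * @AfterStep
     */
    public function saveScreenshotOnFailure(AfterStepScope $scope)
    {
        // Only act on failed steps
        if ($scope->getTestResult()->getResultCode() !== TestResult::FAILED) {
            return;
        }
        // Screenshots need a driver that actually renders pages
        // (Selenium2Driver covers both Selenium and PhantomJS via WebDriver)
        if (!$this->getSession()->getDriver() instanceof Selenium2Driver) {
            return;
        }
        $filename = sprintf('/tmp/behat-failure-%s.png', date('YmdHis'));
        file_put_contents($filename, $this->getSession()->getScreenshot());
        echo 'Screenshot saved to ' . $filename . PHP_EOL;
    }
}
```

The main change from Behat 2 is that hooks receive a scope object (AfterStepScope) rather than an event, and the result code comes from the TestResult interface.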

Fix SSL certificate problem using Behat, Guzzle and Goutte on localhost

If you have an application that you are trying to test locally (such as on a Vagrant VM) or on a development server that has a self-signed certificate, Behat will probably complain of an SSL certificate problem because of an invalid certificate chain (GuzzleHttp\Exception\RequestException).

This is because Guzzle (the http client used by Goutte, the default Mink driver) believes the connection to be insecure, which technically it is, but for testing purposes we can ignore this.

If your application (or stack) is configured for https only at all times, this means you can’t switch to http to test, since that would be testing a configuration that is not representative of your live environment. To circumvent this you can disable SSL verification for cURL in Guzzle.
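With the Behat MinkExtension this can be done from behat.yml. The exact parameter name depends on the Guzzle version your Goutte driver uses, so treat this as a sketch:

```yaml
default:
  extensions:
    Behat\MinkExtension:
      base_url: https://localhost
      sessions:
        default:
          goutte:
            # Guzzle 4+: disable certificate verification entirely
            guzzle_parameters:
              verify: false
```

For older Guzzle 3 setups the equivalent is setting `ssl.certificate_authority: false` (or the `CURLOPT_SSL_VERIFYPEER` cURL option) in the same place. Keep this confined to a test profile so it never leaks into anything production-facing.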

JavaScript redirects double-encoding GET query parameters such as arrays [] %255B %255D

If you are passing query parameters such as arrays, Apache may double-encode them, so that %5B and %5D become %255B and %255D, in submitted URLs such as:

?a[b]=1&a[c]=2&d[]=3&d[]=4&d[2][e]=5

See http://api.jquery.com/jquery.param for how jQuery serialises parameters like these.

If you are making Ajax or API calls from a JavaScript library (such as jQuery or Angular), the library will encode the URI initially; the web server (such as Apache) will then 301 or 302 redirect to the new URI, and the library will encode it again, resulting in double-encoded characters.
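You can see the effect of encoding twice with plain encodeURIComponent:

```javascript
// First encoding: "[" and "]" become %5B and %5D
var once = encodeURIComponent('a[b]');
console.log(once); // "a%5Bb%5D"

// Encoding the already-encoded URI: "%" itself becomes %25,
// so %5B turns into %255B
var twice = encodeURIComponent(once);
console.log(twice); // "a%255Bb%255D"
```

The server then receives literal `%255B` / `%255D`, which decodes back to `%5B` / `%5D` rather than to the brackets your application expects.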

If you have redirect rules such as http to https or trailing slash to non-trailing slash then you will need to update the URIs you are making requests to in the Javascript calls.

I haven’t confirmed yet whether Nginx or other web servers are affected by this.

Use Homebrew PHP with Mac OS X built-in Apache

To use the version of PHP you installed with Homebrew, you will need to change the PHP extension that Apache is loading. If you are using the built-in version of Apache, it will load the built-in version of PHP even after you have installed PHP with brew.

brew install php56

You need to edit the Apache config file /etc/apache2/httpd.conf and search for libphp5.so

vim /etc/apache2/httpd.conf

Then change this to point to the new brew PHP extension

#LoadModule php5_module libexec/apache2/libphp5.so
LoadModule php5_module [new-extension]

The location of the new brew PHP extension will differ depending on your setup, but you will be able to find it using brew

brew info php56
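For example, on a typical Homebrew layout the module ends up somewhere like the path below, but treat this as an illustration and use whatever brew info reports for your machine:

```
LoadModule php5_module /usr/local/opt/php56/libexec/apache2/libphp5.so
```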

Use CORS for REST API via XDomain proxy in IE8, IE9 for JavaScript/Angular

Because Internet Explorer 8 and 9 don’t support CORS properly (no custom headers such as an API key, only GET/POST, etc.), you may have problems using a modern REST API. However, Jaime Pillora came up with a great solution called XDomain, which acts as a pure JavaScript proxy for your API calls to get around CORS. This will only work if you have control over both the API you’re consuming and the JavaScript application.

Put this page in the web root of your API…
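Based on XDomain's documentation, the proxy page is just a stub that loads xdomain.js and whitelists your application's origin. The paths and origins below are placeholders:

```html
<!-- proxy.html, served from the web root of your API domain -->
<!-- "master" whitelists the origin that is allowed to proxy through this page -->
<script src="/js/xdomain.min.js" master="http://app.example.com"></script>
```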

All you need to do to get this set up is to make sure you include it before any other JavaScript (such as Angular or jQuery) that will use IE’s XDomainRequest, as it acts as a drop-in replacement for it.

Your application would then look something like this…
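Something along these lines, again with placeholder origins; the key point is that xdomain.min.js is included first, with the slave attribute pointing at the proxy page on your API domain:

```html
<script src="/js/xdomain.min.js" slave="http://api.example.com/proxy.html"></script>
<script src="/js/jquery.min.js"></script>
<script>
  // In IE8/9 this request is now transparently routed through the proxy page;
  // in other browsers XDomain steps aside and normal CORS applies
  $.getJSON('http://api.example.com/things', function (data) {
    console.log(data);
  });
</script>
```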

Using BrowserStack or Modern.ie with the (limited) IE developer tools, you will either see some debug messages from XDomain, or, if you see “Access Denied”, it will be because XDomain can’t connect to your proxy page (check that the http/https protocols match and that the page is accessible on your API from where your JavaScript application is hosted).

Using strings in SQL select clause in CodeIgniter DB helper

If you need to use literal strings in your select clause (such as a date format) using the CodeIgniter DB helper, you will need to pass false as the second parameter:

$this->db->select(
    'DATE_FORMAT(date, "%Y-%m-%d") as something_date', 
    false
);

Quick way to enable MySQL General Query Log without restarting

If you want to quickly enable the MySQL General Query Log without restarting MySQL Server, you can run a couple of queries to start outputting all queries to a file on disk.
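The queries are below; the log file path is just where I chose to write it, and the MySQL user needs sufficient privileges (SUPER) to set global variables:

```sql
-- Write every statement to a file and switch the log on
SET GLOBAL general_log_file = '/var/log/mysql/all.log';
SET GLOBAL general_log = 'ON';

-- When you are finished debugging
SET GLOBAL general_log = 'OFF';
```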

You can then run a command like “tail -f /var/log/mysql/all.log” to see the queries appear as your application runs. Note: never enable this on anything other than a development environment, and make sure to turn it off once you have finished debugging, or the disk will fill up.

How to solve file exceeds GitHub’s file size limit of 100 MB

You may encounter a problem with GitHub if you are using it as a new remote (or a pull request for a different remote) whereby a file in the history is over 100MB. This means that although your working copy doesn’t have the file, it was there at some point in time.

remote: error: File dump.sql is 221.82 MB; this exceeds GitHub's file size limit of 100 MB

Therefore you will need to find out where in the history the file exists, and rewrite history so that it doesn’t. This means you will need to git rebase and remove the file from those commits.

First find out which commits the file exists in

git log --all -- dump.sql

Then, for each of those commits, check which files were touched

git show --name-only xxxx

This gets slightly complicated if the commits contain other modified files, in which case you may need something like David Underhill’s script

Then rebase from the commit before the first one you need to remove. An easy way to do this is to use an interactive rebase

git log
git rebase -i xxxx

You may end up with commits that deleted the file later, which are now empty and so effectively no longer required. You can keep them as part of the history using

git commit --allow-empty
git rebase --continue

You should then have a copy of your repository with that file no longer present. As mentioned, this is a simple way to do it, but if you need to untangle a file from larger commits you may need the script linked above
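As a sketch of the sort of thing the linked script automates, git filter-branch with an index filter can strip a file from every commit in one go. This demo runs against a throwaway repository, with dump.sql standing in for your oversized file:

```shell
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you

# Build a small history: one commit with the big file, one without
printf 'pretend this is 221MB\n' > dump.sql
git add dump.sql && git commit -qm 'add dump'
printf 'real code\n' > app.txt
git add app.txt && git commit -qm 'add app'

# Rewrite every commit, removing dump.sql from the index if present
git filter-branch -f --index-filter \
    'git rm --cached --ignore-unmatch dump.sql' HEAD

# filter-branch keeps a backup under refs/original/; drop it so the
# old commits are no longer reachable
rm -rf .git/refs/original/

git log --all --oneline -- dump.sql   # prints nothing: the file has left history
```

Without --prune-empty the commit that only added dump.sql survives as an empty commit, which matches the note above about keeping empty commits in the history.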
