Watch Me Get Grumpy -- Zend Expressive Doctrine Configuration

March 17th, 2019

I am in the process of starting the dreaded Rewrite Of An Existing Application That Works. In this case, it is time I turned OpenCFP from an install-it-yourself web application into a Software-As-A-Service offering.

As part of this rewrite I have decided to use CQRS and Event Sourcing instead of the traditional CRUD-backed-with-a-DB architecture that most of the web is built on.

I believe that an application that has so many domain-specific events associated with it will benefit greatly from the ideas underpinning CQRS and Event Sourcing. Anyway, the architecture is not up for debate since I'm the one doing it!

This app is going to replace what I already created at OpenCFP Central and I will cut over to this new one once I have implemented the two existing features:

  • allowing people to register accounts
  • allowing people who are running OpenCFP to use OpenCFP Central for single-sign-on

Because there is so much reworking to be done with the OpenCFP code base to make it a SaaS capable of hosting multiple events, I felt it was better to start fresh with the code base. Especially because I now need to add all the CQRS+ES implementation.

The existing version of the application is a standard CRUD-backed-with-a-DB app built using Laravel. My research into adding CQRS+ES led me to believe that I did not have the requisite knowledge of the framework to make it work. Laravel is great in that it has lots of packages and add-ons that let you quickly build something, but I felt that was not going to help me in this case. Laravel is good! Just not a great fit for someone with my level of expertise with it.

So I decided to use Zend Expressive as the framework to build this app. My online network of friends includes many people who have used it, and one of the best and most thorough examples of how to build an application using CQRS+ES was done by Marco Pivetta: it is backed by Zend Framework and uses Prooph for the CQRS+ES functionality.

(As an aside, using Zend Expressive has reminded me how much I have relied on 'batteries included' frameworks in recent years. Forcing myself to also write glue code is actually a good thing for me.)

So, I knew the framework, I knew what I could use for CQRS+ES. Now it was time to install some other tools to help me build out this version of OpenCFP.

I was going to require some sort of tool to create database migrations as the app gets built. I was also leaning towards not using an ORM but instead something like Doctrine DBAL, so I decided to use Doctrine Migrations since it can be used with or without the ORM.

I found some great examples of how to set things up...and it just wouldn't work for me. The steps seemed straightforward, and I highly recommend watching Adam Culp's Beachcasts tutorial on configuring Doctrine ORM and DBAL. I had my database configured and working. I added the code to allow the Zend Service Manager to locate Doctrine as required. The examples said "this should work just fine with DBAL." I had the 'migrations.php' and 'migrations-db.php' files and it Just Wouldn't Work.

Until I realized the one critical thing I had hand-waved away and not thought anything of -- environment variables.

The app is going to be deployed to Heroku, where I can set environment variables that can be accessed by code, both in a CLI and web environment. I use environment variables in my work at the day job so why wouldn't I do that here?

This is what my 'migrations-db.php' file looked like:

<?php
declare(strict_types=1);

return [
    'driver' => 'pdo_pgsql',
    'dbname' => \getenv('DB_DATABASE'),
    'user' => \getenv('DB_USER'),
    'password' => \getenv('DB_PASSWORD'),
    'host' => \getenv('DB_HOST')
];

When I would run the migration tool it would spit out errors telling me it could not read the database configuration file and a bunch of other noise that just made me grumpier and grumpier as I struggled to figure out what was wrong.

Eventually I decided to see what was actually inside those environment variables. To my surprise, they were empty! Ugh. But I knew what I could do to fix it. I would make use of Vance Lucas' dotenv tool to make sure the contents of my own '.env' file would be available.

After installing it using Composer as per the documentation, I added this code to my 'migrations-db.php' file:

use Dotenv\Dotenv;

if (file_exists(__DIR__ . '/.env')) {
    $dotenv = Dotenv::create(__DIR__);
    $dotenv->load();
}

Now the migrations tool worked just fine, and I was on my way towards the first step of the app -- building the user registration system and making sure authentication worked correctly.

If you have any comments or suggestions, please reach out to me via Twitter (my preferred way) or you can email me at chartjes@grumpy-learning.com.

Docker as a testing tool

January 31st, 2019

Yes, Docker can be used as a tool to help you out with some testing problems. Let me show an example of how it makes some load testing easier.

One of the projects I support at Mozilla is the push notification service that we run. Among the tests that I do for them is a load test using an internal tool that the developers of the service created. It is a Twisted application that runs using PyPy.

No, I do not know why they made those choices. Furthermore, it doesn't matter because I have to use it, so I roll with the weirdness.

Installing it locally on my MacBook Pro was straightforward -- the docs even cover expected weirdness with some support libraries. But now that I have switched over to using Windows at work I ran into some difficulties getting all the dependencies installed in Windows Subsystem for Linux.

In fact, one of the developers uses Windows...and told me how he had failed to get it to work. "I just use VMware Player and an Ubuntu VM".

I tried to get that working...and ran into a bunch of issues where I had to disable something called "Windows Defender Credential Guard", which bugged me because I normally like to keep security mechanisms in place. I followed instructions FROM MICROSOFT on how to do it and VMware Player still wouldn't run. It kept giving me the same error.

Okay, on to plan B -- Docker.

The first question was "how do I make this work?". I theorized I needed to do something like this:

  • find a base container of Ubuntu 18
  • get inside a running version of that image
  • install all the required dependencies
  • verify that the load testing tool works
  • make a copy of that container
  • push that copy up to Docker Hub for teammates to use

Some searching revealed that I could get a running Ubuntu container and connect with it:

docker run --rm -it ubuntu

This downloaded the image and gave me access to it via a Bash shell.

With that shell, I started installing all the packages I would need to install the load testing tool, including adding the package repository for PyPy so I could install it with the package manager.
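The in-container setup looked roughly like this. This is a sketch from memory: the exact package list and the repository providing PyPy are illustrative, so check the tool's own docs for the real list.

```shell
# run inside the Ubuntu container (package names are illustrative)
apt-get update
apt-get install -y git build-essential libffi-dev libssl-dev software-properties-common

# add a package repository that provides PyPy, then install it
add-apt-repository -y ppa:pypy/ppa
apt-get update
apt-get install -y pypy
```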

Once I got all the dependencies installed, I made sure that the tests could run. I even found out that there was a bug in the documentation. ;)

With a Docker image that had a working installation of the load testing tool, the next step turned out to be more straightforward than I had thought.

The Docker feature I needed was docker commit, which snapshots a running container into a new image. Here is what I did.

With the Docker container still running, I opened up another shell (in this case, PowerShell) and used the command docker ps to get the name Docker had assigned to my running container.

With that name I used the following command to save a version of that container:

docker commit name_of_container ubuntu:ap-loadtester 

Then I tagged the image so I could push it up to Docker Hub under my account:

docker tag ubuntu:ap-loadtester chartjes/ap-loadtester

Last, I pushed it up to Docker Hub:

docker push chartjes/ap-loadtester

So now I have:

  • an Ubuntu image I can share
  • that has all the dependencies installed
  • and that teammates can use to load test the push service
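Teammates can then grab the image and get the same working environment. The flags here are a sketch of typical usage, not a documented invocation for this particular image:

```shell
# pull the shared image and open an interactive shell inside it
docker pull chartjes/ap-loadtester
docker run --rm -it chartjes/ap-loadtester bash
```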

Again, I emphasize that a tester with some programming experience can create some really useful testing tools by leveraging the same tools and environments used to build whatever they are testing.

Learn To Test Like A Grumpy Programmer - Part 2

January 16th, 2019

(If you use PHP, you can learn how to write automated tests for your code via my "Learn To Test Like A Grumpy Programmer" course over at LeanPub.)

In this blog post I wanted to talk about some of the lessons I've learned at the day job about how to use tools and techniques we are familiar with in the developer world to make our job of testing things easier.

Mozilla's Push Notification Service

One of the projects I do QA work for is the push notification service that Mozilla runs. Yes, I know a lot of people get supremely annoyed by push notifications. Yes, they are heavily abused by people. Mozilla tries to use them in a way not designed to annoy you. But this is the Internet after all, where everything annoys somebody somehow.

My work for that team involves doing testing of the service whenever a new version is ready to be released. The process goes like this:

  • the development team tags a new release along with a changelog
  • the operations team deploys this new release to the staging environment
  • I create a BugZilla ticket to track the deployment and testing results
  • I run a series of tests against the service on staging, recording my progress there

If all my tests pass on staging, I give approval to deploy that version of the service to production. The process for that is:

  • the operations team deploys the new release to production
  • I create another ticket to track the deployment process and testing results
  • I run the same tests and add an additional set of load tests to make sure the service is responsive

So what tests do I run?

API Contract Tests

I wrote some tests using pytest that make API calls using known values and verify that we are getting the responses that we expect.

These tests usually are not difficult to write and I did experiment with making them asynchronous using pytest-asyncio so I could learn how asynchronous code works in Python. If you had a very large test suite, I could see it being useful to speed up the process. This particular test suite is not big enough to warrant that.
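The shape of those contract tests looks something like this sketch. The response keys and sample body are invented for illustration; they are not the real push service contract:

```python
# A sketch of an API contract check in the style of a pytest suite.
# The keys and sample body below are hypothetical, not the real contract.
def assert_contract(body, required_keys):
    """Fail loudly if the response body is missing any expected key."""
    missing = set(required_keys) - set(body)
    assert not missing, "response missing keys: %s" % sorted(missing)

def test_register_contract():
    # in the real suite this body would come from an HTTP call to staging
    body = {"uaid": "abc123", "channelID": "chan-1", "endpoint": "https://example.com/push"}
    assert_contract(body, {"uaid", "channelID", "endpoint"})
```

The point is that the expected values are pinned down in one place, so a contract change in a new release fails the suite immediately instead of surfacing later in manual testing.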

For my fellow PHP folks, there are some plugins for PHPUnit that can run your tests in parallel. For one example, check out PHPChunkit.

Load Tests

I only run the load tests on the production version of the service because the number of nodes that handle push notification requests is higher in production. These tests are designed to put some non-trivial load onto the system and examine the output from the nodes for any error messages.

VAPID Testing

I use two different Android devices for these tests. Each device has the latest stable and Nightly releases of Firefox on it. One device is configured to point at staging, the other at production.

A web page is loaded that uses JavaScript to generate VAPID-based notifications.

In the summer of 2018 I experimented with automating these tests using an online service that offers cloud-accessible Android images. I made some progress, but it seems the pieces needed to do things like automatically accept the browser's notification prompt are either really brittle or don't exist, depending on which automation tools you use.

Desktop Notification Tests

The person who did this testing work before me created a Node.js app that serves up a web page with some forms and buttons on it. You fill in some values and it creates a series of push notifications that you visually verify. He then put it inside a Docker container to make it easier for others to run those tests.

This test used to be on a personal web page of a (now-former) Mozilla employee.

Desktop WebPush Tests

These tests make sure that a feature of the service works correctly: grouping WebPush messages together by topic so that only the last one is displayed.

The process for these tests is:

  • check out a GitHub repo that contains the test
  • configure everything according to the documentation
  • start up the "topic server" via the CLI, which serves up a one-page Cyclone web application
  • click on a button to fire up a service worker that receives WebPush notifications
  • use another CLI tool to fire off notifications (with or without a topic)
  • visually verify that the notifications appeared

Given that I already had a Docker-based solution, I decided it was time to turn this test into something where I only have to click a few buttons instead of running things via the CLI.

The first step was to update the existing one-page app to have two more buttons. So I edited the HTML to add the buttons and then added some JavaScript that made calls to two new URLs in the app.

I then added those two URLs to the Cyclone app, porting over code from the CLI "topic pusher" tool to generate the notifications.
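The core idea behind the ported code can be sketched like this. Per the WebPush spec (RFC 8030), messages that share a Topic header replace one another, so only the newest is delivered. The function name and default TTL here are my own invention, not the real tool's code:

```python
from typing import Optional

def push_headers(ttl: int = 60, topic: Optional[str] = None) -> dict:
    """Build headers for a WebPush send request (a hypothetical helper)."""
    headers = {"TTL": str(ttl)}
    if topic is not None:
        # messages sharing a Topic collapse; only the last one is shown
        headers["Topic"] = topic
    return headers
```

One button posts without a topic, so every notification shows; the other posts several with the same topic, so only the last one should appear.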

There was already a Dockerfile for the application, so I used that and then spent some time building, rebuilding, and debugging the application to make sure that all I had to do was the following:

  • download the Docker image
  • start running the image (making sure to tell Docker to forward the correct port)
  • click on the Subscribe button, see the message that the service worker is running, and accept when the browser asks about notifications
  • click on the button that sends a single notification and visually verify the output
  • click on the button that sends multiple notifications grouped by topic and visually verify the output

So now I have a test that I can point other folks to that they can use without having to install any dependencies in their local environment other than a web browser and Docker.

Programming Skills + QA work == Solving Interesting Problems

I am far from the only person who has this skill set. But having some ability to create your own purpose-specific tools means that the people around you get to benefit.

Often the tools (and automation solutions) that folks use to test things are proprietary and not open to be modified. So you expend a lot of energy trying to bend a tool towards a new purpose.

There are other folks out there like me who are busy creating wrappers around hard-to-use tools or creating new solutions, with the goal of making what used to be difficult a lot easier. Encourage those people and promote what they are doing!