Creating a Simple Laravel Docker Environment Wed, 05 Jun 19 12:04:22 -0500 When I started at liquidfish a little over a year ago, I was introduced to the wonderful world of Laravel Homestead. Laravel Homestead is great: it supports multiple PHP versions, Xdebug, and MySQL, and it allows the user to install any other required software straight into the virtual machine.

This is a tried and tested method of local development that has worked incredibly well for our development team for many years, but as we continue to grow we are always looking for ways to improve our development and deployment process. Enter Docker.

"Docker is a platform for developers and sysadmins to develop, deploy, and run applications with containers." (Docker documentation)

Today we are going to create a dead-simple Docker environment to run Laravel 5.8 using PHP 7 with Xdebug and MySQL on Windows 10 WSL.

My current development environment:

  • Windows Subsystem for Linux (WSL)

    • My main hard drive is mounted under /c/ in the WSL

  • ConEmu using ZSH hooked up to WSL

  • Docker for Windows

  • Docker & Docker Compose installed on WSL


As long as you have Docker & Docker Compose running on your machine you will be able to follow along with this tutorial. Let’s get started!

First, create a new project folder for our Docker environment; mine will be under /c/docker/laravel.

Now create the following directories inside the project folder:

  • docker
    • nginx
      • conf.d
      • ssl
    • php
      • config
  • www

The docker folder contains two folders, nginx and php. These folders will hold Docker-related files: Dockerfiles, .ini files, configuration files, etc.

The www folder will be our environment's web root, where Laravel will be installed.
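The directory tree above can be created in one go from the project root; a quick sketch (folder names per this tutorial, including the conf.d and ssl folders the Nginx service will mount later):

```shell
# Create the tutorial's folder layout from the project root.
# conf.d and ssl are included because the nginx service mounts them later.
mkdir -p docker/nginx/conf.d \
         docker/nginx/ssl \
         docker/php/config \
         www
```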

First we are going to create a .env file in the root of the project to hold the Docker configuration variables that our compose file and PHP Dockerfile will use. The .env file allows for easy changes in the event you need to adjust your environment.

# Docker Compose Environment Variables

# APP Environment
APP_ENV=local

# PHP Version
PHP_VERSION=7.3

# Local working directory webroot
LOCAL_WORKING_DIR=./www

# Remote working directory webroot
REMOTE_WORKING_DIR=/var/www/html

Now let's create our docker-compose.yml file in the root of our project and add the following services to it.


version: "3"

services:

  # App php-fpm service
  app:
    build:
      context: ./docker/php
      args:
        APP_ENV: ${APP_ENV}
        PHP_VERSION: ${PHP_VERSION}
        REMOTE_WORKING_DIR: ${REMOTE_WORKING_DIR}
    container_name: app
    restart: unless-stopped
    volumes:
      - ${LOCAL_WORKING_DIR}:${REMOTE_WORKING_DIR}
      - ./docker/php/config/xdebug.ini:/usr/local/etc/php/conf.d/xdebug.ini
    env_file: .env
    ports:
      - 9001:9001
    networks:
      - app-network

  # Nginx service
  nginx:
    image: nginx:alpine
    container_name: nginx
    restart: unless-stopped
    volumes:
      - ${LOCAL_WORKING_DIR}:${REMOTE_WORKING_DIR}
      - ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./docker/nginx/conf.d/:/etc/nginx/conf.d/
      - ./docker/nginx/ssl/:/etc/nginx/ssl/
    ports:
      - 80:80
      - 443:443
    depends_on:
      - app
    networks:
      - app-network

  # Mysql service
  database:
    image: mysql:5.7
    container_name: database
    environment:
      MYSQL_DATABASE: 'laravel'
      MYSQL_USER: 'user'
      MYSQL_PASSWORD: 'secret'
      MYSQL_ROOT_PASSWORD: 'password'
    ports:
      - 3306:3306
    volumes:
      - database-volume:/var/lib/mysql
    depends_on:
      - app
    networks:
      - app-network

# Docker Volumes
volumes:
  database-volume:

# Docker Networks
networks:
  app-network:
    driver: bridge

Docker-compose.yml explanation

  • Version

    • The version of the docker-compose file

  • Services

    • Defines the services that will be run on docker-compose up

  • App - PHP-FPM service

    • Build - We are going to be building a docker file

      • Context - The location of the docker file

      • Args - Variables we want to use in the Docker file from the .env

        • APP_ENV - Docker environment

        • PHP_VERSION - Version of php

        • REMOTE_WORKING_DIR - Remote working directory where Laravel will live

    • Container_name - Name of the container

    • Restart - Service will always restart unless it is stopped

    • Volumes - Local directories we want to mount and files we want to mount to the service

      • ./www (LOCAL_WORKING_DIR) to /var/www/html (REMOTE_WORKING_DIR)

      • Local xdebug ini file

    • Env_file - location of our env file

    • Ports - Ports we want to expose to the outside

      • 9001:9001

    • Networks - Internal app-network for inter-container communication

      • App-network

  • Nginx - Nginx service

    • Image - Nginx image we are using

      • Nginx:alpine

    • Container_name - name of container

      • Nginx

    • Restart - Service will always restart unless it is stopped

    • Volumes - Local directories we want to mount and files we want to mount to the service

      • ./www (LOCAL_WORKING_DIR) to /var/www/html (REMOTE_WORKING_DIR)

      • ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf

        • Default nginx configuration file

      • ./docker/nginx/conf.d/:/etc/nginx/conf.d/

        • Nginx site configuration files

      • ./docker/nginx/ssl/:/etc/nginx/ssl/

        • SSL directory

    • Ports - Ports we want to expose to the outside

      • 80:80

      • 443:443

    • Depends_on - Nginx depends on the app service to start

    • Networks - Internal app-network for inter-container communication

      • App-network


  • Database - Mysql service

    • Image - Mysql image we are using

      • Mysql:5.7

    • Container_name - name of container

    • Environment - Environment variables the image will use

      • MYSQL_DATABASE: 'laravel'

      • MYSQL_USER: 'user'

      • MYSQL_PASSWORD: 'secret'

      • MYSQL_ROOT_PASSWORD: 'password'

    • Ports - Ports we want to expose to the outside

      • 3306:3306

    • Volumes - The data volume for MySQL; this allows data to persist.

      • database-volume:/var/lib/mysql

    • Depends_on - database service depends on the app service to start

    • Networks - Internal app-network for inter-container communication

      • App-network

  • Volumes - Volumes to create

    • Database-volume - MySQL data volume

  • Networks - Networks our services are using

    • App-network - name of network

      • Driver bridge

        • Bridge network is used for inter-container communication

Next up is creating our PHP Dockerfile.


# PHP Version environment variable
ARG PHP_VERSION

# PHP Version alpine image to install based on the PHP_VERSION environment variable
FROM php:$PHP_VERSION-fpm-alpine

# Application environment variable
ARG APP_ENV

# Remote working directory environment variable
ARG REMOTE_WORKING_DIR

# Install Additional dependencies
RUN apk update && apk add --no-cache $PHPIZE_DEPS \
    build-base shadow nano curl gcc git bash \
    php7 \
    php7-fpm \
    php7-common \
    php7-pdo \
    php7-pdo_mysql \
    php7-mysqli \
    php7-mcrypt \
    php7-mbstring \
    php7-xml \
    php7-openssl \
    php7-json \
    php7-phar \
    php7-zip \
    php7-gd \
    php7-dom \
    php7-session

# Install extensions
RUN docker-php-ext-install pdo pdo_mysql
RUN docker-php-ext-enable pdo_mysql

# Install xdebug and enable it if the development environment is local
RUN if [ "$APP_ENV" = "local" ]; then \
    pecl install xdebug; \
    docker-php-ext-enable xdebug; \
    fi

# Install PHP Composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer

# Remove Cache
RUN rm -rf /var/cache/apk/*

# Add UID '1000' to www-data
RUN apk add shadow && usermod -u 1000 www-data && groupmod -g 1000 www-data

# Copy existing application directory permissions
COPY --chown=www-data:www-data . $REMOTE_WORKING_DIR

# Change current user to www-data
USER www-data

# Expose port 9000 and start php-fpm server
EXPOSE 9000

# Run php-fpm
CMD ["php-fpm"]


Dockerfile Explanation



  • ARG

    • Sets the arguments supplied in the docker-compose.yml for use within the Dockerfile

  • FROM php:$PHP_VERSION-fpm-alpine

    • Installs the php-fpm alpine image supplied through the PHP_VERSION variable

      • "Alpine Linux is an independent, non-commercial, general purpose Linux distribution designed for power users who appreciate security, simplicity and resource efficiency." (alpinelinux.org)

      • Using Alpine Linux will help keep our image sizes down

  • RUN apk update

    • Updates Alpine Linux and then adds our dependencies. If you need additional PHP extensions or Linux packages, this is where you would add them.

  • RUN docker-php-ext-install and docker-php-ext-enable

    • Installs the PHP extensions into Docker and then enables them so we can edit the configuration of these extensions. For the sake of simplicity I am only installing pdo and pdo_mysql. If you need a more robust configuration you would install and enable additional extensions here.

  • RUN if [ $APP_ENV = “local” ]

    • We are only installing Xdebug if we are on a local development environment; there is no need to install it on staging or production.

  • RUN curl -sS composer

    • We are installing Composer for Laravel

  • RUN rm -rf /var/cache/apk/*

    • we are cleaning up the apk cache

  • RUN apk add shadow && usermod

    • we are adding the UID 1000 to www-data for php-fpm

  • COPY --chown=www-data

    • we are copying the application directory into the image and giving ownership to www-data

  • USER www-data

    • changing the current user to www

  • EXPOSE 9000

    • exposing port 9000 for internal use within the containers

  • CMD [“php-fpm”]

    • start php-fpm

Now let's create our default configuration files.

Create an nginx.conf file in the root of our docker/nginx folder with the following contents.

pid /run/nginx.pid;
worker_processes auto;
worker_rlimit_nofile 65535;

events {
    multi_accept on;
    worker_connections 65535;
}

http {
    charset utf-8;
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    server_tokens off;
    log_not_found off;
    types_hash_max_size 2048;
    client_max_body_size 16M;

    include mime.types;
    default_type application/octet-stream;

    # logging
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log warn;

    ssl_session_timeout 1d;
    ssl_session_cache shared:SSL:50m;
    ssl_session_tickets off;

    # Diffie-Hellman parameter for DHE ciphersuites
    ssl_dhparam /etc/nginx/dhparam.pem;

    # OWASP B (Broad Compatibility) configuration
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_prefer_server_ciphers on;

    # OCSP Stapling
    ssl_stapling on;
    ssl_stapling_verify on;
    resolver 1.1.1.1 valid=60s; # use your preferred DNS resolver here
    resolver_timeout 2s;

    # load configs
    include /etc/nginx/conf.d/*.conf;
}

Create a site.conf file in the root of our docker/nginx/conf.d folder with the following contents. This is a standard Laravel Nginx config.

server {

    listen 80;
    listen [::]:80;

    # For https
    # listen 443 ssl;
    # listen [::]:443 ssl ipv6only=on;
    # ssl_certificate /etc/nginx/ssl/default.crt;
    # ssl_certificate_key /etc/nginx/ssl/default.key;

    server_name test.local;
    root /var/www/html/public;
    index index.php index.html index.htm;

    location / {
        try_files $uri $uri/ /index.php$is_args$args;
    }

    location ~ \.php$ {
        try_files $uri /index.php =404;
        # We are using our app service container name instead of an IP address as our connection
        fastcgi_pass app:9000;
        fastcgi_index index.php;
        fastcgi_buffers 16 16k;
        fastcgi_buffer_size 32k;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # fixes timeouts
        fastcgi_read_timeout 600;
        include fastcgi_params;
    }

    location ~ /\.ht {
        deny all;
    }

    location /.well-known/acme-challenge/ {
        root /var/www/letsencrypt/;
        log_not_found off;
    }

    error_log /var/log/nginx/laravel_error.log;
    access_log /var/log/nginx/laravel_access.log;
}
I am using test.local as my local domain; be sure to add test.local (or your preferred local domain) to your hosts file.
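For reference, the hosts entry is just the loopback address plus the domain. A sketch that prints the exact line (append it yourself, since editing the hosts file requires admin rights):

```shell
# The hosts-file line that maps the tutorial domain to localhost.
# Linux/macOS/WSL: /etc/hosts
# Windows: C:\Windows\System32\drivers\etc\hosts
HOSTS_ENTRY="127.0.0.1 test.local"
echo "$HOSTS_ENTRY"
# To append on Linux/macOS: echo "$HOSTS_ENTRY" | sudo tee -a /etc/hosts
```

On WSL, remember that your browser runs on Windows, so the Windows hosts file is the one that matters.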

Create an xdebug.ini file in the root of our docker/php/config folder with the following contents.

[xdebug]
xdebug.remote_enable=1
xdebug.remote_autostart=1
xdebug.remote_connect_back=0
xdebug.remote_host=host.docker.internal
xdebug.remote_port=9001
xdebug.idekey=PHPSTORM

The big takeaway with this Xdebug configuration is setting remote_connect_back to 0, setting your idekey to PHPSTORM, and setting your remote_host to host.docker.internal. I ran into issues when enabling remote_connect_back instead of using remote_host. This has been tested on Windows 10 WSL and has been working really well. If you can't get Xdebug to work with this configuration you may need to search around to find one that works for you.

Lets build it

To build the environment, make sure you are in your project's root folder, run docker-compose build, and let it run.

Now that PHP is built, let's run docker-compose up -d. This command starts all of the containers in detached mode. If you want to run the containers while watching the output logs, just run docker-compose up. If you are in detached mode and want to view the logs for a specific container, run docker-compose logs app (where app is the container name).

Installing Laravel

Now that our containers are running, let's install Laravel. Run the following command to jump into the container: docker-compose exec app bash.

Now that we are in the container, run the following command to install Laravel: composer create-project --prefer-dist laravel/laravel . Setting the directory to (.) just means we are going to install Laravel inside the current directory.

Great, Laravel is installed! Now open up test.local (or the domain you set in the nginx config / hosts file) and you will see the Laravel welcome screen!

Setting the laravel mysql environment variables

Just like in the Nginx config, where we set the fastcgi_pass to app:9000, we are going to do a similar thing with the MySQL environment variables in Laravel's .env, pointing DB_HOST at the database container by its service name.

DB_CONNECTION=mysql
DB_HOST=database
DB_PORT=3306
DB_DATABASE=laravel
DB_USERNAME=user
DB_PASSWORD=secret

Now that we have the environment variables set, let's exec back into our container (docker-compose exec app bash) and run the default migrations (php artisan migrate).

Creating a PHP Remote Debug configuration in PHPSTORM

To enable Xdebug in PHPSTORM, follow these steps:

  • Top right of PHPSTORM there is a drop down, click add or edit configuration

  • Set the name (Can be anything)

  • Select Filter debug connection by IDE key


  • Click the servers button to the right of the server dropdown

  • Set the name of the server (Can be anything)

  • Set the host to your local site url (mine is test.local), Port 80, Debugger Xdebug

  • Open the File/Directory selection until you see the www folder

  • On the right where it says absolute path on the server click to edit it

  • Enter /var/www/html which is the remote path and hit enter on the keyboard to save it

  • Apply the changes then hit ok on the servers menu and then the debug menu

To test this go into the web.php routes file and add the following code

Route::get('/', function () {
    $word = 'Debugger';

    $word .= ' In Action';
    echo $word;
});
Set a breakpoint on the first $word line and you will now be able to step through the code!

Sweet! Docker is set up, Laravel is installed and configured, our migrations have been run, and our Xdebug configuration for PHPSTORM is ready!

Now all that's left to do is get building!

All Hail the Office Alternate Tue, 14 May 19 10:22:29 -0500 When you hear the title “Executive Assistant,” you probably think coffee, cleaning, sticky notes, and some sort of bowl with goodies in it. If we’re being honest, you’re not wrong. Being an assistant DOES include all of these things. What people don’t realize, however, is how much more the job title actually consists of.

To better explain what it truly means to be an assistant, I feel as though I must turn to the world of sports. We all know the term “alternate,” but unless you’ve been one, you probably don’t understand how important their role is to the success of the team. 

As an alternate on a sports team, you have to be ready to jump in anywhere, at any time, for any team member. In an office setting as an assistant, it’s not much different.  You jump in whenever, wherever, for whomever, and you get it done (even if you have no idea what you’re doing). If something needs to be scanned, you scan it. If an invoice needs to be paid, you pay it. If the 15 grown men you work with need snacks and candy to function, you go get them snacks and candy.  If the walls need to be painted, you paint them (even in your vintage leather boots).

We’re VIP members at Sam’s Club, Amazon Prime experts, and the only people without a barista title that can sling coffee like no one’s business.

As assistants, “coming through in the clutch” is literally our job. We fill in the holes where something is missing. We’re the rookies that make the buzzer beater at the end of the game to get that W.

Whether we’re organizing, paying people, buying snacks, or wiping down the conference table for a last minute meeting, we’re always striving to make everyone’s life in the office a little easier. 

If you’re in an office setting with an assistant today, give them a little extra love. If you’re an assistant yourself, keep doing your thing and assisting your coworkers in getting that daily W.

Battle Stations Fri, 12 Apr 19 15:59:21 -0500 When liquidfish hears "man your battle stations," this is what we see...

Agile is Cool! Wed, 27 Mar 19 16:48:40 -0500 Many of you looking at this title might think,

“Just why is Agile cool, and what is Agile anyway?”

Here at liquidfish, we do a lot of amazing work including branding, marketing, SEO, and software development. Today I am going to talk about the latter. Some people think that software development is a completely predictable process, but this is a misconception. Let me explain.

When a client asks for an application (website, mobile, etc.), usually there is a big, exciting vision. In an ideal world, software developers would spend significant time breaking down all the pieces, documenting each detail and requirement, writing a plan, and finally, using previous experience, estimating and predicting total hours and cost for each project. But writing down everything could take forever. Clients usually want to know how much a project will cost and how long it will take to produce a product before paying for anything, or entering a contract. As you can see, we have a vicious cycle here. Clients prefer to have promises before paying. Developers would like details and data before making promises. Someone needs to step out of the circle.

This is usually the project manager’s job—breaking projects down into big pieces and providing rough time estimations. Because of our team’s great experience, we do a good job of providing realistic quotes to clients.

This is a traditional process and is well known in the IT industry as the Waterfall approach.

Using this technique, we provide an estimated total cost for the project and establish the deadline for when it will be delivered. Waterfall monoliths can be ideal for short-term projects that require a strict deadline and budget. The liquidfish team is committed to being overachievers and keeping processes simple for our clients.

At this point I imagine you have a question:  “So, what is the problem? And still you have not answered: What is Agile and why is it so cool?” We will get there, I promise. First, I want to show the potential downside of this traditional process from a coder/developer’s perspective.

When Waterfall is not cool.

As our skills and expertise have grown exponentially, bigger fish with more complex projects are swimming to our current. Complex projects require many months of continuous coding, often without feedback, as clients await the final, finished product. The goal of most projects is nicely described and packaged in the beginning by project manager and clients alike. But while they look nice, these descriptions can be fraught with peril for a coder. Lots of positive emotions at project inception can overshadow and obscure issues that most of the time will not be clear at the initial planning stage. It’s no one’s fault; it’s just one of many steps of the process. And it’s normal for multiple people to have different interpretations of information. As I’m sure you can understand, if there is a misunderstanding during the initial time-estimation step, this can bring unhappiness later for both coders and clients.

Another challenge with the Waterfall approach is the potential lack of clear understanding on the coders’ part regarding the details of clients’ specific business processes. To successfully automate a process, we must know all the rules. Software requires precise checks on all conditions to make a right decision (output). Usually people don’t think in this way, and they don’t need to. In real life, the business process is easily executed heuristically. But for a program, every exception and detail must be accounted for. Of course, if detailed and unique business processes are fully known up front for each client, it can make accurate estimation a lot easier. However, this is rarely, if ever, the case. And even if it were, the understanding or desires can be changed without the coders’ knowledge.

A lot of times, our project is dependent on other services. Those services can significantly change the way our project works, or change pricing, as we would need to change our programming in a way that can interact with them. Other times, for various reasons, the multiple services are incompatible, or create environments that are completely unpredictable. We can’t remove this factor. The biggest misconception here is that software development is just about inputting a code that functions. A big chunk of the mission is to clearly understand the exact “whats and whys” of what will be written. Only then can top-notch code be efficiently designed and built. To change something after the whole project is done can be very expensive. Monolithic waterfall processes can result in wasted money and time, both for a designer and for a client. And this is a big risk that I will talk about later.

Summarizing this, we can highlight four big challenges with Waterfall approaches:

●    Unclear global goals that may change over periods of time

●    Misunderstanding of needs with lack of direct contact between coders and clients

●    Lack of knowledge of customers’ unique business logic

●    Dependencies on third-party applications

So, what is Agile?

I think you have guessed that I will propose it as a solution that can cover these challenges. Agile is an approach/methodology that eliminates the pattern of a waterfall monolith. Many different approaches fall under the Agile umbrella, but it has a lot to do with providing micro-services throughout a project. There is even an Agile Manifesto. It’s cool, right? :)

An Agile approach has different focus points:  Scope changes and “fast failure” are encouraged. Any scope changes are processed as a natural part of development. Understanding how the process is envisioned to work can be changed with new insights or even because of some external factors (economic, political). Feedback is required at regular intervals. Feedback of real users can bring a lot of helpful and important changes in understanding. As a developer, I can say it’s always easier to make than to change something that was done a long time ago. The purpose of failing fast is to get fast feedback while correction is most efficient. Regularly presenting a demo of current status to product owners and business stakeholders helps them to make sure it is as requested, and even more important, to determine if progress is indeed in line with what they really want.

Agile processes involve iterative development. There will be many small steps toward the completion of a colossal project. Customers don’t need to wait for the totality of all massive documents to be finished for review. Finalizing of requirements can be done as the projects go along. There is a huge benefit of being able to start with tests as soon as possible. The ability to respond to feedback and adapt immediately saves everyone time and effort. This is also a benefit because final documents are usually burdensome to read anyway (if it’s not massive, then it usually doesn't include everything, resulting in a big risk of misinterpreting the requirements).

With Waterfall, there is high upside potential for continued business coupled with a high downside risk that can ultimately affect costs. Agile procedures remove the risk of unknown costs and delay by encouraging the principle of failing fast to allow for immediate adaptation in cost, scope, and timeframe. The idea is to reduce downside risk while keeping upside potential.

Iterative development can be a little bit tricky, of course. Projects will be divided into pieces and will be complicated with various login screens, lists of users, downloadable documents throughout the process, etc. But developers can implement access levels piece by piece incrementally. Regular feedback is key. This is what makes the process iterative. Otherwise there is the risk of creating multiple mini-waterfall monoliths. Just doing sprints or iterations doesn’t make for Agile development. It’s the attention and adaptation to feedback that breeds efficiency.

Agile development allows for adaptive reprioritization in what can bring the most value to our clients. Iterative development allows our clients to receive a product before it is finished. Sub-portions of final products can be pushed to production and get tested by users, or at least run by higher managers. This would allow midstream corrections from those people whose feedback is the most valuable.

To summarize, I want to repeat the key principles of Agile:

●    Highest priority is to satisfy customer needs rather than quickly complete a project

●    Coders welcome changes early on and throughout the project

●    Frequent feedback from end-users improves and transforms project parameters

●    Iterative development is encouraged with fully transparent costs for changes

And that, dear readers, is why this software developer thinks Agile is cool! It’s fine to disagree. There are valid arguments and feedback on all sides. I’d love to hear yours.

Client Side Search with Fuse.js Thu, 14 Mar 19 14:27:02 -0500 Search is arguably one of the most important elements of a successful website. After all, it doesn't matter if the information is there if you can't find it.  Often, it is useful to implement several search strategies on a site, each of which can be optimized for the specific data being searched. Today, I would like to talk about fuse.js, a nifty little fuzzy search utility that can be easily incorporated into any site.

Fuse.js is a lightweight javascript fuzzy search library written and maintained by Kiro Risk.  As of this time, the current version 3.4.4 weighs in at a mere 10.9 KB. Fuse.js, like many of the best dev tools, is open source, released under the Apache License Version 2.0. In addition to its small size, fuse.js also has the advantage that it has no external dependencies.

In a nutshell, fuse.js takes as its input a set of data in json format, some configuration information (also json formatted) and a search string.  To be more precise, an instance of the fuse class is instantiated with the input data and config, then the search method of that instance is called with the query as its argument.  The input should be in the form of an array of identically structured json objects since we will be telling fuse.js which parts of each object to consider in the search as a part of the configuration. Fuse.js returns an array of matches, filtered by the query string in accordance with the search settings.  These output results can be the full original objects, or if the records have a unique key, this can be identified in the search configuration and the results can simply be an array of these ids.

To configure fuse.js, you pass in a configuration object when you create your fuse.js instance.  Although there is not a huge number of configuration options, there are enough to tune the search in several ways.  Some options such as "caseSensitive", "tokenize", "location", "threshold" and "keys" allow you to specify exactly how the data will be searched.  These determine what will be taken into consideration when deciding which records are matches. Other configuration options such as "includeScore", "includeMatches" and sort affect the format of the output.  I won't go too deep into the meaning of each of these parameters, but will note that the main parameters determining the "fuzziness" of the search are "location", "distance" and "threshold".

The fuse.js homepage explains all of the parameters in greater detail, but even better, it allows you to experiment with them live.  You can either use the sample data included in the page, a list of book titles and authors, or paste in your own json formatted data.  As you change the selected configuration options, two output windows are updated in real time, one showing the javascript required to instantiate and run the query and the other showing the search results.  This provides a quick method for experimenting with the various configuration parameters, and when you have the parameters tweaked to your liking, you can just copy and paste the configuration block into your code.

Fuse.js might not make sense for every situation, but can be ideal depending on your search requirements.  I am currently using it in what is essentially a single page app within a larger website. Each user has their own data and the number of records per user is relatively small (currently the maximum size is under 400 records) so fuse.js is more than adequate for my purposes.  What's more, the data that I needed to search was already being loaded in the page as it was using Vue.js to generate the page. Previously, the same search was using Algolia which is a great service, but was overkill for this particular use case. With few records per user, but thousands of total users, the Algolia index was huge and growing steadily.  This, coupled with the fact that this was a convenience feature that not all users needed or used prompted me to seek an alternative and fuse.js was the perfect fit.

Fuse.js worked so well for me that I never tried any of the other javascript search libraries that are out there, but if you find that fuse.js doesn't quite cut it for you, you might give one of the following a try:  js-search by Brian Vaughn or fuzzy by Matt York.

How Allen Iverson Convinced Me to Join an Art Show Wed, 27 Feb 19 14:45:58 -0600 In the previous year, I had some monumental changes in my life. I settled into a new job, and I found a groove with my babies that helped to ease the newness of fatherhood. After the initial shock to the system from all the change, I found myself thinking about the stagnation in my personal growth. As January 1st, 2019 came and went, I began conceptualizing ways to push myself. God (or “The Universe”) heard my inner plea, and decided to test my willingness to try something I’ve never done before; and I accepted.

A mutual friend on Facebook posted about an art show he was hosting that needed artists to submit work. I have been to a couple of his shows in the past (generally a mash up of different pop culture themes) and I’ve always left with the question: “If I were to do this, what would I do?” This time, the topic of the show —NBA legend, Allen Iverson— was enough for me to give it a shot. 

“This Bronze is Worth More Than Gold: An Art Show for Allen Iverson and All Unsung Heroes” is a show intended to “highlight overlooked greatness. This includes athletes who never won rings, directors who never won Oscars, presidential candidates that came up short and whatever else our city's talented artists can come up with.” To me, there was no doubt on what I wanted to tackle. Allen Iverson was one of the first black athletes who was unapologetically black. Michael Jordan, Eddie Murphy, and Michael Jackson spent the 80’s proving that black people could profit millions in the sports and entertainment industry. They were phenoms that alerted Corporate America that black artists and entertainers could in fact push pop culture. Even though they were black icons, they were polished, branded entities. None of them totally embraced the culture and the essence of blackness as it stood. The 1990’s broadened that scope for more African Americans to enter the superstar stratosphere behind them, but it was a time where if you wanted to be the face of a brand, and you had a black face, you had to be clean cut and media savvy. Non-threatening towards a white audience. Iverson was not that in the slightest.

From Hampton, Virginia, Iverson was an east coast guy. At the time, east coast rap and style dominated the music scene. He was drafted in 1996 by the Philadelphia 76ers, a franchise whose fans are blue collar and traditionally known for supporting athletes who played hard with their city’s name on their chest. He was dubbed the “anti-hero” during the final stretch of Jordan’s second 3-peat with the Chicago Bulls. Iverson challenged the establishment on the court as much as he did off of it. At his peak, his street style was on full display as he accepted the MVP trophy in 2001 in baggy sweats and a du-rag. That moment of unmitigated blackness prompted the league commissioner to mandate a dress policy to suppress a style that had already permeated a league that was 80% black, in an attempt to make his athletes appear more “professional”. Before Iverson, only rappers of the thuggish-ruggish variety wore cornrows. After Iverson, suburban, middle class black kids were wearing cornrows in their hair. Even some white kids did too. He was the first in the NBA to go all-out tattooed. He was the first to debut a shooter sleeve and 3/4 to full-length compression tights (a style now commonplace in the NBA and on the blacktop). He was the first in a line of many shoot-first point guards whose strength lay in ball handling, quickness, and midrange shooting, in a league that would soon value analytics and high percentage shots. Before Black Lives Matter started the tidal wave of self-love, appreciation, and “wokeness” amongst Black Americans in recent memory, Iverson was being himself, raw and uncut, before social media gave regular people platforms to do it themselves. And for that, he became a cultural giant.

My take on Allen Iverson for this art show was to depict him as a Christ-like figure, looking up somberly to the heavens with a crown of thorns on his head. In sports, we value rings culture so much that we overlook those who gave it their all and fell short. The very thing that made him unique on the court also kept him from a championship: a franchise committed to building around a high-volume-shooting, six-foot guard. My contribution to the show is to say that he may not go down as an NBA champion, but his legacy paved the way for the current crop of NBA point guards to be aggressors more than facilitators. Out of the dress policy made to stymie who he knew himself to be came the flowers that blossomed into the fashion renaissance we see on Instagram today. They all look like him and play like him on the court, while following his blueprint of branded individuality off of it. Some may see the image and understand the theme, but as a practicing Catholic, I know others might consider it sacrilegious. I don’t care. It’s art.

There’s only so much of the status quo you can take before it becomes stale. And if you were in middle school or high school in the mid ’90s, and you were black, you saw the impact Iverson and Reebok had as an alternative to the clean-cut, corporate, bald-head-and-hoop-earring image of Michael Jordan. It was raw. It was real.

I only hope my piece reflects that.

Come see the show, Saturday, March 2nd at the Speakeasy Bar in Oklahoma City. 7pm!

Love and Creative Briefs Tue, 12 Feb 19 11:03:07 -0600 Valentine’s Day is upon us and love is in the air!  It’s the perfect time of year to strengthen the relationships in our lives - and here at a creative agency, few relationships need as much attention as those between content strategists and graphic designers.

If you want your graphic designer to love you, don’t be like Mick Jagger. I know what you’re thinking: “Don’t be like Mick Jagger? He’s got the moves, the charisma, everyone loves him!” Well, here’s the thing: he wrote maybe the worst creative brief ever. Here it is. Give it a read. It won’t take too long.

One of the most acclaimed artists of the 20th century, Andy Warhol, was given this barebones set of instructions to come up with album art for The Rolling Stones. The phrase “do whatever you want” is literally used. I like to call that phrase “How to Get Dumped by Your Graphic Designer.” Any designer worth wooing will tell you there is nowhere near enough information in this brief to make a deliverable the client will approve. That raises the question: what do you need to include when writing a brief? Most importantly, you need balance.

There are certain details that are considered brief must-haves: dimensions, medium (e.g. Instagram ad, LinkedIn post), a general concept of color scheme or font style (for a pure graphic), the right Shutterstock search (for a photo), and the most refined version of any copy the graphic will have. However, you don’t want to put your designer in a box. That’s where balance comes in. If you love someone, set them free. Especially the creatives. They need room to spread their wings and show you what they can do when given the freedom to create. “Do whatever you want” gives the designer freedom, but doesn’t show them that you care.

In short, if you want your designers to love you back, give them what they need, but don’t hold them down.  Let them be themselves, but make sure they know you’re there for them.

At the end of the day, despite Mick Jagger’s creative brief shortcomings, here’s what Warhol made: one of the most iconic album covers ever.

Happy Valentine's Day, XOXO liquidfish


Planning for the Unplanned Wed, 30 Jan 19 11:01:00 -0600 Schedules are hard.

Really hard.

The dictionary defines a schedule as "a series of things to be done or of events to occur at or during a particular time or period."

In my experience, a schedule is a list of items that you expect to do, but then everything goes haywire and you end up doing things that you either weren't expecting to do, hadn't planned on, or simply forgot.

Schedules are hard!

Missing plans, cancelling plans, forgetting that you were supposed to be the clown at your niece's 8th birthday party, and then remembering 6 months later.
It strains relationships.
It causes people to lose trust.
It could be why the last time you visited your sister, your niece kicked you in the shins.

I've been there, and those kicks hurt! Fortunately, this is shortly after the new year, and with new years come new resolutions! So throw away that plan to "eat healthier" or "exercise more" that you aren't going to keep, and resolve to plan smarter! Let's start with the basics:

Start With A Plan(ner)

The obvious first step is to set up a planner. If you don't already have one, a planner can help you keep track of what's happening when. The more likely issue, though, is that you don't remember to write things down. Remembering to write things down can be hard, especially if you're already having a hard time remembering what you're supposed to be writing down.

What can we do about that?

Well, there are any number of handy scheduling apps, calendar apps, and the like. With everyone carrying a portable planner anyway, it's easy to remember: when you set up that meeting over text, you can switch over to your calendar and block out a timeframe.

What about in-person plans?

These are a bit harder to remember, what with adorable little 7-year-old nieces running around and having tea parties. I've played around with methods and found that the easiest way to remember is to keep your calendar on your home screen, so that when you open your phone again, you're reminded to make a plan for whatever was scheduled. Repeat until it's second nature.

Buffer Your Plans With Plans

This is easier to grasp than to follow, and it has to do with planning an amount of time around your planned time. I've found that 30 minutes before and after a planned item is ideal. 1-hour meeting? Plan for 2 hours. 30-minute lunch? Plan for 90 minutes. The extra 60 minutes isn't a hard and fast rule; you can adjust it to your liking or expectations. This isn't to extend your plans, but to make them more malleable. It's far easier to spend 90 minutes in a 60-minute meeting if you've already expected it to be 90 minutes. This will keep you from overextending your days and finding yourself unable to complete everything.

Make Time For You

This one is imperative and almost requires a separate article of its own. It's of vital importance that you take some time out of every day to relax, unwind, and focus on things that make you happy. It can be early, with a small nap; it can be late, with a nice show; it can be sprinkled throughout the day, in the in-between time that you have from your buffers. Remember that the reason you make these plans is that people like you and want to spend time around you, so it helps that you like you too.

As I wrap up, and get ready to head off to my niece's 9th birthday party in a clown outfit, I'd like to touch on cancelling and rescheduling:

Rescheduling - Sometimes all of this planning amounts to having to push things back. It's no big failure to communicate that you need to pick a better day. The sooner you mention it, the easier it'll be to take.

Cancelling - It happens. Sometimes something comes up and you can't reschedule a plan. A birthday party comes to mind, but maybe you need to spend time with a friend who's grieving, or maybe work was just too much. It's better to admit when you're overextended.

These two things are unfortunate, but so necessary to keeping your schedule, and mind, stable.

Happy scheduling!

Title: ($dogs > $humans) ? true : "WRONG"; Fri, 18 Jan 19 10:52:23 -0600 Did you know spending quality time with your dog can help with stress, anxiety, depression, loneliness and other "down-and-out" feelings? Playing with them can also boost your exercise and cardiovascular health. The one thing that separates dogs from our other pets is the love they can give back to us.

Dogs learn how to be more human just as we learn to be more dog. If you have the joy of a cuddly pup in your life, you'll know what I'm talking about. We learn what their tics and movements mean just as they learn our tone of voice, facial expressions and commands. With this type of communication, the bond between man and canine is unlike any other.

Studies have found that people with a canine companion tend to have lower stress, depression, blood pressure and cholesterol levels, and even fewer doctor visits on average (by about 30%). All of this just from lying around with your best friend next to you. The benefits are especially obvious for those living with some sort of disability or ailment. The training and skill set these pets have is truly remarkable. I know a few of my friends wouldn't have the same quality of life if it wasn't for their service pet.

With a strong animal bond, you can escape the realities and annoyances of humanity. I know my Rosie specifically likes watching soccer and listening to music. I know when she needs to go outside, is excited, anxious, hungry, sleepy, happy, scared, lonely...all of them. ROSIE LIVES!

Making a List You Can Check Twice: A Yule Log Wed, 19 Dec 18 15:59:44 -0600 Here at liquidfish, our developers are like a family. Like a family, we help each other out. In much the same way, developers the world over are one big extended family. Sure, we have our disagreements, we don't always see eye-to-eye, but for the most part we help each other. So when I needed to find a good way to make Laravel log to the database, I went looking for help. And wow but that extended family made it harder than it needed to be!

Laravel is a very extensible framework; that's why we like it. It comes pre-built with logging tools, models, and plenty of other goodies to make a developer happy. Laravel uses Monolog for most of its logging needs, which can write to whatever channel you care to set up. So why is logging to the database not built in? I couldn't say, but it turns out it's actually super simple! No need for a store-bought package with gimmicky settings; the best gifts are often homemade.

# How To: The Meat of the Yuletide Feast

So, you're ready for the recipe for my super-simple database log?

You'll need:

- 1 Model class
- 1 AbstractProcessingHandler class
- 1 migration
- a dash of setting spices

The Model:

namespace App\Logging;

use Illuminate\Database\Eloquent\Model;

class Log extends Model
{
    protected $fillable = ['message', 'context', 'level', 'level_name', 'extra'];

    // Monolog hands us arrays for context and extra; cast them so
    // Eloquent serializes them to JSON automatically.
    protected $casts = ['context' => 'array', 'extra' => 'array'];
}

The AbstractProcessingHandler:

namespace App\Logging;

use Monolog\Handler\AbstractProcessingHandler;

class MonoLogDBHandler extends AbstractProcessingHandler
{
    protected function write(array $record)
    {
        // Mass-assign the Monolog record; $fillable on the model
        // keeps only the columns we care about. Don't forget to save!
        $log = new Log($record);
        $log->save();
    }
}

The Migration:

Schema::create('logs', function (Blueprint $table) {
    $table->increments('id');
    $table->integer('level');
    $table->string('level_name');
    $table->text('message');
    $table->text('context')->nullable();
    $table->text('extra')->nullable();
    $table->timestamps();
});

And finally, settings: In your `config/logging.php` you'll find a list of channels. Here's what our new db channel looks like:

'db' => [
    'driver' => 'monolog',
    'with' => ['level' => \Monolog\Logger::DEBUG],
    'handler' => \App\Logging\MonoLogDBHandler::class,
],

Now we can use the db channel for all our logging. To use it by default, we can either set the 'default' channel to db or (my preference) use the 'stack' channel, which logs to multiple channels, and add the db channel to the stack. Easy as mincemeat pie!
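If you go the stack route, the change is a small tweak to `config/logging.php`. Here's a sketch; the 'single' channel name assumes Laravel's default file channel, so adjust it to whichever channels you already use:

```php
// config/logging.php: make 'stack' the default channel and have it
// fan out to both the file log and our new db channel.
'default' => env('LOG_CHANNEL', 'stack'),

'channels' => [
    'stack' => [
        'driver' => 'stack',
        'channels' => ['single', 'db'],
    ],

    // ...the 'single' channel and the 'db' channel from above...
],
```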

# Wrapping it up: Santa's Little Helpers

Now we just have to write to the log:

// These calls go through Laravel's Log facade (Illuminate\Support\Facades\Log),
// not our App\Logging\Log model.
Log::info('Little Johnny', ['Nice' => true]);
Log::info('Cindy Lue Who', ['Nice' => true]);
Log::warning('The Grinch', ['Nice' => false]);

Now that's the makings of a happy yule log!

Oklahoma City's Design Community Mon, 10 Dec 18 15:25:23 -0600 I’ll be the first to admit it. I haven’t always been the best at supporting the design community in Oklahoma City. My participation comes in waves, but I’m here to say that I’m back. In such a small city, each and every designer plays an important role in supporting our fellow designers and developers in town. As creatives, it’s also important for us to continue growing, learning, networking and spreading education about our profession around the city.

If you’re looking to get involved, here are a few ideas:



If you’re looking for a design-specific organization, they have what you need: bi-monthly speaker events and monthly meetups. They’ve been responsible for bringing some prominent designers to the city: Aaron Draplin, Debbie Millman, Jay Schuster, House Industries, Ty Wilkins, and Hoodzpah, to name a few.


OKC Ad Club

The Addys are a staple in Oklahoma City design awards. But they do more than just hand out hardware. They put on monthly lunches with speakers that range from social media to copywriting.



Although their events have died down lately, they have established a great online community. Whether you’re looking for jobs, advice, or casual conversation with other creatives, there’s a channel for you.


Local Inspiration

There's some great work getting out there; check it out.


Design Lunches

Lunch meetups around town for designers to talk shop or vent about clients. Actually, I haven’t seen one put on in a while. Anyone out there know? Hit me up, let’s start it up again.


The Holidays in OKC Fri, 16 Nov 18 09:30:09 -0600 With the holidays right around the corner and the cold weather upon us, the desire to get out and about is only going to shrink. If you are like me, the cold winter months with little sun can leave you feeling down. I make a concerted effort to combat those feelings by staying active and engaged with what is happening in my city and enjoying the many things going on.

If you are new to Oklahoma City or don’t travel downtown much, you might be surprised at the number of events and activities going on in the metro area. I am lucky to work and live near downtown, so I have been able to enjoy what it has to offer, and I wanted to share some of the tips and tricks I have learned from attending a few of these events.

Devon Ice Rink

The ice rink opens on November 9th and runs through January 27th. If you plan on going with the family during the holiday season, make sure you know the holiday hours. Also, remember to take a pair of tall, warm socks!

Find out more about the Devon Ice Rink

Holiday Pop Up Shops in Midtown

The shops open the day after Thanksgiving and go through December 23rd, and a new set of local merchants are featured each week! This is a perfect solution for those who hate shopping at the mall, which can be especially stressful during the holiday season.

Read more about the Holiday Pop Up Shops in Midtown

Free Holiday Water Taxi Rides

The Bricktown canal is a spectacle filled with Christmas lights and is definitely worth checking out. It is a great (and free) event for a family night out or date night, but be sure to get in line early because it can be a long wait.

Find out more about the Bricktown Canal Holiday Water Taxi Rides

Opening Night

For those who don’t know what to do on New Year’s Eve, this is a great solution for the entire family or that special someone. Skip the line by pre-purchasing your wristband at participating 7-Eleven stores, MidFirst Bank locations, the Oklahoma City Museum of Art, and Plenty Mercantile.

Learn more about Opening Night

Now that you have learned a bit about the events going on in Oklahoma City during the holiday season, I challenge you to get out of the house, stop worrying about the holidays and enjoy yourself!

Find the full list of things to do in Oklahoma City around the holidays