liquidfish current en-US daily 1 The Finternship Fri, 23 Aug 19 11:56:21 -0500 We haven’t met, but I am Cameron and I’m the token summer “fintern” here at liquidfish. I like to keep things interesting, and it has been brought to my attention that liquidfish doesn’t award employee of the month. So, in an attempt to establish a new tradition, I went around our office and interviewed fellow fish who believe they have what it takes to be employee of the month. I selected a few candidates whom I interacted with during my time here.

Hannah Franklin: Executive Assistant 

“I deserve to be employee of the month because I keep our OG fishies alive and kickin'. I resurrect everyone from their morning sleepy eyes by making coffee every morning. I keep us stacked on snacks for those days when you've got the munchies. What more could you want in an employee amirite?”

Zena Cherian: Digital Marketing 

“Billy Shakespeare once said, "Some are born great, some achieve greatness, and some have greatness thrust upon them." I am the latter.”

Will Cobb: Social Media Specialist

“I deserve employee of the month for making sure the office emotional support animal makes it in every day.”

Lane Wheeler: Developer

“Me? Employee of the month? Absolutely not.”

After a close race and minutes of thoughtful consideration, I have come to a decision. Your inaugural “Employee of the Month” for August is, you guessed it, me. I can’t help but brag on this company I had the privilege of interning with, and I can’t help but brag on my accomplishments while I was here, either. To name a few... 

  • Brainstormed & planned campaigns 

  • Produced blog series for a client

  • Produced a contest for a client encouraging user-generated content

  • Starred in said client’s campaign ad…a story for another time

  • Extensive and thorough CMS work

  • Created content calendars and managed social accounts 

  • Saved the day with Chipotle

  • LAST BUT NOT LEAST ...made friends :)

On a sincere note, thank you, liquidfish, for a challenging, humbling and fulfilling summer. Thank you to the people here who welcomed me with open arms and sublime memes, you made this finternship a FUNternship. I will admit I was intimidated at first, but as author Brene Brown says, “Sometimes the bravest and most important thing you can do is just show up.” 

Thank you for providing the limitless tank of opportunity for a fish like me to show up and learn how to swim.

QUEST FOR THE BEST: TOP 5 TACOS IN OKC Fri, 19 Jul 19 14:04:46 -0500 Here at liquidfish, one of the things we like to do together as a team is eat. We eat during team meetings, we order out breakfast in honor of new-hire first days, and given the fact that we are located in Bricktown, OKC, we’re often prone to assemble a moderately sized group of co-workers for a walk to an impromptu lunch (weather permitting).

One day, the prospect of a Taco Tuesday lunch swept through the office in the late morning. In an attempt to decide which taqueria we would attend, we quickly realized that we couldn’t come up with a definitive answer. Having lived in San Diego, CA for 12 years, I can attest that the taco shops there are as abundant as McDonald’s and Starbucks franchises. I also know that every San Diegan has their go-to, day-one shop of preference (mine was Silvas on Fletcher Blvd in El Cajon... ask for the pollo asada fries). Understanding how taco shops breed fierce loyalty, I could tell that my co-workers had taquerias that were near and dear to their hearts as well. Since I am a relative outsider to the OKC taqueria game, I wanted to know which one was the best. And that’s how “Quest for the Best” (QFTB) was born.

How we narrowed down the field:

We gathered a list of every liquidfish member’s favorite taqueria in the OKC metro area (within a reasonable driving distance so that we wouldn’t be burning two hours on a lunch in Del City *sorry, Jessica*), and out of all of the options available, we made our tallies next to the ones we wanted to eat at the most. We counted those votes and chose the five with the highest point totals. Any commercialized/franchised spots were excluded (i.e. Fuzzy’s or Torchy’s). 

Scoring Criteria:

This was the tricky part. Since none of us are qualified enough to judge on an episode of Food Network’s “Chopped”, we had to come up with a system to rank the taquerias individually. The first idea was to introduce a control variable into everyone’s order: the carne asada taco. Order whatever you want, but save room for the taco to compare/contrast between taquerias. That idea got shot down due to counterarguments like: “But what if you don’t want a carne asada taco?”, and “You can’t force everyone to eat the same thing if you’re asking them to pay for it”. OK, fine. Scratch that. 

Then an idea surfaced: What if we went to each taqueria, asked what their most popular item is, and just ordered that? That way, we could truly compare the best of the best! That one didn’t get off the ground either. Another suggestion: ordering your favorite thing at each one. “But what if you want tacos one week and a quesadilla the next?” We quickly learned that people don’t like being told what to do with their own money (who knew?) and scrapped all of it. It was impossible to narrow down a single control variable, so we just decided to let it be a free-for-all. Order whatever you want and give it a 1-10 ranking at the end of the meal. But you vote based SOLELY on the food. No adding or docking points for the decor, atmosphere, location, music, etc. Just food.

So without further ado, here are the findings in our quest for the best taqueria in OKC. 

Tacos San Pedro

Tacos San Pedro was first on our list. This is Cody’s (our President and CEO) go-to spot in the city. When hosting out-of-town friends and clients for lunch, Tacos San Pedro is one of the places he takes them (so we kinda had to go, lol). Some of us had already been here as a group with Cody, so we knew what was in store. This time, we came with the intent to critique, and it didn’t disappoint. Tacos San Pedro served some of the most picturesque, Instagram-ready food of the lot. The carne asada was flavorful and consistent. The horchatas come in big, thirst-quenching cups. And everything is just well prepared. I can’t remember a single complaint aside from one from Billy’s wife (but she’s not a liquidfisher).

During the NBA All-Star Weekend festivities, the marquee event that has the sports world watching is the Slam Dunk Contest. As the contestants take center stage, they are announced in order from newcomer to the presumed winner. You can’t show the best dunker first. You don’t pull out your best dunks from the jump. That’s where I think Cody and Tacos San Pedro faltered. It might have suffered from being first up, with people tempering ratings in anticipation of something greater down the line. 

If you drive out here for lunch, you will definitely want to come back again. 


El Dicho + La Loteria (Formerly Hugo’s on 23rd St.)

So there’s a story to this one. 

Nearly everyone at liquidfish has been to Hugo’s on 23rd Street at one point or another. For the design team, “Hugo Tuesdays” (a phrase I just made up) was a monthly staple. It was a place we were so familiar with, you could just say “Hugo’s”, and the caravan knew where to go. This time we said “Hugo’s”, and the caravan departed. I was in the first car that arrived, and we were a bit confused because the signage was different. Our Executive Assistant Hannah was in the car as well. She explained that the taqueria formerly known as Hugo’s has now split into two locations: “El Dicho” (where we currently were), and “La Loteria” (further down on 23rd Street, at N. MacArthur). She knew this because her parents live nearby and used to frequent the old Hugo’s all the time. Hannah also noted that the old Hugo’s location is now the new La Loteria. 

With this information in mind, we messaged the other cars en route that we were going to La Loteria. As we got there, two other liquidfish cars arrived. While we were waiting for the final car to show before ordering, we found out that they were at El Dicho (the old location), already ordering. 

At La Loteria, we had a great time! I stepped outside of my comfort zone and ordered a chorizo quesadilla (which was fire), and people laughed and ate (always an indicator of good food). On the other hand, the people that went to El Dicho expected Hugo’s and didn’t quite get what they anticipated. The menu was different (which forced people out of their Hugo’s go-tos), and I think they simply weren’t ready for the change after thinking about Hugo’s all day. 

With the organizer and scheduler at one spot and the in-house photographer at the other, we didn’t have much of a choice but to combine the scores of both locations. Under normal circumstances, El Dicho would have had the benefit of a clean slate, and La Loteria might’ve scored higher, but the scheduled show must go on! Like Tupac once said: “Hey, that’s the way it is…”


Big Truck Tacos

Of the five places chosen, I would think that Big Truck Tacos may be the most popular of the bunch. While Big Truck does Mexican street tacos, they specialize in a more non-traditional approach to Mexican taco ingredients and cuisine. If you do not frequent Big Truck often, the emphasis on the non-traditional might persuade you to steer away from the known, and into the unknown (as I did). 

Personally, I wasn’t exactly thrilled with the chance I took on my main order as I was eating it, but the beef al carbon (carne asada) taco that I ordered on the side was SLAMMIN’! The roasted slices of onions and poblano peppers set it off, too. The meat was cooked with just the right amount of slight char, which gave it a memorable texture. Now that I think back on it, I should have ordered three more of those bad boys instead. Big Truck’s carne asada taco might have been the best carne asada taco I had in the entire competition. 

But, as stated before, this is a contest judged by the totality of the menu. On an isolated visit, goat cheese and sautéed mushrooms in a taco are innovative, but under these circumstances, we’re comparing that to some of OKC’s finest street tacos. On this day, my burrito entrée selection wasn’t on par with what I was expecting from a contender for the top spot. And judging by the cumulative score from my work colleagues, I think they agree with me on this. 


Taqueria Rafita's

Truth be told, Taqueria Rafita’s didn’t make the cut before all of this started. The only reason why we went there was because of the original place (which shall remain nameless) not being able to accommodate seating for all of us. On top of that, it was outside, around noon-ish, in 90-degree heat, with no clouds in the sky (ain’t no damn way). We had to make an audible real quick. I remembered Taqueria Rafita’s from a bike ride I took with a friend once. We stopped for lunch and it was memorably good. I didn’t know the name of the place, so it didn’t ring a bell when it was up for voting. As we were mulling over our options, I asked about “that one place in the Paseo”, and the response I got was “that’s Taqueria Rafita’s!”

When we arrived, we arranged the tables into a long, ‘The Last Supper’-style table where we sat on both sides. Curious about the twenty-odd people that came into her establishment as a single group, the owner approached us. We told her about our “Quest for the Best” contest, and she lit up with glee and gratitude that we chose to patronize her establishment. She proceeded to tell us her backstory, and how the taqueria has been kept in the family for several years while helping to put the kids of that family through school and financing emergency medical procedures. This woman was the American Dream personified. Not only did she provide us with very tasty food, but it came wrapped in a heartwarming story.

The food was already killing it (in my personal opinion, the best yet), but this was the first place where we could eat an enchilada and know the story behind it. If you ate it and gave it a 7, you’d bump it up a point after listening to the owner’s genuine story. These were inspirational tacos. Granted, it may have affected our “just food” scoring system, but what do you do when you unintentionally combine triumph, perseverance, and stellar Mexican taco shop food?  


La Tropicana

Due to the efforts of our Director of Digital Marketing, Zena, La Tropicana occupies almost the same headspace for us at liquidfish as Tacos San Pedro. The food is noticeably good, and the majority of us have been there at least once. This time, with Zena hyping up her go-to shop (and it being the last taqueria on the list), the pressure was considerably high for it to exceed expectations. La Tropicana did that and then some. 

La Tropicana’s clutch food in this pressure-packed moment was like watching a gymnast needing a perfect 10 on the floor routine at the Summer Olympics, and then hitting it. Where Taqueria Rafita’s benefited from a little emotional bump to get to the top of the leaderboard, Tropicana pulled it off strictly with food, no stories. Just a “how can I help you?” and boom, there’s your order. They weren’t messing around in there, AT. ALL. 

My only knock on LT: they are closed on Tuesdays. TACO Tuesdays. We should have docked a full point on principle alone. 

Full Disclosure: The story at Rafita’s got me. The food was great, but the story made my heart tender. I *wanted* it to take the crown, but La Tropicana ran away with this competition. It was the only taqueria to have an authentic “10”, and it had three of them. It also boasted seven “9’s”. Zena truly did save the best for last. 


In Conclusion

This was a very fun experience that we had as a collective office. I would recommend that any workplace in the metro area put forth a similar effort to boost morale and generate some extra money for local businesses. We did the heavy lifting to show you the best taqueria of Oklahoma City, but we encourage you to visit all of the locations listed above to see for yourself. Did we miss any contenders? Shout them out for us in the comments so that we can try them out too.

In the meantime, stay tuned for a new liquidfish Top 5 quest shortly as we gear up for our next food category: pizza!

Cultivating Creativity Fri, 21 Jun 19 12:41:42 -0500 With core values like ingenuity and being solution-oriented, liquidfish requires a considerable amount of creativity from its employees. Whether it is finding innovative ways to improve SEO on a website, designing a logo for a new organization, or developing a novel solution for a mobile app, liquidfish employees are asked to look beyond what’s obvious and expected.

How does a team of twenty continue to find inspiration daily? It’s apparent that for many of the liquidfish team, Maya Angelou’s quote holds true.

“You can’t use up creativity. The more you use, the more you have.”

Keeping the creative juices flowing isn’t just about writing an impressive line of code or selecting the perfect hex value for a brand. Creativity is a fundamental component of the liquidfish team, both professionally and personally.


Making Music

Playing music can impact many senses. There is a visual, tactile, and auditory element to playing music. Several liquidfish team members play instruments and sing. Our fearless leader, Cody Blake, is a musical composer. PHP Developer Lane Wheeler and Executive Assistant Hannah Franklin have been known to host an impromptu concert at the coffee maker now and then.

Performance Art

There is no better way to step outside the box than by becoming a completely different character. Visual and video expert and noted nice guy, Logan Walcher, can use his tall stature and athletic ability to create visually appealing and hysterically entertaining Lucha Libre wrestling events and characters.

Drawing and Painting

Drawing and painting can be an effective way to decompress after a stressful day. Digital Marketing Director Zena Cherian uses art as a dual effort of stress relief and creative outlet.


Woodworking

Director of Development Billy Davis creates beautiful yet functional wood furniture pieces. He even brings those pieces to the office; most days, you can find him working diligently behind the standing desk he built himself.

There is no verified formula for cultivating an environment of creativity and innovation. However, stretching our limits outside of the office creates the practice and space that allow us to develop innovative solutions inside the office.

Creating a Simple Laravel Docker Environment Wed, 05 Jun 19 12:04:22 -0500 When I started at liquidfish a little over a year ago, I was introduced to the wonderful world of Laravel Homestead. Laravel Homestead is great: it supports multiple PHP versions, xDebug, and MySQL, and it allows the user to install any other required software straight into the virtual machine.

This is a tried and tested method of local development that has worked incredibly well for our development team for many years, but as we continue to grow, we are always looking for ways to improve our development and deployment process. Enter Docker.

“Docker is a platform for developers and sysadmins to develop, deploy, and run applications with containers.”

Today we are going to create a dead-simple Docker environment to run Laravel 5.8 using PHP 7 with xDebug and MySQL on Windows 10 WSL.

My current development environment:

  • Windows Subsystem for Linux (WSL)

    • My main hard drive is mounted under /c/ in the WSL

  • ConEmu using ZSH hooked up to WSL

  • Docker for Windows

  • Docker & Docker Compose installed on WSL


As long as you have Docker & Docker Compose running on your machine you will be able to follow along with this tutorial. Let’s get started!

First, create a new project folder for our Docker environment; mine will be under /c/docker/laravel.

Now create the following directories inside the project folder:

  • docker
    • nginx
    • php
      • config
  • www
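The skeleton above can be created in one command from the project root (folder names taken straight from the list):

```shell
# Create the tutorial's folder skeleton:
#   docker/nginx       - nginx config files
#   docker/php/config  - php ini/config files
#   www                - web root where Laravel will be installed
mkdir -p docker/nginx docker/php/config www
```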

The docker folder contains two folders, nginx and php. These folders will hold Docker-related files: Dockerfiles, .inis, configuration files, etc.

The www folder will be our environment’s web root, where Laravel will be installed.

Next, we are going to create a .env file in the root of the project to hold the Docker configuration variables that our compose file and PHP Dockerfile will use. The .env file allows for easier flexibility in the event you need to change your environment.

# Docker Compose Environment Variables

# APP Environment
APP_ENV=local

# PHP Version (any version with an fpm-alpine image tag works; 7.3 assumed here)
PHP_VERSION=7.3

# Local working directory webroot
LOCAL_WORKING_DIR=./www

# Remote working directory webroot
REMOTE_WORKING_DIR=/var/www/html

Now let’s create our docker-compose.yml file in the root of our project and add the following services to it.


version: "3"

services:
  # App php-fpm service
  app:
    build:
      context: ./docker/php
      args:
        APP_ENV: ${APP_ENV}
        PHP_VERSION: ${PHP_VERSION}
        REMOTE_WORKING_DIR: ${REMOTE_WORKING_DIR}
    container_name: app
    restart: unless-stopped
    volumes:
      - ${LOCAL_WORKING_DIR}:${REMOTE_WORKING_DIR}
      - ./docker/php/config/xdebug.ini:/usr/local/etc/php/conf.d/xdebug.ini
    env_file: .env
    ports:
      - 9001:9001
    networks:
      - app-network

  # Nginx web server service
  nginx:
    image: nginx:alpine
    container_name: nginx
    restart: unless-stopped
    volumes:
      - ${LOCAL_WORKING_DIR}:${REMOTE_WORKING_DIR}
      - ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./docker/nginx/conf.d/:/etc/nginx/conf.d/
      - ./docker/nginx/ssl/:/etc/nginx/ssl/
    ports:
      - 80:80
      - 443:443
    depends_on:
      - app
    networks:
      - app-network

  # MySQL database service
  database:
    image: mysql:5.7
    container_name: database
    environment:
      MYSQL_DATABASE: 'laravel'
      MYSQL_USER: 'user'
      MYSQL_PASSWORD: 'secret'
      MYSQL_ROOT_PASSWORD: 'password'
    ports:
      - 3306:3306
    volumes:
      - database-volume:/var/lib/mysql
    depends_on:
      - app
    networks:
      - app-network

# Docker Volumes
volumes:
  database-volume:

# Docker Networks
networks:
  app-network:
    driver: bridge

Docker-compose.yml explanation

  • Version

    • The version of the docker-compose file

  • Services

    • Defines the services that will be run on docker-compose up

  • App - PHP-FPM service

    • Build - We are going to be building a docker file

      • Context - The location of the docker file

      • Args - Variables we want to use in the Docker file from the .env

        • APP_ENV - Docker environment

        • PHP_VERSION - Version of php

        • REMOTE_WORKING_DIR - Remote working directory laravel will live

    • Container_name - Name of the container

    • Restart - Service will always restart unless it is stopped

    • Volumes - Local directories we want to mount and files we want to mount to the service

      • ./www (LOCAL_WORKING_DIR) to /var/www/html (REMOTE_WORKING_DIR)

      • Local xdebug ini file

    • Env_file - location of our env file

    • Ports - Ports we want to expose to the outside

      • 9001:9001

    • Networks - Internal app-network for inter-container communication

      • App-network

  • Nginx - Nginx service

    • Image - Nginx image we are using

      • Nginx:alpine

    • Container_name - name of container

      • Nginx

    • Restart - Service will always restart unless it is stopped

    • Volumes - Local directories we want to mount and files we want to mount to the service

      • ./www (LOCAL_WORKING_DIR) to /var/www/html (REMOTE_WORKING_DIR)

      • ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf

        • Default nginx configuration file

      • ./docker/nginx/conf.d/:/etc/nginx/conf.d/

        • Nginx site configuration files

      • ./docker/nginx/ssl/:/etc/nginx/ssl/

        • SSL directory

    • Ports - Ports we want to expose to the outside

      • 80:80

      • 443:443

    • Depends_on - Nginx depends on the app service to start

    • Networks - Internal app-network for inter-container communication

      • App-network


  • Database - Mysql service

    • Image - Mysql image we are using

      • Mysql:5.7

    • Container_name - name of container

    • Environment - Environment variables the image will use

      • MYSQL_DATABASE: 'laravel'

      • MYSQL_USER: 'user'

      • MYSQL_PASSWORD: 'secret'

      • MYSQL_ROOT_PASSWORD: 'password'

    • Ports - Ports we want to expose to the outside

      • 3306:3306

    • Volumes - The data volume for MySQL; this allows data to persist.

      • database-volume:/var/lib/mysql

    • Depends_on - database service depends on the app service to start

    • Networks - Internal app-network for inter-container communication

      • App-network

  • Volumes - Volumes to create

    • Database-volume - MySQL data volume

  • Networks - Networks our services are using

    • App-network - name of network

      • Driver bridge

        • Bridge network is used for inter-container communication

Next up is creating our PHP Dockerfile.


# PHP Version environment variable
ARG PHP_VERSION

# PHP Version alpine image to install based on the PHP_VERSION environment variable
FROM php:$PHP_VERSION-fpm-alpine

# Application environment variable
ARG APP_ENV

# Remote working directory environment variable
ARG REMOTE_WORKING_DIR

# Install additional dependencies
RUN apk update && apk add --no-cache $PHPIZE_DEPS \
   build-base shadow nano curl gcc git bash \
   php7 \
   php7-fpm \
   php7-common \
   php7-pdo \
   php7-pdo_mysql \
   php7-mysqli \
   php7-mcrypt \
   php7-mbstring \
   php7-xml \
   php7-openssl \
   php7-json \
   php7-phar \
   php7-zip \
   php7-gd \
   php7-dom \
   php7-session

# Install extensions
RUN docker-php-ext-install pdo pdo_mysql
RUN docker-php-ext-enable pdo_mysql

# Install xdebug and enable it if the development environment is local
RUN if [ "$APP_ENV" = "local" ]; then \
   pecl install xdebug; \
   docker-php-ext-enable xdebug; \
   fi

# Install PHP Composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer

# Remove cache
RUN rm -rf /var/cache/apk/*

# Add UID '1000' to www-data
RUN apk add shadow && usermod -u 1000 www-data && groupmod -g 1000 www-data

# Copy existing application directory permissions
COPY --chown=www-data:www-data . $REMOTE_WORKING_DIR

# Change current user to www-data
USER www-data

# Expose port 9000 and start php-fpm server
EXPOSE 9000

# Run php-fpm
CMD ["php-fpm"]


Dockerfile Explanation



  • ARG

    • Sets the arguments supplied in the docker-compose.yml for use within the Dockerfile

  • FROM php:$PHP_VERSION-fpm-alpine

    • Installs the php-fpm Alpine image for the PHP version supplied through the PHP_VERSION variable

      • “Alpine Linux is an independent, non-commercial, general purpose Linux distribution designed for power users who appreciate security, simplicity and resource efficiency.”

      • Using Alpine Linux will help keep our image sizes down

  • RUN apk update

    • Updates Alpine Linux and then adds our dependencies. If you need additional PHP extensions or Linux packages, this is where you would add them.

  • RUN docker-php-ext install and enable

    • Installs the PHP extensions into Docker and then enables them so we can edit their configuration. For the sake of simplicity, I am only installing pdo and pdo_mysql. If you need a more robust configuration, you would install and enable the extensions here.

  • RUN if [ $APP_ENV = “local” ]

    • We are only installing xdebug on a local development environment; there is no need to install it on staging or production.

  • RUN curl -sS ... | php

    • We are installing Composer for Laravel

  • RUN rm -rf /var/cache/apk/*

    • we are cleaning up the apk cache

  • RUN apk add shadow && usermod

    • we are adding the UID 1000 to www-data for php-fpm

  • COPY --chown=www-data

    • We are copying the application directory into the image with www-data ownership

  • USER www-data

    • Changing the current user to www-data

  • EXPOSE 9000

    • exposing port 9000 for internal use within the containers

  • CMD [“php-fpm”]

    • start php-fpm
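The APP_ENV check in the Dockerfile is ordinary POSIX shell. Here is a minimal, runnable sketch of the same guard (the variable value and the echoed strings are just for illustration); note the quotes around $APP_ENV, which keep the test from erroring if the variable is ever empty:

```shell
# Mirror of the Dockerfile's conditional: only "install" xdebug when the
# build is for a local environment. APP_ENV would normally come from the
# docker-compose build args.
APP_ENV="local"
if [ "$APP_ENV" = "local" ]; then
    action="install xdebug"
else
    action="skip xdebug"
fi
echo "$action"
```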

Now let’s create our default configuration files.

Create an nginx.conf file in the root of our docker/nginx folder with the following contents.

pid /run/nginx.pid;
worker_processes auto;
worker_rlimit_nofile 65535;

events {
multi_accept on;
worker_connections 65535;
}

http {
charset utf-8;
sendfile on;
tcp_nopush on;
tcp_nodelay on;
server_tokens off;
log_not_found off;
types_hash_max_size 2048;
client_max_body_size 16M;

include mime.types;
default_type application/octet-stream;

# logging
access_log /var/log/nginx/access.log;
error_log /var/log/nginx/error.log warn;

ssl_session_timeout 1d;
ssl_session_cache shared:SSL:50m;
ssl_session_tickets off;

# Diffie-Hellman parameter for DHE ciphersuites
ssl_dhparam /etc/nginx/dhparam.pem;

# OWASP B (Broad Compatibility) configuration
ssl_protocols TLSv1.2 TLSv1.3;
ssl_prefer_server_ciphers on;

# OCSP Stapling
ssl_stapling on;
ssl_stapling_verify on;
resolver 1.1.1.1 valid=60s; # substitute your preferred DNS resolver
resolver_timeout 2s;

# load configs
include /etc/nginx/conf.d/*.conf;
}
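One gotcha: the config references /etc/nginx/dhparam.pem, which the nginx:alpine image does not ship. One way to provide it (a sketch, not part of the original article) is to generate the file locally with openssl and then mount or copy it into the container, e.g. by adding ./docker/nginx/dhparam.pem:/etc/nginx/dhparam.pem to the nginx service’s volumes:

```shell
# Generate Diffie-Hellman parameters for the ssl_dhparam directive.
mkdir -p docker/nginx
# 1024 bits here only so the example finishes quickly; use 2048 or more
# for anything real.
openssl dhparam -out docker/nginx/dhparam.pem 1024
```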

Create a site.conf file in the root of our docker/nginx/conf.d folder with the following contents. This is a standard Laravel nginx config.

server {

   listen 80;
   listen [::]:80;

   # For https
   # listen 443 ssl;
   # listen [::]:443 ssl ipv6only=on;
   # ssl_certificate /etc/nginx/ssl/default.crt;
   # ssl_certificate_key /etc/nginx/ssl/default.key;

   server_name test.local;
   root /var/www/html/public;
   index index.php index.html index.htm;

   location / {
        try_files $uri $uri/ /index.php$is_args$args;
   }

   location ~ \.php$ {
       try_files $uri /index.php =404;
       # We are using our app service container name instead of an IP address as our connection
       fastcgi_pass app:9000;
       fastcgi_index index.php;
       fastcgi_buffers 16 16k;
       fastcgi_buffer_size 32k;
       fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
       # fixes timeouts
       fastcgi_read_timeout 600;
       include fastcgi_params;
   }

   location ~ /\.ht {
       deny all;
   }

   location /.well-known/acme-challenge/ {
       root /var/www/letsencrypt/;
       log_not_found off;
   }

   error_log /var/log/nginx/laravel_error.log;
   access_log /var/log/nginx/laravel_access.log;
}

I am using test.local as my local domain; be sure to add test.local (or your preferred local domain) to your hosts file.
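For reference, the hosts entry is a single line mapping the domain to localhost. On Linux/macOS (and inside WSL) the hosts file is /etc/hosts; on Windows it is C:\Windows\System32\drivers\etc\hosts. The snippet below appends to a demo copy so it is safe to run anywhere; pipe the echo through "sudo tee -a /etc/hosts" to apply it for real:

```shell
# Append the local-domain mapping to a demo hosts file.
echo "127.0.0.1 test.local" >> hosts.demo
cat hosts.demo
```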

Create a xdebug.ini file in the root of our docker/php/config folder with the following contents.

[xdebug]
xdebug.remote_enable=1
xdebug.remote_autostart=1
xdebug.remote_connect_back=0
xdebug.remote_host=host.docker.internal
xdebug.remote_port=9001
xdebug.idekey=PHPSTORM
The big takeaway with this xdebug configuration is setting remote_connect_back to 0, setting your idekey to PHPSTORM, and setting your remote_host to host.docker.internal. I ran into issues when enabling remote_connect_back instead of using remote_host. This has been tested on Windows 10 WSL and has been working really well. If you can’t get xdebug to work with this configuration, you may need to search around to find one that works for you.

Let’s build it

To build the environment, make sure you are in your project’s root folder, run docker-compose build, and let it run. When it is done, each image should report a successful build.

Now that PHP is built, let’s run docker-compose up -d. This command starts all of the containers in detached mode. If you want to run the containers while looking at the output logs, just run docker-compose up. If you are in detached mode and want to view the logs for a specific container, you can run docker-compose logs app (substituting the container name).

Installing Laravel

Now that our containers are running, let’s install Laravel. Run the following command to jump into the container: docker-compose exec app bash.

Now that we are in the container, run the following command to install Laravel: composer create-project --prefer-dist laravel/laravel . Setting the directory to (.) just means we are going to install Laravel inside the current directory.

Great, Laravel is installed! Now open up test.local (or the domain you set in the nginx config / hosts file) and you will see the Laravel welcome screen.

Setting the Laravel MySQL environment variables

Just like in the nginx config, where we set the fastcgi_pass to app:9000, we are going to do a similar thing with the MySQL environment variables: the database host is the database service’s container name. Update your Laravel .env like so (the credentials match the ones set on the database service):

DB_CONNECTION=mysql
DB_HOST=database
DB_PORT=3306
DB_DATABASE=laravel
DB_USERNAME=user
DB_PASSWORD=secret
Now that we have the environment variables set, let’s exec back into our container (docker-compose exec app bash) and run the default migrations (php artisan migrate).

Creating a PHP Remote Debug configuration in PHPSTORM

To enable xdebug in PHPSTORM, follow these steps:

  • In the top right of PHPSTORM there is a drop-down; click add or edit configuration

  • Set the name (Can be anything)

  • Select Filter debug connection by IDE key


  • Click the servers button to the right of the server dropdown

  • Set the name of the server (Can be anything)

  • Set the host to your local site url (mine is test.local) Port 80, Debugger Xdebug

  • Open the File/Directory selection until you see the www folder

  • On the right where it says absolute path on the server click to edit it

  • Enter /var/www/html which is the remote path and hit enter on the keyboard to save it

  • Apply the changes, then hit OK on the servers menu and then the debug menu

To test this, go into the web.php routes file and add the following code:

Route::get('/', function () {
   $word = 'Debugger';

   $word .= ' In Action';
   echo $word;
});
Set a breakpoint on the first $word variable and you will now be able to step through the code!

Sweet! Docker is set up, Laravel is installed and configured, our migrations have been run, and our xDebug configuration for PHPSTORM is ready!

Now all that's left to do is get building!

All Hail the Office Alternate Tue, 14 May 19 10:22:29 -0500 When you hear the title “Executive Assistant,” you probably think coffee, cleaning, sticky notes, and some sort of bowl with goodies in it. If we’re being honest, you’re not wrong. Being an assistant DOES include all of these things. What people don’t realize, however, is how much more the job title actually consists of.

To better explain what it truly means to be an assistant, I feel as though I must turn to the world of sports. We all know the term “alternate,” but unless you’ve been one, you probably don’t understand how important their role is to the success of the team. 

As an alternate on a sports team, you have to be ready to jump in anywhere, at any time, for any team member. In an office setting as an assistant, it’s not much different.  You jump in whenever, wherever, for whomever, and you get it done (even if you have no idea what you’re doing). If something needs to be scanned, you scan it. If an invoice needs to be paid, you pay it. If the 15 grown men you work with need snacks and candy to function, you go get them snacks and candy.  If the walls need to be painted, you paint them (even in your vintage leather boots).

We’re VIP members at Sam’s Club, Amazon Prime experts, and the only people without a barista title that can sling coffee like no one’s business.

As assistants, “coming through in the clutch” is literally our job. We fill in the holes where something is missing. We’re the rookies that make the buzzer beater at the end of the game to get that W.

Whether we’re organizing, paying people, buying snacks, or wiping down the conference table for a last-minute meeting, we’re always striving to make everyone’s life in the office a little easier. 

If you’re in an office setting with an assistant today, give them a little extra love. If you’re an assistant yourself, keep doing your thing and assisting your coworkers in getting that daily W.

Battle Stations Fri, 12 Apr 19 15:59:21 -0500 When liquidfish hears "man your battle stations," this is what we see...

Agile is Cool! Wed, 27 Mar 19 16:48:40 -0500 Many of you looking at this title might think,

“Just why is Agile cool, and what is Agile anyway?”

Here at liquidfish, we do a lot of amazing work, including branding, marketing, SEO, and software development. Today I am going to talk about the last of these. Some people think that software development is a completely predictable process, but this is a misconception. Let me explain.

When a client asks for an application (website, mobile, etc.), usually there is a big, exciting vision. In an ideal world, software developers would spend significant time breaking down all the pieces, documenting each detail and requirement, writing a plan, and finally, using previous experience, estimating the total hours and cost for the project. But writing everything down could take forever. Clients usually want to know how much a project will cost and how long it will take before paying for anything or entering a contract. As you can see, we have a vicious cycle: clients prefer to have promises before paying, and developers would like details and data before making promises. Someone needs to step out of the circle.

This is usually the project manager’s job—breaking projects down into big pieces and providing rough time estimations. Because of our team’s great experience, we do a good job of providing realistic quotes to clients.

This is a traditional process, well known in the IT industry as the Waterfall approach.

Using this technique, we provide an estimated total cost for the project and establish the deadline for when it will be delivered. Waterfall monoliths can be ideal for short-term projects that require a strict deadline and budget. The liquidfish team is committed to being overachievers and keeping processes simple for our clients.

At this point I imagine you have a question:  “So, what is the problem? And still you have not answered: What is Agile and why is it so cool?” We will get there, I promise. First, I want to show the potential downside of this traditional process from a coder/developer’s perspective.

When Waterfall is not cool.

As our skills and expertise have grown exponentially, bigger fish with more complex projects are swimming to our current. Complex projects require many months of continuous coding, often without feedback, as clients await the final, finished product. The goal of most projects is nicely described and packaged in the beginning by project managers and clients alike. But while they look nice, these descriptions can be fraught with peril for a coder. Lots of positive emotions at project inception can overshadow and obscure issues that will not become clear until well past the initial planning stage. It’s no one’s fault; it’s just one of many steps of the process. And it’s normal for multiple people to have different interpretations of the same information. As I’m sure you can understand, a misunderstanding during the initial time-estimation step can bring unhappiness later for both coders and clients.

Another challenge with the Waterfall approach is the potential lack of clear understanding on the coders’ part regarding the details of clients’ specific business processes. To successfully automate a process, we must know all the rules. Software requires precise checks on all conditions to make the right decision (output). Usually people don’t think in this way, and they don’t need to. In real life, the business process is easily executed heuristically. But for a program, every exception and detail must be accounted for. Of course, if the detailed and unique business processes were fully known up front for each client, accurate estimation would be a lot easier. However, this is rarely, if ever, the case. And even if it were, the understanding or the desires can change without the coders’ knowledge.

A lot of the time, our project depends on other services. Those services can significantly change the way our project works, or change their pricing, and we would need to change our programming to keep interacting with them. Other times, for various reasons, the multiple services are incompatible, or create environments that are completely unpredictable. We can’t remove this factor. The biggest misconception here is that software development is just about writing code that functions. A big chunk of the mission is to clearly understand the exact “whats and whys” of what will be written. Only then can top-notch code be efficiently designed and built. Changing something after the whole project is done can be very expensive. Monolithic waterfall processes can result in wasted money and time, for both designer and client. And this is a big risk that I will talk about later.

Summarizing this, we can highlight four big challenges with Waterfall approaches:

●    Unclear global goals that may change over periods of time

●    Misunderstanding of needs with lack of direct contact between coders and clients

●    Lack of knowledge of customers’ unique business logic

●    Dependencies on third-party applications

So, what is Agile?

I think you have guessed that I will propose it as a solution that can cover these challenges. Agile is an approach/methodology that eliminates the pattern of a waterfall monolith. Many different approaches fall under the Agile umbrella, but it has a lot to do with providing micro-services throughout a project. There is even an Agile manifesto. It’s cool, right? :)

An Agile approach has different focus points: Scope changes and “fast failure” are encouraged. Any scope change is processed as a natural part of development. The understanding of how the process is envisioned to work can change with new insights or even because of external factors (economic, political). Feedback is required at regular intervals. Feedback from real users can bring a lot of helpful and important changes in understanding. As a developer, I can say it’s always easier to build something right now than to change something that was done a long time ago. The purpose of failing fast is to get fast feedback while correction is still cheap. Regularly presenting a demo of the current status to product owners and business stakeholders helps them make sure it is as requested, and even more important, determine whether progress is indeed in line with what they really want.

Agile processes involve iterative development. There will be many small steps toward the completion of a colossal project. Customers don’t need to wait for the totality of massive documents to be finished before review. Finalizing requirements can be done as the project goes along. There is a huge benefit to being able to start testing as soon as possible. The ability to respond to feedback and adapt immediately saves everyone time and effort. This is also a benefit because final documents are usually burdensome to read anyway (and if a document isn’t massive, it usually doesn’t include everything, creating a big risk of misinterpreting the requirements).

With Waterfall, there is high upside potential for continued business coupled with high downside risk that can ultimately affect costs. Agile procedures remove the risk of unknown costs and delay by encouraging the principle of failing fast to allow for immediate adaptation in cost, scope, and timeframe. The idea is to reduce downside risk while keeping upside potential.

Iterative development can be a little bit tricky, of course. Projects will be divided into pieces that can get complicated: various login screens, lists of users, downloadable documents throughout the process, etc. But developers can implement access levels piece by piece, incrementally. Regular feedback is key. This is what makes the process iterative. Otherwise there is the risk of creating multiple mini-waterfall monoliths. Just doing sprints or iterations doesn’t make for Agile development. It’s the attention and adaptation to feedback that breeds efficiency.

Agile development allows for adaptive reprioritization toward what can bring the most value to our clients. Iterative development allows our clients to receive a product before it is finished. Sub-portions of final products can be pushed to production and tested by users, or at least reviewed by senior managers. This allows midstream corrections from the people whose feedback is the most valuable.

To summarize, I want to repeat the key principles of Agile:

●    Highest priority is to satisfy customer needs rather than quickly complete a project

●    Coders welcome changes early on and throughout the project

●    Frequent feedback from end-users improves and transforms project parameters

●    Iterative development is encouraged with fully transparent costs for changes

And that, dear readers, is why this software developer thinks Agile is cool! It’s fine to disagree. There are valid arguments and feedback on all sides. I’d love to hear yours.

Client Side Search with Fuse.js Thu, 14 Mar 19 14:27:02 -0500 Search is arguably one of the most important elements of a successful website. After all, it doesn't matter if the information is there if you can't find it.  Often, it is useful to implement several search strategies on a site, each of which can be optimized for the specific data being searched. Today, I would like to talk about fuse.js, a nifty little fuzzy search utility that can be easily incorporated into any site.

Fuse.js is a lightweight JavaScript fuzzy search library written and maintained by Kiro Risk.  As of this writing, the current version, 3.4.4, weighs in at a mere 10.9 KB. Fuse.js, like many of the best dev tools, is open source, released under the Apache License, Version 2.0. In addition to its small size, fuse.js has the advantage of having no external dependencies.

In a nutshell, fuse.js takes as its input a set of data in JSON format, some configuration information (also JSON formatted), and a search string.  To be more precise, an instance of the Fuse class is instantiated with the input data and config, then the search method of that instance is called with the query as its argument.  The input should be in the form of an array of identically structured JSON objects, since we will be telling fuse.js which parts of each object to consider in the search as part of the configuration. Fuse.js returns an array of matches, filtered by the query string in accordance with the search settings.  These output results can be the full original objects, or, if the records have a unique key, that key can be identified in the search configuration and the results can simply be an array of those ids.
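
To make that contract concrete, here is a dependency-free sketch. This is not fuse.js itself: the real library scores matches with a fuzzy (Bitap-style) algorithm, while this stand-in, with made-up book data, substitutes naive case-insensitive substring matching. Only the input/output shape (an array of records, a keys/id config, results as full objects or bare ids) mirrors what fuse.js does.

```javascript
// Toy stand-in for the fuse.js contract described above. fuse.js itself uses
// a Bitap-style fuzzy matcher; a naive case-insensitive substring test is
// substituted here so the sketch has no dependencies. The book data is made up.
const books = [
  { id: 1, title: 'Moby Dick', author: 'Herman Melville' },
  { id: 2, title: 'The Old Man and the Sea', author: 'Ernest Hemingway' },
  { id: 3, title: "Old Possum's Book of Practical Cats", author: 'T. S. Eliot' },
];

function toySearch(records, { keys, id }, query) {
  const q = query.toLowerCase();
  // Keep a record if the query appears in any of the configured fields.
  const matches = records.filter(record =>
    keys.some(key => String(record[key]).toLowerCase().includes(q))
  );
  // Like fuse.js with a unique key configured: return bare ids
  // instead of the full original objects.
  return id ? matches.map(record => record[id]) : matches;
}

console.log(toySearch(books, { keys: ['title', 'author'], id: 'id' }, 'old')); // prints [ 2, 3 ]
```

With the real library, the equivalent call would be `new Fuse(books, config)` followed by `.search('old')`; the shape of the data going in and coming out is the same.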

To configure fuse.js, you pass in a configuration object when you create your fuse.js instance.  Although there is not a huge number of configuration options, there are enough to tune the search in several ways.  Some options such as "caseSensitive", "tokenize", "location", "threshold" and "keys" allow you to specify exactly how the data will be searched.  These determine what will be taken into consideration when deciding which records are matches. Other configuration options such as "includeScore", "includeMatches" and "sort" affect the format of the output.  I won't go too deep into the meaning of each of these parameters, but will note that the main parameters determining the "fuzziness" of the search are "location", "distance" and "threshold".

The fuse.js homepage explains all of the parameters in greater detail, but even better, it allows you to experiment with them live.  You can either use the sample data included in the page, a list of book titles and authors, or paste in your own JSON-formatted data.  As you change the selected configuration options, two output windows are updated in real time, one showing the JavaScript required to instantiate and run the query and the other showing the search results.  This provides a quick method for experimenting with the various configuration parameters, and when you have the parameters tweaked to your liking, you can just copy and paste the configuration block into your code.  

Fuse.js might not make sense for every situation, but it can be ideal depending on your search requirements.  I am currently using it in what is essentially a single-page app within a larger website. Each user has their own data and the number of records per user is relatively small (currently the maximum size is under 400 records), so fuse.js is more than adequate for my purposes.  What's more, the data I needed to search was already being loaded in the page, since Vue.js was being used to generate it. Previously, the same search was using Algolia, which is a great service but was overkill for this particular use case. With few records per user but thousands of total users, the Algolia index was huge and growing steadily.  This, coupled with the fact that this was a convenience feature that not all users needed or used, prompted me to seek an alternative, and fuse.js was the perfect fit.

Fuse.js worked so well for me that I never tried any of the other JavaScript search libraries that are out there, but if you find that fuse.js doesn't quite cut it for you, you might give one of the following a try: js-search by Brian Vaughn or fuzzy by Matt York.

How Allen Iverson Convinced Me to Join an Art Show Wed, 27 Feb 19 14:45:58 -0600 In the previous year, I had some monumental changes in my life. I settled into a new job, and I found a groove with my babies that helped to ease the newness of fatherhood. After the initial shock to the system from all the change, I found myself thinking about the stagnation in my personal growth. As January 1st, 2019 came and went, I began conceptualizing ways to push myself. God (or “The Universe”) heard my inner plea, and decided to test my willingness to try something I’ve never done before; and I accepted.

A mutual friend on Facebook posted about an art show he was hosting that needed artists to submit work. I have been to a couple of his shows in the past (generally a mash-up of different pop culture themes) and I’ve always left with the question: “If I were to do this, what would I do?” This time, the topic of the show, NBA legend Allen Iverson, was enough for me to give it a shot. 

“This Bronze is Worth More Than Gold: An Art Show for Allen Iverson and All Unsung Heroes”, is a show intended to “highlight overlooked greatness. This includes athletes who never won rings, directors who never won Oscars, presidential candidates that came up short and whatever else our city's talented artists can come up with.” To me, there was no doubt about what I wanted to tackle. Allen Iverson was one of the first black athletes who was unapologetically black. Michael Jordan, Eddie Murphy, and Michael Jackson spent the 80’s proving that black people could profit millions in the sports and entertainment industry. They were phenoms who alerted Corporate America that black artists and entertainers could in fact push pop culture. Even though they were black icons, they were polished, branded entities. None of them totally embraced the culture and the essence of blackness as it stood. The 1990’s broadened that scope for more African Americans to enter the superstar stratosphere behind them, but it was a time when, if you wanted to be the face of a brand and you had a black face, you had to be clean-cut and media-savvy. Non-threatening toward a white audience. Iverson was not that in the slightest.

From Hampton, Virginia, Iverson was an east coast guy. At the time, east coast rap and style dominated the music scene. He was drafted in 1996 by the Philadelphia 76ers, a franchise whose fans are blue collar and traditionally known for supporting the athletes that played hard with their city’s name on their chest. He was dubbed the “anti-hero” during the final stretch of Jordan’s second 3-peat with the Chicago Bulls. Iverson challenged the establishment on the court as much as he did off of it. At his peak, his street style was on full display as he accepted the MVP trophy in 2001 in baggy sweats and a du-rag. That moment of unmitigated blackness prompted the league commissioner to mandate a dress policy to suppress a style that had already permeated a league that was 80% black, in an attempt to make his athletes appear more “professional”. Before Iverson, only rappers of the thuggish-ruggish variety wore cornrows. After Iverson, suburban, middle-class black kids were wearing cornrows. Even some white kids wore them too. He was the first in the NBA to go all out tattooed. He was the first to debut a shooter sleeve and 3/4 to full-length compression tights (a style now commonplace in the NBA and on the blacktop). He was the first in a line of many shoot-first point guards, whose strength lay in his ball handling, quickness, and midrange shooting, in a league that would soon value analytics and high percentage shots. Before Black Lives Matter started the tidal wave for self-love, appreciation, and “wokeness” amongst Black Americans in recent memory, Iverson was being himself, raw and uncut, before social media gave regular people platforms to do it themselves. And for that, he became a cultural giant.  

My take on Allen Iverson for this art show was to depict him as a Christ-like figure, looking up somberly to the heavens with a crown of thorns on his head. In sports, we value rings culture so much that we overlook those who gave it their all and fell short. The very thing that made him unique on the court also doomed him from obtaining a championship: a franchise committed to building around a high-volume-shooting, six-foot guard. My contribution to the show is to say that he may not go down as an NBA champion, but his legacy paved the way for the current crop of NBA point guards to be more aggressors than facilitators. Out of the dress policy made to stymie who he knew himself to be came the flowers that blossomed into the fashion renaissance we see on Instagram today. They all look like him and play like him on the court, while following his blueprint of branded individuality off of it. Some may see the image and understand the theme, but as a practicing Catholic, I know others might consider it to be sacrilegious. I don’t care. It’s art.

There’s only so much of the status quo you can take before it becomes stale. And if you were in middle school/high school in the mid 90’s, and you were black, you saw the impact Iverson and Reebok had that was an alternative from the clean cut, corporate, bald head, hoop earring of Michael Jordan. It was raw. It was real.

I only hope my piece reflects that.

Come see the show, Saturday, March 2nd at the Speakeasy Bar in Oklahoma City. 7pm!

Love and Creative Briefs Tue, 12 Feb 19 11:03:07 -0600 Valentine’s Day is upon us and love is in the air!  It’s the perfect time of year to strengthen the relationships in our lives - and here at a creative agency, few relationships need as much attention as those between content strategists and graphic designers.

If you want your graphic designer to love you, don’t be like Mick Jagger.  I know what you’re thinking: “Don’t be like Mick Jagger? He’s got the moves, the charisma, everyone loves him!”  Well, here’s the thing: he wrote the worst creative brief, maybe ever. Here it is. Give it a read. It won’t take too long.  

One of the most acclaimed artists of the 20th century, Andy Warhol, was given this barebones set of instructions to come up with album art for The Rolling Stones. The phrase “do whatever you want” is literally used.  I like to call that phrase, “How to Get Dumped by Your Graphic Designer.” Any designer worth wooing will tell you there is nowhere near enough information in this brief to make a deliverable the client will approve of.  So that begs the question, what do you need to include when writing a brief? Most importantly, you need balance. 

There are certain details that are considered brief must-haves: dimensions, medium (e.g., an Instagram ad or LinkedIn post), a general concept of color scheme or font style (for a pure graphic), the right Shutterstock search (for a photo), and the most refined version of any copy the graphic will have. However, you don’t want to put your designer in a box.  That’s where balance comes in. If you love someone, set them free. Especially the creatives. They need room to spread their wings and show you what they can do when given the freedom to create. “Do whatever you want” gives the designer freedom, but doesn’t show them that you care.

In short, if you want your designers to love you back, give them what they need, but don’t hold them down.  Let them be themselves, but make sure they know you’re there for them.

At the end of the day, despite Mick Jagger’s creative brief shortcomings, here’s what Warhol made: one of the most iconic album covers ever.

Happy Valentine's Day, XOXO liquidfish


Planning for the Unplanned Wed, 30 Jan 19 11:01:00 -0600 Schedules are hard.

Really hard.

The dictionary defines a schedule as "a series of things to be done or of events to occur at or during a particular time or period."

In my experience, a schedule is a list of items that you expect to do, but then everything goes haywire and you end up doing things that you either weren't expecting to do, hadn't planned on, or simply forgot.

Schedules are hard!

Missing plans, cancelling plans, forgetting that you were supposed to be the clown at your niece's 8th birthday party, and then remembering 6 months later.
It strains relationships.
It causes people to lose trust.
It could be why the last time you visited your sister, your niece kicked you in the shins.

I've been there, and those kicks hurt! Fortunately, this is shortly after the new year, and with new years come new resolutions! So throw away that plan to "eat healthier" or "exercise more" that you aren't going to keep, and resolve to plan smarter! Let's start with the basics:

Start With A Plan(ner)

The obvious first step is to set up a planner. If you don't already have one, a planner can help you keep in mind what's happening when, but a more likely issue is that you don't remember to write it down. Remembering to write things down can be hard, especially if you're already having a hard time remembering what you're supposed to write down to be remembering.

What can we do about that?

Well, there are any number of handy scheduling apps, calendar apps, and the like. With everyone carrying a portable planner anyway, it's easy to remember that when you set up that meeting over text, you can switch over to your calendar and set up a timeframe.

What about in-person plans?

These are a bit harder to remember, what with adorable little 7-year-old nieces running around and having tea parties. I've played around with methods and found that the easiest way to remember is to keep your calendar on your home screen, so that when you open your phone again, you're reminded to make a plan for whatever it was that was scheduled. Repeat until second nature.

Buffer Your Plans With Plans

This is easier to grasp than to follow, and it has to do with planning an amount of time around your planned time. I've found that 30 minutes before and after a planned item is ideal. 1-hour meeting? Plan for 2 hours. 30-minute lunch? Plan for 90 minutes. The extra 60 minutes isn't a hard-and-fast rule; you can adjust it to your liking or expectations. This isn't to extend your plans, but to make them more malleable. It's far easier to spend 90 minutes in a 60-minute meeting if you've already expected it to be 90 minutes. This will keep you from overextending your days and finding yourself unable to complete everything.

Make Time For You

This one is imperative and almost requires a separate article on its own. It's of vital importance that you take some time out of every day to relax, unwind, and focus on things that make you happy. It can be early, with a small nap; it can be late, with a nice show; it can be sprinkled throughout the day, in the in-between time that you have from your buffers. Remember that the reason you make these plans is that people like you and want to spend time around you, so it helps that you like you too.

As I wrap up, and get ready to head off to my niece's 9th birthday party in a clown outfit, I'd like to touch on cancelling and rescheduling:

Rescheduling - Sometimes all of this planning amounts to having to push things back. It's no big failure to communicate that you need to pick a better day. The sooner you mention it, the easier it'll be to take.

Cancelling - It happens. Sometimes something comes up and you can't reschedule a plan. A birthday party comes to mind, but maybe you need to spend time with a friend who's grieving, or maybe work was just too much. It's better for you to admit when you're overextended.

These two things are unfortunate, but so necessary to keeping your schedule, and mind, stable.

Happy scheduling!

Title: ($dogs > $humans) ? true : "WRONG"; Fri, 18 Jan 19 10:52:23 -0600 Did you know spending quality time with your dog can help with stress, anxiety, depression, loneliness and other "down-and-out" feelings? They can also increase exercise and cardiovascular health when you play with them. The one thing that separates dogs from our other pets is the love they can give back to us.

Dogs learn how to be more human, just as we learn how to be more dog. If you have the joy of a cuddly pup in your life, you'll know what I'm talking about. We learn what their tics and movements mean, just as they learn our tone of voice, facial expressions and commands. With this type of communication, the bond between man and canine is unlike any other.

Studies have found that people with a canine companion tend to have lower stress, depression, blood pressure, and cholesterol levels, and even fewer doctor visits on average (by about 30%). All of this just from lying around with your best friend next to you. The benefits are especially obvious for those living with some sort of disability or ailment. The training and skill set these pets have is truly remarkable. I know a few of my friends wouldn't have the same quality of life if it weren't for their service pets.

With a strong animal bond, you can escape the realities and annoyances of humanity. I know my Rosie specifically likes watching soccer and listening to music. I know when she needs to go outside, is excited, anxious, hungry, sleepy, happy, scared, lonely...all of them. ROSIE LIVES!