Under the Hood

I frequently look at my browser’s developer tools to see what’s going on when the website I’m on is slow:

  • Multiple analytics scripts referenced, some from companies that are no longer around
  • JavaScript errors galore
  • Huge files transferred for no reason
  • Timeouts, timeouts everywhere

There are some easy fixes that can speed up your web applications, APIs, or even your desktop applications.

Today’s business solutions consist of multiple layers that exist simply to make data look pretty.  Unfortunately, poorly performing code can hide in any of those layers.  Even users of an intranet application expect Google-like speed instead of accepting that an application’s functionality takes minutes to load.

Database

“No matter how awesome your code is, if your company’s database is slow, then your solution will run like a slug.”

On a previous project at Moser, a web application needed a few features added.  There was a development database whose main table had around 25,000 rows.  This was my first task at this client, so I wanted to do a good job.  I wrote some code in a few days, and I was happy.  The web application was fast, and nothing was broken.

The next step was to promote my changes to a production environment.  The release was done over the weekend, and when I went through to make sure my changes were working, I was stunned.  I did not have access to the production environment prior to this release, but simple search results returned in minutes instead of milliseconds.  The reason?  There were well over 50 million main records.  Yikes!

After talking to the client, they agreed that this was an issue and hours were wasted daily when users tried to use the application.


With today’s database tools, it’s easier to isolate problem queries.

  • Have a scheduled maintenance period to keep your data up to date.
    • With frequently changing data, it’s important to rebuild indexes and statistics. When new columns and tables are added, make sure those indexes are part of your maintenance routines.
  • Consider caching small, rarely changing datasets used in join statements
    • A great example would be a table of states (Alaska, Alabama…)
    • Create a temporary table with this data and reference its keys directly, instead of making thousands of reads against the same data.
  • Use database tools to help find bottlenecks and poor performing relationships
    • Look for a high number of reads when querying or joining data. These are great candidates for an index
  • Denormalize your data
    • Sometimes your data is massive, and any query comes back slowly.
    • If results can be a few hours old, consider a process that pushes the entire dataset into a separate table with no relationships. Add indexes after the data has been populated.
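The states example above can be sketched as a tiny in-memory cache. This is a minimal TypeScript sketch, not the author’s actual implementation — `fetchStatesFromDb` is a hypothetical stand-in for a real database call — showing how a single read can serve thousands of lookups:

```typescript
// Cache a small, rarely changing lookup table (e.g. states) in memory
// instead of hitting the database on every join or lookup.

type State = { code: string; name: string };

let dbReads = 0; // counts simulated round trips to the database

// Hypothetical stand-in for a real query against the States table.
function fetchStatesFromDb(): State[] {
  dbReads++;
  return [
    { code: "AK", name: "Alaska" },
    { code: "AL", name: "Alabama" },
  ];
}

let statesCache: Map<string, string> | null = null;

function stateName(code: string): string | undefined {
  if (statesCache === null) {
    // The first lookup pays for one query; every later lookup hits memory.
    statesCache = new Map(
      fetchStatesFromDb().map(s => [s.code, s.name] as [string, string])
    );
  }
  return statesCache.get(code);
}

console.log(stateName("AK")); // Alaska
console.log(stateName("AL")); // Alabama
console.log(dbReads);         // 1 — many lookups, one read
```

A real implementation would also invalidate the cache when the table changes; for data this static, that can be as coarse as a daily refresh.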

Service Layer

Translating data tables to objects is a fast way to reduce the number of handwritten queries needed for simple CRUD (Create, Read, Update, Delete) operations.  Tools today can write your entire API layer in less than a minute.

Be careful: complex operations can still slow your application to a halt.

  • Make use of deferred execution of queries.
  • Reduce the number of queries made to your database. Each query can involve opening a connection, running the query, then closing the connection.
    • Dispose of your database connections. Open connections pile up quickly as concurrent users grow.  Don’t rely on garbage collection to take care of this for you.
  • Consider creating a custom function to pull only the data needed for a request
    • Listing files for a user should never include the file data itself. Use a separate call to pull large data only when requested
  • Use your database for what it’s good at, serving data!
    • Handling set-based work in stored procedures is often much more efficient than doing it in the service layer
    • Consider scheduling jobs in your database, instead of writing code to do the same thing
  • Page large datasets
    • Do not execute a query returning thousands of records only to show a small subset. Return just that subset whenever possible.
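The paging point above can be sketched as follows. This is a hypothetical `queryPage` helper in TypeScript; in a real service layer the slicing would be done by the database itself (OFFSET/FETCH or LIMIT/OFFSET in a parameterized query), not in application memory:

```typescript
// Server-side paging sketch: hand the caller one page of results plus a
// total count, instead of the whole result set.

type Page<T> = { items: T[]; page: number; pageSize: number; total: number };

// Hypothetical helper; `rows` stands in for a query result. A production
// version would push the skip/take into the SQL query.
function queryPage<T>(rows: T[], page: number, pageSize: number): Page<T> {
  const start = (page - 1) * pageSize;
  return {
    items: rows.slice(start, start + pageSize), // only one page crosses the wire
    page,
    pageSize,
    total: rows.length, // lets the client render page controls
  };
}

const rows = Array.from({ length: 50_000 }, (_, i) => ({ id: i + 1 }));
const p = queryPage(rows, 3, 25);
console.log(p.items.length, p.items[0].id); // 25 51
```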

API Layer

With multiple ways for your customers to access your data, the API layer lets different devices and services reach the same logic and data.  For me, the API layer is nothing more than a presentation layer for nerds.

  • Return only the data needed. Unused properties only increase network traffic (which can add up when using the cloud)
  • Require paging where it makes sense. This should be the hook into your service layer to only return a subset of data.
  • Do not place any business logic in the API Layer! Testing business logic should be done in the Service Layer, and not the API Layer.
  • Return responses using the correct HTTP Codes.
  • Consider caching responses for small, rarely changing datasets
    • Set a cache refresh time that matches how often the data actually changes
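The response-caching idea above can be sketched like this. It’s a minimal TypeScript sketch, assuming a hypothetical `cached` helper with a time-to-live; real APIs would often use HTTP cache headers or a shared cache instead of process memory:

```typescript
// Time-based response cache for small, rarely changing datasets. The TTL
// should roughly match how often the underlying data actually changes.

type Entry = { value: unknown; expires: number };
const cache = new Map<string, Entry>();

// Hypothetical helper: return a cached value while it is still fresh,
// otherwise call `load` (a stand-in for the real service/database call).
function cached<T>(key: string, ttlMs: number, load: () => T): T {
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) {
    return hit.value as T; // fresh enough — skip the expensive call
  }
  const value = load();
  cache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}

let loads = 0;
const getStates = () =>
  cached("states", 60_000, () => { loads++; return ["AK", "AL"]; });

getStates();
getStates();
console.log(loads); // 1 — the second call was served from cache
```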

UI Layer (Web Applications)

Fortunately, some of the fastest fixes for a slow web application live in the presentation layer.  Use the UI layer as a pretty way to display data.  No business logic should live here, since the same logic should be shared with other presentation layers through the API layer.

  • Minify JavaScript, HTML, and CSS files where possible. Simply removing whitespace can shrink a file by around 30%.
  • Bundle minified JavaScript and minified CSS files into a larger file instead of multiple smaller files. Use tools like webpack for easy integration.
    • Careful – avoid using one file for your entire application. Bundle globally shared site templates separately, and serve only the code needed for each page’s functionality.
  • Inline small images as Base64 data URIs to avoid extra requests. I’ve used base64-img.de to convert images into direct references.
  • Fix broken JavaScript files and other broken resources.
    • JavaScript errors and 404 errors aren’t helpful for SEO purposes.
  • Understand and implement correct HTTP Codes.
  • Use compiled static HTML files instead of server-rendered code. Most modern web frameworks compile out to a distribution folder.
    • Use JavaScript to load data instead of a back-end framework that forces a complete page refresh. Asynchronous development lets developers load multiple pieces of data concurrently instead of waiting for an entire page to load.
    • Static HTML files are extremely fast
  • Make use of your browser’s developer tools (usually opened by pressing F12) to find issues as you work on your web applications.
  • Google has a great tool to help identify site issues, which will help with SEO rankings. It covers both desktop and mobile user experiences.
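The concurrent-loading point above can be sketched with `Promise.all`. This is a hypothetical TypeScript sketch — `loadUser` and `loadOrders` stand in for `fetch()` calls to an API — showing that two overlapping waits cost about as much as the slowest one, not their sum:

```typescript
// Load several pieces of page data concurrently instead of one blocking,
// full-page round trip.

const delay = (ms: number) => new Promise<void>(r => setTimeout(r, ms));

// Hypothetical stand-ins for fetch() calls to an API, each ~50 ms.
async function loadUser() { await delay(50); return { name: "Pat" }; }
async function loadOrders() { await delay(50); return [1, 2, 3]; }

async function renderPage() {
  const start = Date.now();
  // Both requests are in flight at once, so the total wait is roughly
  // the slowest request (~50 ms), not the sum of both (~100 ms).
  const [user, orders] = await Promise.all([loadUser(), loadOrders()]);
  return { user, orders, elapsed: Date.now() - start };
}

renderPage().then(({ user, orders, elapsed }) => {
  console.log(user.name, orders.length, elapsed < 90);
});
```

In the browser, each loaded piece would update its own section of the page as it arrives, so users see content immediately instead of staring at a blank page.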

Wrapping Up

Hopefully, some of these suggestions will help your company deliver better value to your customers.

When I graduated from college (in the last millennium, yikes), my mother would call me a few times a week to ask for help learning about the internet.  This really helped me understand that users do not think like someone with a pocket protector.

I like to think about it this way: if my mother were using an application I built, would she call me asking why it’s running slow?