Need for Speed, COVID-19 and Data Cleanses

Over the last two months, things have gotten crazy, no thanks to the coronavirus. One of the things I was able to launch relatively quickly was the Coronavirus Press, which segments headlines by topic, like statistics and political/financial impact. The press engine does its thing every day, and it’s a low-maintenance project.
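For flavor, the segmentation boils down to something like a keyword-bucket approach. This is a minimal Python sketch; the topics and keywords are hypothetical stand-ins, not the engine’s actual rules:

```python
# Hypothetical topic buckets -- the real engine's rules differ.
TOPICS = {
    "statistics": ["cases", "deaths", "curve", "rate", "count"],
    "political": ["senate", "congress", "governor", "policy"],
    "financial": ["market", "stocks", "unemployment", "stimulus"],
}

def segment(headline: str) -> str:
    """Return the first topic whose keywords appear in the headline."""
    lowered = headline.lower()
    for topic, keywords in TOPICS.items():
        if any(word in lowered for word in keywords):
            return topic
    return "general"

if __name__ == "__main__":
    print(segment("Stimulus checks clear the Senate"))  # -> "political"
```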

Two weeks ago, traffic spiked to nearly a thousand users per day, which was incredibly exciting … until I saw the acquisition sources for all of them, which were paid channels (read: spam traffic). Enter the data cleanse.

After combing through the days prior to the traffic surge, I noticed a pattern from the spam bots that were trying to simulate user growth. Once I set up a few filters to normalize the GA data, things looked a lot more, well, depressing, with the numbers nearly cut in half. Scott Belsky touched on the vanity-numbers complex in his book, which made me feel a tad better, but still … no bueno.
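For the curious, the cleanse amounted to something like this, sketched in Python against a CSV export of session data. The column name and spam domains are placeholders, since the real filters live inside GA itself:

```python
# Rough sketch of the cleanse against a GA CSV export.
import pandas as pd

SPAM_SOURCES = {"bot-traffic.xyz", "free-share-buttons.com"}  # hypothetical

def cleanse(path: str) -> pd.DataFrame:
    """Drop sessions whose referral source matches a known spam domain."""
    sessions = pd.read_csv(path)  # assumes a "source" column in the export
    mask = ~sessions["source"].str.lower().isin(SPAM_SOURCES)
    return sessions[mask]

if __name__ == "__main__":
    clean = cleanse("ga_sessions.csv")
    print(f"{len(clean)} sessions after removing spam referrers")
```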

All this made me question whether it was a speed-related issue. GTmetrix, Google PageSpeed Insights, and Pingdom have been pretty kind, grade-wise. However, there’s always room for improvement in web development land, and I’ve mulled over just cutting out most of the JavaScript completely. A lot of the functionality I’ve built into my press engines hasn’t been fully utilized and comes at a cost of about 200-300 KB worth of page data … also no bueno. Can I cut all of it out? Sure. Would it improve loading time? Heck yeah. Am I going to do it? Not sure.
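If you want to sanity-check a figure like that on your own pages, a rough tally of external script weight looks something like this. A Python sketch with a placeholder URL; it ignores inline scripts, compression, and caching:

```python
# Tally the bytes of every external script a page pulls in.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def script_payload(url: str) -> int:
    """Sum the size, in bytes, of a page's external scripts."""
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    total = 0
    for tag in soup.find_all("script", src=True):
        resp = requests.get(urljoin(url, tag["src"]), timeout=10)
        total += len(resp.content)
    return total

if __name__ == "__main__":
    kb = script_payload("https://example.com") / 1024  # placeholder URL
    print(f"~{kb:.0f} KB of JavaScript")
```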

Now, it’s on to forecasting whether a JS-free network would have a bigger impact on traffic than pumping out original content. After all, I just remembered I’m doing my own data dumps every few weeks to keep my databases up and running. I’m leaning toward the latter. But there’s no clean way to test both simultaneously, and that’s the hardest decision to make.