Leverage Browser Caching for 3rd Party JS

Leverage browser caching for 3rd party JS

An annoying issue, indeed, and not one that is easily fixable, I'm afraid. What you can do, however, is use a cron job.

Firstly, keep in mind that Google is very unlikely to penalise you for its own tools (like Analytics). However, as mentioned above, the warning can be fixed using a cron job: you serve the JavaScript locally from your own server and periodically pull an updated copy of the script.

How to do this:

First of all, you need to download the script that you're running. I will be using Google Analytics as an example (it appears to be the script people complain about most, but you can replicate this for any external script).

Look in your code and find the URL of the script; in our case it is google-analytics.com/ga.js. Pop this URL into your web browser and it will bring up the source code. Simply copy it and save it as ga.js.
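If you prefer the command line, something like the following would fetch it (assuming curl is available; Google also serves the file over HTTPS):

curl -o ga.js https://www.google-analytics.com/ga.js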

Save this newly created JavaScript file onto your web server; in my case it lives in a js folder:

- js/ga.js

Next, update the code on the pages that call the script so that it points at your local copy of the JavaScript file. Once again, in our case we will be changing this line:

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';

to

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.yoursite.com/js/ga.js';
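A side note: that ('https:' == ...) ternary exists because Google serves ga.js from both an ssl. and a www. subdomain, so after pointing it at your own site it would request ssl.yoursite.com over HTTPS, which most sites don't have. Unless you run such a subdomain, a root-relative path is a simpler sketch:

ga.src = '/js/ga.js';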

At this point, your site will run the script locally! However, this means the script will never update unless you re-run this short process regularly. That is up to you... but I'm far too lazy for that.

This is where the cron job comes into play:

Just about every hosting service has an option for setting up cron jobs. On Hostinger it is in your Hosting Panel; on GoDaddy you will find it under the Content option.

Put the following script into your cron; all you need to do is change the absolute path in the variable $localfile. The script pulls the updated ga.js from Google. You can choose how often the cron runs it, anywhere from once an hour to once a month and beyond.

If you're doing this for external files other than Google Analytics, you will also need to change the variable $remoteFile: $remoteFile is the URL of the external JavaScript file, and $localfile is the path to your new locally stored copy. Simple as that!

<?php
// Script to update the local copy of the Google Analytics script

// Remote file to download
$remoteFile = 'http://www.google-analytics.com/ga.js';
$localfile  = 'ENTER YOUR ABSOLUTE PATH TO THE FILE HERE';
// For cPanel it will be /home/USERNAME/public_html/ga.js

// Connection timeout (seconds)
$connTimeout = 10;

$url  = parse_url($remoteFile);
$host = $url['host'];
$path = isset($url['path']) ? $url['path'] : '/';

if (isset($url['query'])) {
    $path .= '?' . $url['query'];
}

$port = isset($url['port']) ? (int) $url['port'] : 80;
$fp   = @fsockopen($host, $port, $errno, $errstr, $connTimeout);

if (!$fp) {
    // On connection failure, return the cached file (if it exists)
    if (file_exists($localfile)) {
        readfile($localfile);
    }
} else {
    // Send the request headers
    $header  = "GET $path HTTP/1.0\r\n";
    $header .= "Host: $host\r\n";
    $header .= "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6\r\n";
    $header .= "Accept: */*\r\n";
    $header .= "Accept-Language: en-us,en;q=0.5\r\n";
    $header .= "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n";
    // Ask the server to close the connection so the read loop below terminates
    $header .= "Connection: close\r\n";
    $header .= "Referer: http://$host\r\n\r\n";
    fputs($fp, $header);
    $response = '';

    // Read the response from the remote server
    while ($line = fread($fp, 4096)) {
        $response .= $line;
    }

    // Close the connection
    fclose($fp);

    // Strip the response headers, keeping only the body
    $pos = strpos($response, "\r\n\r\n");
    if ($pos !== false) {
        $response = substr($response, $pos + 4);
    }

    // Output the processed response
    echo $response;

    // Save the response to the local file (fopen 'w' creates it if missing
    // and fails gracefully if the path is not writable)
    if ($fp = @fopen($localfile, 'w')) {
        fwrite($fp, $response);
        fclose($fp);
    }
}
?>
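To schedule it, a crontab entry along these lines would work, assuming you saved the updater as update-ga.php (the filename and path are placeholders for illustration):

# Run the updater every Sunday at 3am
0 3 * * 0 php /home/USERNAME/public_html/update-ga.php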

That is it; this should fix any "leverage browser caching" warnings you're getting for third-party scripts.

Source: http://diywpblog.com/leverage-browser-cache-optimize-google-analytics/

NOTE:

In truth, these files don't tend to have a great effect on your actual page speed, but I can understand the worry about Google penalising you. That would only happen if you had a LARGE number of these external scripts running, and as I stated earlier, anything Google-related will not be held against you.

How can I leverage cache control for 3rd party APIs and widgets in my htaccess file?

For setting cache expiration for the Google Analytics JS, the answer above is a good resource:
Leverage browser caching for 3rd party JS

For other scripts, download the JS files and host them locally on your server.

But I found a simpler way to do it if you have Cloudflare (with HTTPS set up).

Log in to your Cloudflare account, go to the Caching tab, and set Browser Cache Expiration to a minimum of 8 days. You are almost done: just Purge Everything, wait 5 minutes, and then test your URL in

https://developers.google.com/speed/pagespeed/insights/

If your speed still hasn't improved, try a URL shortener such as Bitly on any long URLs and test again; mobile speed in particular should improve.
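You can also verify that Cloudflare is actually serving and caching your assets by inspecting the response headers with curl (the URL here is a placeholder for one of your own static files):

curl -I https://www.yoursite.com/js/ga.js

Look for CF-Cache-Status: HIT and a Cache-Control max-age that matches the Browser Cache Expiration you chose.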

How to add Leverage browser caching for CDN in .htaccess?

If you want to leverage browser caching for a CDN, you cache files by adding caching headers such as Cache-Control, Expires, and Last-Modified.

Leverage Browser Caching using Mod_Headers

If you're on a shared server and your host won't enable mod_expires, you can still leverage browser caching by using mod_headers, which should be available.

# Leverage browser caching using mod_headers #
<IfModule mod_headers.c>
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
# Note: a hard-coded Expires date must lie in the future to have any effect
Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
Header set Cache-Control "public"
</FilesMatch>
</IfModule>
# End of Leverage browser caching using mod_headers #

Below are some example rules with different cache lifetimes for testing:

# 1 YEAR
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$">
Header set Cache-Control "max-age=31536000, public"
</FilesMatch>

# 1 WEEK
<FilesMatch "\.(jpg|jpeg|png|gif|swf)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>

# 3 HOUR
<FilesMatch "\.(txt|xml|js|css)$">
Header set Cache-Control "max-age=10800"
</FilesMatch>

# NEVER CACHE - notice the extra directives
<FilesMatch "\.(html|htm|php|cgi|pl)$">
Header set Cache-Control "max-age=0, private, no-store, no-cache, must-revalidate"
</FilesMatch>

Testing The Headers

You can verify that the Cache-Control: max-age header is in place on your files by running a curl command like:

curl -I http://foo.bar.netdna-cdn.com/file.ext

HTTP/1.1 200 OK
Date: Tue, 16 Sep 2014 14:12:20 GMT
Content-Type: text/css
Connection: keep-alive
Cache-Control: max-age=604800, public ← 1 Week caching time
Expires: Thu, 21 May 2015 20:00:00 GMT
Vary: Accept-Encoding
Last-Modified: Thu, 24 Jan 2013 20:00:00 GMT
Server: NetDNA-cache/2.2
X-Cache: HIT

Alternatively, you can use the code below:

Browser Caching using Mod_Expires
The most common way to leverage browser caching is to use mod_expires. The following code can be added to your .htaccess and will automatically enable browser caching for all users.

# Leverage browser caching using mod_expires #
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/pdf "access plus 1 month"
ExpiresByType text/x-javascript "access plus 1 month"
ExpiresByType application/x-shockwave-flash "access plus 1 month"
ExpiresByType image/x-icon "access plus 1 year"
ExpiresDefault "access plus 2 days"
</IfModule>
# End of Leverage browser caching using mod_expires #
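A quick way to confirm the rules took effect is the same curl check as above (the URL is a placeholder); with mod_expires active, Apache sends both an Expires date and a matching Cache-Control max-age:

curl -I https://www.yoursite.com/image.png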

Leveraging browser cache with CloudFlare enabled

EDIT: The problem turned out to be a conflict between CloudFlare and the .htaccess file. The comments on this post discuss the troubleshooting and resolution of the issue.

I ran this resource through Pingdom's tools to see what the request/response looked like.

https://tools.pingdom.com/#!/d8QPQx/http://www.peppyburro.com/sandboxassets/js/burroinline.js

It is in fact not being cached. The header is set to no-cache.

"no-cache" indicates that the returned response can't be used to
satisfy a subsequent request to the same URL without first checking
with the server if the response has changed. As a result, if a proper
validation token (ETag) is present, no-cache incurs a roundtrip to
validate the cached response, but can eliminate the download if the
resource has not changed.

Source: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching
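To see that validation roundtrip in action, you can send a conditional request with curl (the ETag value below is a made-up placeholder; use the one from a previous response):

curl -I -H 'If-None-Match: "abc123"' http://www.peppyburro.com/sandboxassets/js/burroinline.js

If the resource is unchanged, the server replies 304 Not Modified with no body; otherwise it returns 200 and the full file.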

The response from the server is:

Cache-Control: public, max-age=216000

Because it is JavaScript, it may be advisable to extend this to a week or more. Additionally, the response here is public, while your setting is

Header set Cache-Control "max-age=216000, private"

The max-age matches, but the visibility differs.

"public" vs. "private"

If the response is marked as "public", then it can be cached, even if
it has HTTP authentication associated with it, and even when the
response status code isn't normally cacheable. Most of the time,
"public" isn't necessary, because explicit caching information (like
"max-age") indicates that the response is cacheable anyway.

By contrast, the browser can cache "private" responses. However, these
responses are typically intended for a single user, so an intermediate
cache is not allowed to cache them. For example, a user's browser can
cache an HTML page with private user information, but a CDN can't
cache the page.

Source: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching

I am seeing some Cloudflare (CDN) headers as well, indicating that the resource is not being cached. Typically, private responses are intended for sensitive content. I would first try setting this to public, but only if you are not concerned with sensitive information.

If you are concerned with sensitive information leave this as private.
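As a sketch, the corresponding .htaccess rule might look like this (the filename pattern is an example based on the script above, with one week of caching per the suggestion):

<FilesMatch "burroinline\.js$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>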

While I think this may be the problem, there are several other factors (centered around the CDN) that could also be contributing.

Accept-Ranges:bytes
Access-Control-Allow-Origin:*
Age:0
Cache-Control:public, max-age=216000
CF-Cache-Status:MISS
CF-RAY:338d062cb1035a6e-BOS
Connection:Keep-Alive
Content-Type:application/javascript
Date:Wed, 01 Mar 2017 15:07:08 GMT
Expires:Sat, 04 Mar 2017 03:07:08 GMT
Last-Modified:Wed, 01 Mar 2017 02:18:53 GMT
Proxy-Connection:Keep-Alive
Server:cloudflare-nginx
Vary:Accept-Encoding
Via:1.1 varnish-v4
X-Varnish:18615326

These are the response headers from the server. They include a "MISS" in the CF-Cache-Status (Cloudflare) header. Additionally, here the cache control is also set to public.

Because of this, I think that the intermediate CDN may be causing caching issues.

If you have any additional information to provide (such as CDN/CloudFlare information), I would be happy to take another look.


