Angular + SEO
There’s a phase in every project when development reaches completion and the next step is to make it search engine friendly.
When this task came up in my project, I was fairly confident it would be simple: just set meta tags and a few other HTML tags so that Google could crawl the site easily. But after setting these tags with the help of Angular, we noticed that the crawled pages were not parsed/rendered and showed raw Angular expressions ({{}}) in place of the meta tag values.
Crawlers (or bots) are designed to read the HTML content of web pages, but AJAX operations that fetch data asynchronously are a problem for them, because it takes some time to render the page and show its dynamic content. AngularJS uses the same asynchronous model, which creates a problem for Google’s crawlers.
Google looks for #! in your site’s URLs, takes everything after the #!, and passes it in an _escaped_fragment_ query parameter; for example, example.com/#!/about is crawled as example.com/?_escaped_fragment_=/about. Some developers create basic HTML pages with the real data and serve them from the server side at crawl time. So we thought: why not render the same pages with PhantomJS on the server side whenever a request carries _escaped_fragment_?
Here are the steps we followed in our application to make it crawlable.
Client Side:
1. Add the fragment meta tag to the head of your page.
[js]
<meta name="fragment" content="!">
[/js]
This tells search engine crawlers that the page has dynamic JavaScript content that needs to be rendered before crawling.
2. If you use hash URLs (#), change them to hash-bang (#!) by adding
[js]
$locationProvider.hashPrefix('!');
[/js]
in your app.js. You would not see any difference if you are using html5Mode.
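For reference, here’s a minimal sketch of that config block (assuming your module is named myApp, as in the snippets below):
[js]
(function (angular) {
    angular.module('myApp').config(['$locationProvider', function ($locationProvider) {
        // Use #! instead of plain # so Google recognizes the page as AJAX content
        $locationProvider.hashPrefix('!');
    }]);
})(angular);
[/js]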
3. As each page may have different SEO data, you can create a service or a util method to set it. I prefer to store the SEO data object on $rootScope so that it can be set from any controller and bound easily in the index.html page (a binding sketch follows the controller example below).
Util Service:
[js]
(function (window, angular) {
    angular.module('myApp').factory('UtilService', ['$rootScope', function ($rootScope) {
        // Store the current page's SEO data on $rootScope so any
        // controller can set it and index.html can bind to it
        var seoInformation = function (seoData) {
            $rootScope.seoInformation = {};
            if (seoData) {
                $rootScope.seoInformation.pageTitle = seoData.pageTitle;
                $rootScope.seoInformation.metaKeywords = seoData.metaKeywords;
                $rootScope.seoInformation.metaDescription = seoData.metaDescription;
            }
        };
        return {
            seoInformation: seoInformation
        };
    }]);
})(window, angular);
[/js]
My Controller:
[js]
(function (angular) {
    angular.module('myApp').controller('MyCtrl', ['$state', 'UtilService', function ($state, UtilService) {
        var _that = this;
        var _setSEOInformation = function () {
            var seoInformation = {
                pageTitle: 'My title',
                metaKeywords: 'My meta keywords',
                metaDescription: 'My meta Description'
            };
            UtilService.seoInformation(seoInformation);
        };
        _that.init = function () {
            // Set SEO information for this page
            _setSEOInformation();
        };
    }]);
})(angular);
[/js]
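With the controller above setting the object on $rootScope, the bindings in index.html could look something like this sketch (ng-bind is used for the title so an unrendered {{}} never appears in it):
[js]
<title ng-bind="seoInformation.pageTitle">Default title</title>
<meta name="keywords" content="{{seoInformation.metaKeywords}}">
<meta name="description" content="{{seoInformation.metaDescription}}">
[/js]
The {{}} expressions in the meta tags are fine here, because PhantomJS evaluates them on the server side before the crawler ever sees the HTML.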
Server Side:
Because we serve our Angular app from a Node server, we use the node-phantom module and a small middleware to serve crawled pages.
1. Require the node-phantom module in your file.
[js]
var phantom = require('node-phantom');
[/js]
2. Write a middleware that intercepts crawler requests and serves them accordingly.
[js]
app.use(function (request, response, next) {
    var pageUrl = request.query["_escaped_fragment_"];
    if (pageUrl !== undefined) {
        // Crawler request: render the real page with PhantomJS
        phantom.create(function (err, ph) {
            return ph.createPage(function (err, page) {
                // Rebuild the original hash-bang URL so the Angular
                // router resolves the route inside PhantomJS
                var fullUrl = request.protocol + '://' + request.get('host') + '/#!' + pageUrl;
                return page.open(fullUrl, function (err, status) {
                    page.get('content', function (err, html) {
                        response.statusCode = 200;
                        response.end(html);
                        ph.exit();
                    });
                });
            });
        });
    } else {
        // Normal browser request: fall through to the Angular app
        next();
    }
});
[/js]
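One thing to watch: this middleware must be registered before your static file handler, otherwise index.html is served before the _escaped_fragment_ check ever runs. A rough sketch of the wiring (assuming the Angular app lives in a public/ directory):
[js]
var express = require('express');
var phantom = require('node-phantom');

var app = express();

// 1. The crawler middleware from above goes first
app.use(function (request, response, next) {
    // ... _escaped_fragment_ handling as shown above ...
    next();
});

// 2. Then the Angular app's static assets
app.use(express.static(__dirname + '/public'));

app.listen(3000);
[/js]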
In this way we can make our web pages crawlable in AngularJS. Hope this helps!
A simpler approach would be to set up a Prerender server, which uses PhantomJS to render the page completely, and redirect all traffic from the Google crawler to this Prerender server. That way you don’t need to make any changes in your code, and the Google crawler can still index your pages.
github.com/prerender/prerender
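If you go this route, the same project also provides an Express middleware, prerender-node, that detects crawler requests and forwards them to the Prerender server for you; basic usage is a one-liner:
[js]
app.use(require('prerender-node'));
[/js]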