Details
Type: Improvement
Resolution: Fixed
Priority: Neutral
None
Environment: Windows 7
Yes
Sprint 7 (Kromeriz)
2
Description
The current implementation of magnolia-solr-search-provider does not check the value of a page's "robots" meta tag. As a result, every crawled page is indexed, even when the robots meta tag is set to "noindex". The root cause is on the crawler4j side, which does not respect this flag; the issue has been reported on their issue tracker (https://code.google.com/p/crawler4j/issues/detail?id=59) since 2011 and is still open. A possible option is to modify MgnlCrawler's visit(Page p) method to check the flag value in the parsed content and skip indexing the page in Solr when "noindex" is present. A possible implementation is attached (lines 108-112); a sketch of the idea is shown below.
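The attached patch is not reproduced here; the following is only a minimal sketch of the proposed check, assuming a crawler4j WebCrawler subclass named MgnlCrawler, a crawler4j version where HtmlParseData exposes getHtml(), and a simple regex inspection of the raw HTML. The hasNoindex helper, the regex, and the logger are illustrative assumptions, not the attached code; the actual Solr indexing call is elided. If the robots meta tag contains "noindex", visit(Page) returns before the page is handed to Solr.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

import edu.uci.ics.crawler4j.crawler.Page;
import edu.uci.ics.crawler4j.crawler.WebCrawler;
import edu.uci.ics.crawler4j.parser.HtmlParseData;

public class MgnlCrawler extends WebCrawler {

    private static final Logger log = LoggerFactory.getLogger(MgnlCrawler.class);

    // Finds the <meta ... name="robots" ...> tag; the whole matched tag
    // (including its content attribute) is then checked for "noindex".
    private static final Pattern ROBOTS_META = Pattern.compile(
            "<meta\\s+[^>]*name\\s*=\\s*[\"']robots[\"'][^>]*>", Pattern.CASE_INSENSITIVE);

    @Override
    public void visit(Page page) {
        if (page.getParseData() instanceof HtmlParseData) {
            HtmlParseData htmlData = (HtmlParseData) page.getParseData();
            if (hasNoindex(htmlData.getHtml())) {
                // The page explicitly opts out of indexing, so do not send it to Solr.
                log.info("Skipping {}: robots meta tag contains 'noindex'", page.getWebURL().getURL());
                return;
            }
            // ...existing behaviour: build the Solr document from htmlData and index it...
        }
    }

    // Returns true if the page's robots meta tag contains the "noindex" directive.
    // A regex keeps the sketch self-contained; a proper HTML parser could be used instead.
    static boolean hasNoindex(String html) {
        Matcher matcher = ROBOTS_META.matcher(html);
        return matcher.find() && matcher.group().toLowerCase().contains("noindex");
    }
}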
Checklists
Acceptance criteria