robodonoghue
Occasional Observer

Checkpoint Question - Pages should contain metadata description

Having reviewed the Crownpeak Community article 'Checkpoints - Pages should contain metadata description', I have two questions. Does the presence of a metadata description have any benefits outside of SEO? And should we exclude pages that have a robots nofollow directive specified on them? In theory, this would significantly reduce the number of errors for this checkpoint on instances with a high number of nofollow pages.
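For context, the exclusion I have in mind could be checked per page from the HTML alone. A minimal sketch (not DQM functionality; the `audit` helper and its return shape are hypothetical) that flags a missing meta description and a robots `nofollow` directive using only Python's standard library:

```python
from html.parser import HTMLParser


class MetaAuditParser(HTMLParser):
    """Collect <meta> name/content pairs while parsing a page."""

    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            if name:
                self.meta[name] = d.get("content", "")


def audit(html):
    """Return (has_description, is_nofollow) for one page's HTML."""
    parser = MetaAuditParser()
    parser.feed(html)
    has_description = bool(parser.meta.get("description", "").strip())
    is_nofollow = "nofollow" in parser.meta.get("robots", "").lower()
    return has_description, is_nofollow


page = '<html><head><meta name="robots" content="noindex, nofollow"></head><body></body></html>'
print(audit(page))  # (False, True): description missing, nofollow present
```

A page like this would currently count as a checkpoint error, even though it is marked nofollow.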

ArisRamos
Crownpeak (Retired)

Metadata descriptions can also provide information when smart-linking to a specific page, but yes, this checkpoint is specifically for SEO.

It is important to consider that the more conditions added to a checkpoint, the longer it takes to analyse the page. The same applies to checks for commented-out code and to adding a generic list of possible language variations to some checks. Normally a DQM scan is set up to identify a specific set of rules, and it can be customised to check or exclude sections based on the client's needs. Some clients like to add these exemptions and others prefer blanket rules.

Also, DQM does not perform multi-page comparisons, so it does not reference robots.txt.

--


Aris Ramos
Head of DQM Product Management, CSPO, CSM

## If I’ve helped, accept this response as a solution so that others can find it more quickly in the future.
## Have thoughts on Crownpeak products? We'd love to hear them. Speak with the Crownpeak Product Team.


Thank you for your response, Aris. Is there any way to configure DQM to scan robots.txt?


If you mean checking the contents of robots.txt: no, it cannot.
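Outside of DQM, robots.txt can be read yourself with Python's standard library and used to pre-filter a URL list before scanning. Note that robots.txt governs crawling through `Disallow` rules; `nofollow` itself lives in meta tags or link attributes, not in robots.txt. A minimal sketch (the example domain, rules, and URL list are illustrative only):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical pre-filter: parse a site's robots.txt rules directly,
# then keep only the URLs a crawler would be allowed to fetch.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

urls = [
    "https://example.com/index.html",
    "https://example.com/private/report.html",
]
crawlable = [u for u in urls if rp.can_fetch("*", u)]
print(crawlable)  # only the index page remains
```

In practice one would call `rp.set_url(...)` and `rp.read()` against the live robots.txt instead of passing the rules inline.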

