
If VIVO's dynamically generated pages do not exhibit acceptable load times, you may wish to enable HTTP caching.  See Use HTTP caching to improve performance.  With this configuration, subsequent requests for pages whose contents have not changed will result in those pages being served directly from a cache instead of being regenerated from data in the triple store.
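As one possible approach, a caching reverse proxy can be placed in front of VIVO. The fragment below is a minimal sketch for Apache httpd's mod_cache with a disk-backed store; the cache path is an assumption, and the directives should be tuned against the guidance in Use HTTP caching to improve performance before production use.

```
# Sketch only: assumes Apache httpd fronts the VIVO servlet container.
LoadModule cache_module modules/mod_cache.so
LoadModule cache_disk_module modules/mod_cache_disk.so

# Cache all responses under the site root on disk.
CacheEnable disk /
# Assumed cache location; adjust for your installation.
CacheRoot /var/cache/apache2/mod_cache_disk
```

Responses are then served from disk until they expire or are invalidated, rather than being rebuilt from the triple store on every request.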

Misbehaving robots

In some cases, poor VIVO performance has been traced to search index robots that either ignore or misread directives in VIVO's robots.txt file, or that issue requests for large pages at a rate greatly exceeding the demand otherwise encountered in typical production use.  If the search index in question is not critical to VIVO's visibility, it may be advisable to restrict access for the associated robots.  In some situations, institutional search appliances are responsible for the excessive server load; here, discussions with local IT staff may be warranted.
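Restrictions of this kind are typically expressed in robots.txt. The entries below are illustrative only: the user-agent name is a placeholder, not a known misbehaving crawler, and Crawl-delay is a non-standard directive that not all robots honor (robots that ignore robots.txt entirely must instead be blocked at the web server or firewall).

```
# Illustrative robots.txt entries; "ExampleBot" is a hypothetical name.

# Exclude a robot that is not critical to VIVO's visibility.
User-agent: ExampleBot
Disallow: /

# Ask well-behaved robots to slow their request rate (non-standard,
# honored by some crawlers only).
User-agent: *
Crawl-delay: 10
```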

Additional discussion

Work in progress at https://docs.google.com/a/symplectic.co.uk/document/d/1ylp9HEzJiBsBP6vx1vd-Irf8o3Ff-5vDhytOVTI5_Ho/edit#heading=h.vdtjwwvnjdn7