This topic will be a work in progress.
I am researching RSS feed tools with the specific purpose of reducing the traffic and hits on external RSS feeders by staff within an office.
So far my reading has covered a variety of tools and articles, which I'll list here as this post develops.
So what's the point, other than that I have a limited understanding of RSS?
The point is that on intranet sites, staff are consumers of RSS feeds from external sites. The ability for staff to add their own selection of feeds to their section of the intranet is at the root of this topic, but the real issues are perhaps less obvious.
The first issue is what happens to the external traffic load when multiple users select the same feed as a source for their intranet page. With one or two users it's a case of 'so what?', but if 300 staff decide to add http://www.abc.net.au/news/syndicate/topstoriesrss.xml as a direct feed, then several things happen:
- Cost to ABC News for supplying the same data 300 times
- Cost to the business in downloading (proxy servers, etc, as a peripheral issue)
- Risk that ABC News sees only the business's external IP address hitting them 300 x page views per day and potentially bans that IP address from updates.
- If ABC News is not accessible, every page with that link will take some time to load while the RSS request times out. (A performance issue that the end user blames on the internal IT department…)
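That last point is worth sketching: a page should never hang on a dead feed. A minimal approach is to fetch with a short timeout and fall back to the last good copy on any failure. This is a hypothetical sketch, not a real product's behaviour; the `cache` dict stands in for whatever persistent store the eventual tool would use.

```python
import socket
import urllib.error
import urllib.request

def fetch_feed(url, cache, timeout=2.0):
    """Fetch an RSS feed, falling back to the last cached copy on failure.

    `cache` is a plain dict mapping URL -> last good response body; in a
    real deployment this would be persistent storage (assumed here).
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read()
        cache[url] = body  # remember the last good copy
        return body
    except (urllib.error.URLError, socket.timeout):
        # Broken or slow feed: serve stale content (or nothing) instead
        # of blocking every intranet page that embeds it.
        return cache.get(url, b"")
```

With this in place, an unreachable ABC News costs each page at most `timeout` seconds once, rather than a timeout per user per view.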
The second issue is that allowing open access to RSS feeds risks bypassing the business's editorial control, opening up issues of racism, pornography, and the whole gamut of workplace-relations issues we have to deal with today.
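The simplest form of that editorial control is a whitelist: staff can only subscribe to feeds an administrator has already approved, and anything else goes into a review queue. The set name and request flow below are my own illustration, not part of any specific tool.

```python
# Hypothetical sketch: feed-level editorial control via a whitelist.
APPROVED_FEEDS = {
    "http://www.abc.net.au/news/syndicate/topstoriesrss.xml",
}

def subscribe(user, feed_url, pending_requests):
    """Allow the subscription if approved, otherwise queue it for review."""
    if feed_url in APPROVED_FEEDS:
        return "subscribed"
    # Unknown feed: hold it for an administrator rather than refusing
    # outright, so staff can still request new sources.
    pending_requests.append((user, feed_url))
    return "pending approval"
```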
So my goal is to identify a process that provides:
- RSS feed aggregation
- Internal caching and refresh of content (including handling of broken feeds)
- A controlled output of feeds to the internal masses
- Management of feeds from a user-request perspective
- Minimal traffic to and from the outside world
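The caching and traffic-minimisation points above boil down to one idea: put a single internal cache between staff pages and the external source, so that no matter how many users embed a feed, the outside site is polled at most once per refresh interval. A minimal sketch, with the fetch function injected and the 15-minute TTL an assumed policy choice:

```python
import time

class FeedCache:
    """One internal cache shared by all staff pages: the external site
    is polled at most once per `ttl` seconds, regardless of how many
    users embed the feed."""

    def __init__(self, fetch, ttl=900):
        self._fetch = fetch   # callable: url -> feed body
        self._ttl = ttl       # refresh interval in seconds (assumption)
        self._store = {}      # url -> (fetched_at, body)

    def get(self, url):
        now = time.monotonic()
        entry = self._store.get(url)
        if entry and now - entry[0] < self._ttl:
            return entry[1]   # fresh enough: no external traffic at all
        body = self._fetch(url)
        self._store[url] = (now, body)
        return body
```

Under this scheme, 300 staff embedding the ABC News feed generate one external request per TTL window instead of 300 per page view.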
I have more reading to do!