Monthly Archives: October 2009

SharePoint webpart install: timer service not started

While installing a webpart for SharePoint today, the installer routine advised that the timer service was not started, which the services list told me was incorrect.

After a bit of messing around, the solution was to run the installer with the Run as Administrator option.

But the program list did not offer the right-click option, so I opened a command prompt with Run as Administrator and ran the installer from that prompt.

The reference that helped was at CodePlex.

Nagios error: /etc/init.d/nagios: line 60: /usr/local/nagios/libexec/check_nagios: No such file or directory

This error occurred for me because nagios/cgi.cfg contained incorrect settings.

This is on a Debian Lenny platform, and I was not the original installer. I initially thought it was the NRPE settings and spent some time working out why the correct settings in nrpe.cfg were not working.

Apparently cgi.cfg takes precedence over nrpe.cfg; at least it did in my setup!
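
For reference, these are the kind of cgi.cfg entries worth checking. The paths shown are the usual source-install defaults and the exact nagios_check_command arguments vary between Nagios versions, so treat this as an illustration of what to verify rather than a drop-in fix:

    # /usr/local/nagios/etc/cgi.cfg  (typical source-install defaults; adjust to your layout)
    main_config_file=/usr/local/nagios/etc/nagios.cfg
    physical_html_path=/usr/local/nagios/share
    url_html_path=/nagios
    # This is the entry that references check_nagios; make sure the plugin path actually exists
    nagios_check_command=/usr/local/nagios/libexec/check_nagios /usr/local/nagios/var/status.dat 5 '/usr/local/nagios/bin/nagios'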

SharePoint error: local searchserviceinstance status is bad

This error appears in the 12 hive log file in a multi-server farm where the Index and MOSS (WSS) servers are separate.

SearchDataAccessServiceInstance.Synchronize(). local searchserviceinstance status is bad. Object statuses: ssp status Online sspdb Online searchdb Online ssp search service instance Online local search service instance Disabled

It is apparently caused by issues with the SSL certificate used by the MOSS server to connect to the Index server.

http://support.microsoft.com/?id=962928

That Microsoft support article provides a six-step solution.

RSS Aggregation Server Side

This topic will be a work in progress.

I am researching RSS feed tools with the specific purpose of reducing the traffic and hits that staff within an office generate against external RSS feeds.

So far my reading has included:

So what’s the point, other than that I have a limited understanding of RSS?

The point is that on intranet sites, staff are consumers of RSS feeds from external sites. The ability for staff to add their own selection of feeds to their section of the intranet is at the root of this topic, while the actual issues are perhaps less obvious.

The first issue is what happens to the external traffic load when multiple users select the same feed as a source for their intranet page. When it’s one or two users, it’s a case of ‘so what?’, but if 300 staff decide to add http://www.abc.net.au/news/syndicate/topstoriesrss.xml as a direct feed then several things happen:

  • Cost to ABC News for supplying the same data 300 times
  • Cost to the business in downloading (proxy servers, etc., as a peripheral issue; see the rough numbers after this list)
  • Risk that ABC News sees only the business’s external IP address hitting them at 300 x page views per day and potentially bans that IP address from updates.
  • If ABC News is not accessible, then every page with that link will take some time to load while it times out on the RSS feed. (A performance issue that the end user blames on the internal IT department...)
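
To put a rough number on that downloading cost, here is a quick back-of-envelope calculation. Apart from the 300 staff, every figure is an assumption picked purely for illustration:

    # Back-of-envelope only: page views per day and feed size are assumed values.
    staff = 300
    page_views_per_day = 20      # assumption: intranet page loads per person per day
    feed_size_kb = 40            # assumption: size of one RSS payload in KB

    # Every page load fetches the feed directly from ABC News.
    direct_kb = staff * page_views_per_day * feed_size_kb
    print("Direct fetches: %.0f MB/day" % (direct_kb / 1024.0))   # ~234 MB/day

    # A server-side cache refreshed every 15 minutes fetches it only 96 times a day.
    cached_kb = (24 * 60 / 15) * feed_size_kb
    print("Cached fetches: %.1f MB/day" % (cached_kb / 1024.0))   # ~3.8 MB/day

The numbers are made up, but the ratio is the point: a local cache cuts the external traffic by whatever factor the staff count and refresh interval give you.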

The second issue is that by allowing open access to RSS feeds, there is a risk that the business’s editorial control is bypassed, opening up issues of racism, pornography, and the gamut of workplace relations issues that we have to deal with today.

So my goal is to identify a process that provides:

  • RSS feed aggregation
  • Internal caching and refresh of content, handling any broken feeds (a rough sketch of this follows the list)
  • A controlled output of feeds to the internal masses
  • Help with managing the feeds from a user-request perspective
  • Minimal traffic to and from the outside world.
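
To make the caching and broken-feed handling concrete, here is a rough Python sketch of the behaviour I am after. The refresh interval, the timeout and the placeholder feed are assumptions for illustration, not a finished design:

    # Rough sketch: fetch each external feed at most once per refresh interval,
    # serve everyone else the cached copy, and never let a dead feed hang the page.
    import time
    import urllib.error
    import urllib.request

    CACHE_TTL = 15 * 60      # seconds between refreshes of any one feed (assumed)
    FETCH_TIMEOUT = 10       # seconds before giving up on a slow feed (assumed)

    _cache = {}              # url -> (fetched_at, raw_xml_bytes)

    def get_feed(url):
        """Return raw RSS XML for url, hitting the external site at most once per
        CACHE_TTL seconds, and fall back to the last good copy (or an empty
        placeholder feed) if the site is down."""
        now = time.time()
        cached = _cache.get(url)
        if cached and now - cached[0] < CACHE_TTL:
            return cached[1]                      # fresh enough: no external hit
        try:
            with urllib.request.urlopen(url, timeout=FETCH_TIMEOUT) as response:
                xml = response.read()
            _cache[url] = (now, xml)
            return xml
        except (urllib.error.URLError, OSError):
            if cached:
                return cached[1]                  # serve stale rather than fail
            return (b'<rss version="2.0"><channel>'
                    b'<title>Feed temporarily unavailable</title>'
                    b'</channel></rss>')

    if __name__ == "__main__":
        url = "http://www.abc.net.au/news/syndicate/topstoriesrss.xml"
        first = get_feed(url)     # first call goes out to ABC News
        second = get_feed(url)    # second call is answered from the local cache
        print(len(first), len(second))

Putting something like this behind an internal URL would mean the intranet pages only ever talk to the local cache, which also gives a single point for the editorial control mentioned above.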

I have more reading to do!