[Info-vax] Where to locate software
Kerry Main
kerry.main at backtothefutureit.com
Sat Jun 11 22:07:44 EDT 2016
> -----Original Message-----
> From: Info-vax [mailto:info-vax-bounces at info-vax.com] On Behalf Of
> Stephen Hoffman via Info-vax
> Sent: 11-Jun-16 3:02 PM
> To: info-vax at info-vax.com
> Cc: Stephen Hoffman <seaohveh at hoffmanlabs.invalid>
> Subject: Re: [New Info-vax] Where to locate software
>
> On 2016-06-11 12:45:24 +0000, Kerry Main said:
>
> > Well, as I recall, AltaVista and NorthernLight used to be pretty
> > impressive search engines.
>
> "Alta Vista is a very large project, requiring the cooperation of at
> least 5 servers, configured for searching huge indices and handling a
> huge Internet traffic load."
>
> That comprised a pair of AlphaStation 250 boxes at 266 MHz, an
> AlphaStation 400 at 233 MHz, a DEC 3000 model 900 for spider, and a 300
> MHz AlphaServer 8400 Turbolaser box, with an aggregate of 130
> gigabytes of RAM and a half-terabyte of storage. The computers were
> all running Unix.
>
AltaVista on the web grew much larger than what you have indicated.
Reference:
https://news.ycombinator.com/item?id=10407678
("Hardware requirements for running AltaVista search engine in 1996"):
" By 1998, there were way more than five back end servers. I don't
remember exactly how many 8400s we had -- 20? 40? Something in
that range. They'd gone up to 12 GB of RAM apiece, which came on
boards the size of a fairly large baking sheet. The servers were as big
as a refrigerator. The primary datacenter was a floor above PAIX in
the middle of downtown Palo Alto. Pricy server space."
> The web has grown in the ensuing ~twenty years, with attendant
> increases in the user query load, the activity of the crawlers, and
> coping with the effects of SEO, of course.
>
Of course, no one disputes this.
> Having a fast search engine integrated into OpenVMS — just for local
> content — would be handy. We're way past when having a half-terabyte
> of storage was notable, after all.
>
Agreed, that might be useful, but it would be a "nice to have," not a
critical item.
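To make the "search local content" idea concrete, here is a minimal
sketch in Python (purely illustrative; the directory name and the word
splitting are my own assumptions, not any existing OpenVMS facility) of
the kind of inverted-index lookup such a feature implies:

import os
import re
from collections import defaultdict

def build_index(root):
    """Map each lowercased word to the set of files containing it."""
    index = defaultdict(set)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for word in re.findall(r"[a-z0-9_]+", f.read().lower()):
                        index[word].add(path)
            except OSError:
                pass  # unreadable file; skip it
    return index

def search(index, query):
    """Return the files that contain every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    hits = set(index.get(words[0], set()))
    for word in words[1:]:
        hits &= index.get(word, set())
    return hits

idx = build_index("local_content")   # hypothetical directory of text files
for hit in sorted(search(idx, "cluster patch")):
    print(hit)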
> Being able to maintain OpenVMS and to better structure applications for
> app stacking, now that's also interesting. Those five boxes — were
> they running OpenVMS, and not Unix — would involve maintaining
> stability- and security-related patches and application code to
> current, and promptly rolling out TLS patches and other updates for
> their systems. Preferably also with the applications designed for
> easy deployment. Which is still an entirely locally-implemented
> and/or documented and/or entirely manual process, even twenty years
> on.
>
Northern Light was a popular public search engine that ran on OpenVMS,
and in the very early 2000s it was on par with Google.
Large environments are all custom solutions. The customers that run
stock exchanges, lotteries, banks, etc. on OpenVMS all have custom
configurations tailored to their requirements.
They typically use standardized "gold" images that can be rolled out
rapidly, in an hour or less if required. They also typically use
clusters to keep their mission-critical application services available
while servers are rebooted for OS patching and OS upgrades.
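As a rough illustration of that rolling-reboot pattern, here is a
sketch in Python (the node names, per-node commands, and health check
are hypothetical placeholders, not any particular site's tooling):

import subprocess
import time

NODES = ["node1", "node2", "node3"]   # hypothetical cluster members

def run(node, command):
    """Run a command on a node over ssh and fail loudly if it errors."""
    subprocess.run(["ssh", node, command], check=True)

def node_healthy(node):
    """Treat a successful no-op ssh as 'node is back and answering'."""
    return subprocess.run(["ssh", node, "exit 0"]).returncode == 0

def rolling_patch(nodes):
    for node in nodes:
        run(node, "drain_services")        # hypothetical: shift work to the rest of the cluster
        run(node, "apply_os_patches")      # hypothetical patch/install step
        subprocess.run(["ssh", node, "reboot"])  # the ssh session may drop here; expected
        time.sleep(60)                     # give the node time to go down and come back
        while not node_healthy(node):
            time.sleep(15)
        run(node, "restore_services")      # hypothetical: put the node back in rotation

rolling_patch(NODES)

The point of the pattern is simply that only one cluster member is out
of service at a time, so the application keeps running throughout.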
In addition, these environments are all supported by experienced
staff.
Of course, the same is true for Linux, Windows, and every other
mission-critical solution out there today. These are not environments
where the SysAdmin is reading "how to install OpenVMS" docs.
I am not saying there is no room for improvement; there certainly is.
But let's remember that mission-critical shops already have ways of
addressing anything that impacts their ability to meet their SLAs
today.
Regards,
Kerry Main
Kerry dot main at starkgaming dot com