Current news...

March 29th: Easter update

The weather seems to have improved, so maybe spring is on its way after all.

Introducing lovelace: Nine years ago, before we went down the managed cluster computing route, all Linux compute servers in Maths were essentially stand-alone systems that anyone could use interactively: logging in via ssh, running programs from the command line and using X forwarding over ssh to run GUI programs such as xmaple and Matlab. The move to batch-mode cluster computing removed this facility, which disadvantaged both users who don't want to learn how to use the cluster and short-term visitors, so this new addition restores the ad hoc compute facilities we used to have. With 4 CPUs, 64 processor cores and 512 GB of memory, there should be enough compute resources for all. More info
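As a quick sketch, interactive use works the way it used to on the stand-alone servers. The full hostname below is an assumption for illustration only; see the "More info" page for the actual address.

```shell
# Log in to lovelace with X11 forwarding enabled (-X).
# NOTE: the hostname here is hypothetical -- check the departmental
# documentation for the real address of the new server.
ssh -X username@lovelace.ma.imperial.ac.uk

# Once logged in, run programs from the command line as before, e.g.
# launch the Matlab GUI, displayed back on your local screen:
matlab &
```

If GUI programs fail to open a display, `-Y` (trusted X11 forwarding) is a common alternative to `-X`, at some cost in X security isolation.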

SCAN running full-time in Huxley 410 over Easter: SCAN will run full-time in Huxley 410 from 8pm on Monday March 26th until 05:50 on Monday April 30th, except for a 7.5-hour maintenance window every Thursday morning, from midnight on Wednesday night until 07:30 on Thursday. This is to allow ICT to carry out routine weekly Windows updates and system maintenance; the systems will reboot back into SCAN mode just after 07:30am on Thursdays.

Only one node remaining in legacy cluster: macomp17 from the legacy cluster has been taken out of service, upgraded and moved across campus to join the NextGen cluster, leaving just one compute node, macomp14, in place while an existing user job completes. The NextGen cluster is running well and, as spring approaches, usage is expected to increase.

A dedicated MySQL server has been introduced to meet the demand for a higher-throughput server for projects co-hosted between Maths and other departments. This replaces the MySQL service provided by macomp00, which was originally added to that server as an afterthought to meet a project requirement from Bioinformatics and has since grown far beyond the original expectations. Fittingly, the new server is called and the existing macomp00 facility will remain in place for the foreseeable future while users with legacy databases on it decide what to do with them.

Other news: a number of projects for several research groups have just been started - more info to follow when there is more to report.

Older news items:

March 24th: spring update
March 10th: late winter update
December 15th: pre-Christmas update
November 22nd: late November update
October 8th: start of 2017/2018 academic year update
2017: Midsummer's Day update
June 16th, 2017: mid-June update
June 2nd, 2017: Early summer update
April 20th, 2017: Spring update 2
March 22nd, 2017: Early spring update
March 10th, 2017: Winter update 2
February 22nd, 2017: Winter update
November 2nd, 2016: Autumn update
October 21st, 2016: Late summer update 2
October 14th, 2016: Late summer update
February 19th, 2016: Winter update
December 11th, 2015: Autumn update
September 14th, 2015: Late summer update 2
May 2nd, 2015: Spring update 2
April 26th, 2015: Spring update
November 11th, 2014: Autumn update
September 17th, 2014: Summer update 2
July 17th, 2014: Summer update
March 15th, 2014: Spring update
November 2nd, 2013: Summer update
May 24th, 2013: Spring update
January 23rd, 2013: Happy New Year!
November 22nd, 2012: No news is good news...
November 17th, 2011: A revamp for the Maths SSH gateways
September 7th, 2011: Failed systems under repair
August 14th, 2011: Introducing calculus, a new NFS home directory server for research users
July 19th, 2011: a new staging server for the compute cluster
July 19th, 2011: A new Matlab queue and improved queue documentation
June 30th, 2011: Updated laptop backup scripts
June 18th, 2011: More storage for the silo...
June 16th, 2011: Yet more storage for the SCAN...
June 10th, 2011: 3 new nodes added to the Maths compute cluster
May 26th, 2011: Reporting missing scratch disk on macomp01
May 21st, 2011: Announcing SCAN large storage and subversion (SVN) servers
May 21st, 2011: Announcing completion of silo upgrades
May 16th, 2011: Announcing upgrades for silo
April 14th, 2011: Goodbye SCAN 3, hello SCAN 4
March 26th, 2011: quickstart guide to using the Torque/Maui cluster job queueing system
March 9th, 2011: automatic laptop backup/sync service, new collaboration systems launched
May 20th, 2010: Scratch disks are now available on all macomp and mablad compute cluster systems
March 11th, 2010: Introducing job queueing on the Fünf Gruppe compute cluster
October 16th, 2008: Introducing the Fünf Gruppe compute cluster
June 18th, 2008: German City compute farm now expanded to 22 machines
February 7th, 2008: new applications on the Linux apps server, unclutter your desktop
November 13th, 2007: aragon and cathedral now general access computers, networked Linux Matlab installation upgraded to R2007a
September 14th, 2007: Problems with sending outgoing mail for UNIX & Linux users
July 23rd, 2007: SCAN available full-time over the summer vacation, closure of Imperial's Usenet news server
May 15th, 2007: Temporary SCAN suspension, closure of the Maths Physics computer room, new research computing facilities
January 14th, 2005: Exchange mail server upgrade, spam filtering with pine and various other enhancements

Andy Thomas

Research Computing Manager,
Department of Mathematics

last updated: 29.03.2018