Current news...


June 23rd: more servers in the server room, Bazooka Hadoop now available for use

More servers

As part of the reorganisation of the London Mathematical Society's (LMS) web presence, two more servers have been installed in the server room to take over parts of the website currently hosted on existing servers. In addition, a GPU server fitted with four NVIDIA GPU cards has been installed for a small research group, and another server has been set up for an experimental re-implementation of the acoustics section of the SAFE project.

Bazooka Hadoop cluster now back up

Fresh from a complete rebuild, the Bazooka Hadoop cluster in the Stats section is back up and running the latest release, version 6.1, of the MapR Converged Data Platform. At the same time, an additional node called athena was added to the cluster; with 64 processor cores and 512 GB of memory, athena is primarily intended to meet the needs of teaching courses but will be available to research users when not being used for teaching.
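
For the curious, cluster membership and per-node resources can be checked from any node using the standard MapR command-line tooling. A minimal sketch in Python follows; the fields requested via -columns are an assumption about what is of interest, and exact field names can vary between MapR releases:

    import subprocess

    # List the cluster's nodes with basic hardware details using the
    # MapR CLI. 'maprcli node list' is standard MapR tooling; the fields
    # requested below (hostname, CPU count, total memory) are assumptions
    # about what is wanted here.
    result = subprocess.run(
        ["maprcli", "node", "list", "-columns", "hostname,cpus,mtotal"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)  # athena should be listed with 64 CPUs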

Attempts to upgrade the previous MapR version 5.2 installation in place were only partially successful, owing to various changes introduced between the two versions. In the end the decision was taken to remove the old installation completely, along with all the HDFS user data, install a fresh MapR 6.1 and then restore around 20 TB of user data, a process which took several days to complete.
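
As a rough illustration of the restore step, bulk copies into a Hadoop filesystem are commonly done with hadoop distcp, which runs the copy as a distributed job. A minimal sketch, assuming the backed-up data sat on a staging filesystem (the /backup/bazooka path is hypothetical):

    import subprocess

    # Hypothetical source (backup staging area) and destination (the
    # rebuilt cluster's filesystem, via MapR's HDFS-compatible maprfs).
    SRC = "file:///backup/bazooka/user"
    DST = "maprfs:///user"

    # -p with no argument asks distcp to preserve file attributes such
    # as ownership, permissions and timestamps during the copy.
    subprocess.run(["hadoop", "distcp", "-p", SRC, DST], check=True)

Even in the best case such a restore is bounded by storage and network throughput: at a sustained 100 MB/s, 20 TB works out at roughly 20e12 / 1e8 seconds, a little over two days, so a restore taking several days is in line with expectations.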

During this work, the small Churchill test cluster was used to test various options and configurations before they were applied to the Bazooka cluster, and the old Mortar cluster was also brought out of retirement to help. There are now three Hadoop clusters in total, with the two small Churchill and Mortar clusters available for experimental projects.

Older news items:

May 29th: R Shiny server memory replaced & remote management added
April 12th: NextGen cluster Maple 2019 upgrade completed
March 16th: Planned Bazooka Hadoop cluster upgrade, reorganisation of backup servers
February 19th: ma-offsite2 now online
January 19th: Matlab 2018b upgrade ongoing
December 14th: Matlab upgrade to version R2018b started, Stats section compute & storage enhancements completed, silos3 and 4 introduced
September 18th: more local storage for Stats modal server and new PostgreSQL database server launched
August 29th: new 'du' command options, cluster R upgrade and ma-backup3
July 2nd: nvidia3 now has two GPU cards
May 15th: Early summer update
March 29th: Easter update
March 24th: spring update
March 10th: late winter update
December 15th: pre-Christmas update
November 22nd: late November update
October 8th: start of 2017/2018 academic year update
June 24th, 2017: Midsummer's Day update
June 16th, 2017: mid-June update
June 2nd, 2017: Early summer update
April 20th, 2017: Spring update 2
March 22nd, 2017: Early spring update
March 10th, 2017: Winter update 2
February 22nd, 2017: Winter update
November 2nd, 2016: Autumn update
October 21st, 2016: Late summer update 2
October 14th, 2016: Late summer update
February 19th, 2016: Winter update
December 11th, 2015: Autumn update
September 14th, 2015: Late summer update 2
May 2nd, 2015: Spring update 2
April 26th, 2015: Spring update
November 11th, 2014: Autumn update
September 17th, 2014: Summer update 2
July 17th, 2014: Summer update
March 15th, 2014: Spring update
November 2nd, 2013: Summer update
May 24th, 2013: Spring update
January 23rd, 2013: Happy New Year!
November 22nd, 2012: No news is good news...
November 17th, 2011: A revamp for the Maths SSH gateways
September 7th, 2011: Failed systems under repair
August 14th, 2011: Introducing calculus, a new NFS home directory server for research users
July 19th, 2011: a new staging server for the compute cluster
July 19th, 2011: A new Matlab queue and improved queue documentation
June 30th, 2011: Updated laptop backup scripts
June 18th, 2011: More storage for the silo...
June 16th, 2011: Yet more storage for the SCAN...
June 10th, 2011: 3 new nodes added to the Maths compute cluster
May 26th, 2011: Reporting missing scratch disk on macomp01
May 21st, 2011: Announcing SCAN large storage and subversion (SVN) servers
May 21st, 2011: Announcing completion of silo upgrades
May 16th, 2011: Announcing upgrades for silo
April 14th, 2011: Goodbye SCAN 3, hello SCAN 4
March 26th, 2011: quickstart guide to using the Torque/Maui cluster job queueing system
March 9th, 2011: automatic laptop backup/sync service, new collaboration systems launched
May 20th, 2010: Scratch disks are now available on all macomp and mablad compute cluster systems
March 11th, 2010: Introducing job queueing on the Fünf Gruppe compute cluster
October 16th, 2008: Introducing the Fünf Gruppe compute cluster
June 18th, 2008: German City compute farm now expanded to 22 machines
February 7th, 2008: new applications on the Linux apps server, unclutter your desktop
November 13th, 2007: aragon and cathedral now general access computers, networked Linux Matlab installation upgraded to R2007a
September 14th, 2007: Problems with sending outgoing mail for UNIX & Linux users
July 23rd, 2007: SCAN available full-time over the summer vacation, closure of Imperial's Usenet news server
May 15th, 2007: Temporary SCAN suspension, closure of the Maths Physics computer room, new research computing facilities
January 14th, 2005: Exchange mail server upgrade, spam filtering with pine and various other enhancements


Andy Thomas

Research Computing Manager,
Department of Mathematics

last updated: 24.6.2019