March 2nd: another backup server added
- ma-backup4 was added today and is already mirroring data from one of the three compute servers used by the joint Maths/Physics research group. It joins the three existing on-site backup servers, ma-backup1 to ma-backup3 inclusive, which are now getting very full. Using the same architecture as the existing servers, ma-backup4 has 32 TB of installed disk, providing 24 TB of usable resilient storage. The offsite server in Slough, ma-offsite2, which also hosts ma-offsite3, will be expanded to create ma-offsite4.
- These four backup servers back up user data from all of the various compute servers; most of these have fast local disk storage in addition to networked storage, and that local data needs to be backed up regularly.
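The kind of incremental sync a backup server performs can be sketched as follows. This is a minimal illustration only, not the actual ma-backup configuration (which is not described here); the function name and paths are hypothetical, and a real deployment would use a dedicated tool such as rsync.

```python
"""Minimal sketch of an incremental file sync, as a backup server might run
nightly against a compute server's local disk. Illustrative only."""
import filecmp
import shutil
from pathlib import Path

def sync_tree(src: Path, dst: Path) -> list[str]:
    """Copy files from src into dst, skipping files that are already
    identical at the destination. Returns the relative paths copied."""
    copied = []
    for f in src.rglob("*"):
        if f.is_file():
            rel = f.relative_to(src)
            target = dst / rel
            # Only copy if the file is new or its content has changed.
            if not target.exists() or not filecmp.cmp(f, target, shallow=False):
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)  # copy2 preserves timestamps
                copied.append(str(rel))
    return copied
```

Because unchanged files are skipped, repeated runs only transfer new or modified data, which is what keeps regular backups of large local disks practical.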
February 26th onwards: internal network expansion, phase 2
- There are 110 college network connections in the Maths server room, Huxley 616, with over 170 systems installed and recent requests to host yet more systems that need to be accessible over the college network. Installing more college network sockets in the room is no longer an option, owing to both the disruption it would cause and the cost: there are no spare ports left on the existing network switches in the level 6 comms cupboard, so at least one new switch, plus a rack to contain it, would be required.
- A dormant 'background project', begun about two years ago, to move systems that did not really need college network connections onto dedicated internal networks (freeing up about 10 college network connections at the time) has restarted in earnest. We are now moving more systems to internal networks, which involves installing additional network switches and cabling and, where required, fitting extra network interfaces into systems that do not have enough network ports.
- Work is ongoing, but since February 25th four college network connections have been released for use by the imminent ma-backup4, the new GPU server and a proposed private storage server.
Coming soon: a new GPU server
- A new GPU server has been delivered which will greatly improve our GPU computing facilities. It is fitted with dual 16-core Xeon CPUs, 1.5 terabytes of memory, 14 hard disks providing a total usable local storage capacity of ~30 terabytes, and eight nVidia RTX 2080 Ti GPUs, each containing 4352 GPU cores and 11 GB of memory. With support for nVidia's cuDNN (Deep Neural Network) libraries, this new addition is expected to become very popular.
- We are now awaiting completion of electrical power feeds in Huxley 616, the installation of another 8 kVA UPS and the delivery of a rack before the new GPU server can be installed.
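For a sense of scale, the per-GPU figures quoted above aggregate as follows (simple arithmetic on the announced numbers; nothing here beyond what the announcement states):

```python
# Aggregate GPU resources on the new server, from the announced figures.
num_gpus = 8
cores_per_gpu = 4352   # GPU cores per RTX 2080 Ti
mem_per_gpu_gb = 11    # GB of memory per GPU

total_cores = num_gpus * cores_per_gpu   # 34,816 GPU cores in total
total_mem_gb = num_gpus * mem_per_gpu_gb # 88 GB of GPU memory in total
print(total_cores, total_mem_gb)
```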
Older news items:
- February 6th: more large compute servers for Stats
- November 1st: NextGen is shutting down on November 1st in readiness for relocation
- September 29th: matlab2018 queue has now been discontinued on the NextGen cluster
- August 24th: matlab2018 queue to be discontinued on the NextGen cluster
- August 14th: more servers for Stats, remote monitoring upgrades, better system status reporting and more power supplies
- June 23rd: more servers in the server room, expanded Bazooka Hadoop cluster now available for use
- May 29th: R Shiny server memory replaced & remote management added
- April 12th: NextGen cluster Maple 2019 upgrade completed
- March 16th: Planned Bazooka Hadoop cluster upgrade, reorganisation of backup servers
- February 19th: ma-offsite2 now online
- January 19th: Matlab 2018b upgrade ongoing
- December 14th: Matlab upgrade to version R2018b started, Stats section compute & storage enhancements completed, silos3 and 4 introduced
- September 18th: more local storage for Stats modal server and new PostgreSQL database server launched
- August 29th: new 'du' command options, cluster R upgrade and ma-backup3
- July 2nd: nvidia3 now has two GPU cards
- May 15th: Early summer update
- March 29th: Easter update
- March 24th: spring update
- March 10th: late winter update
- December 15th: pre-Christmas update
- November 22nd: late November update
- October 8th: start of 2017/2018 academic year update
- 2017: Midsummer's Day update
- June 16th, 2017: mid-June update
- June 2nd, 2017: Early summer update
- April 20th, 2017: Spring update 2
- March 22nd, 2017: Early spring update
- March 10th, 2017: Winter update 2
- February 22nd, 2017: Winter update
- November 2nd, 2016: Autumn update
- October 21st, 2016: Late summer update 2
- October 14th, 2016: Late summer update
- February 19th, 2016: Winter update
- December 11th, 2015: Autumn update
- September 14th, 2015: Late summer update 2
- May 2nd, 2015: Spring update 2
- April 26th, 2015: Spring update
- November 11th, 2014: Autumn update
- September 17th, 2014: Summer update 2
- July 17th, 2014: Summer update
- March 15th, 2014: Spring update
- November 2nd, 2013: Summer update
- May 24th, 2013: Spring update
- January 23rd, 2013: Happy New Year!
- November 22nd, 2012: No news is good news...
- November 17th, 2011: A revamp for the Maths SSH gateways
- September 7th, 2011: Failed systems under repair
- August 14th, 2011: Introducing calculus, a new NFS home directory server for research users
- July 19th, 2011: a new staging server for the compute cluster
- July 19th, 2011: A new Matlab queue and improved queue documentation
- June 30th, 2011: Updated laptop backup scripts
- June 18th, 2011: More storage for the silo...
- June 16th, 2011: Yet more storage for the SCAN...
- June 10th, 2011: 3 new nodes added to the Maths compute cluster
- May 21st, 2011: Announcing SCAN large storage and subversion (SVN) servers
- May 26th, 2011: Reporting missing scratch disk on macomp01
- May 21st, 2011: Announcing completion of silo upgrades
- May 16th, 2011: Announcing upgrades for silo
- April 14th, 2011: Goodbye SCAN 3, hello SCAN 4
- March 26th, 2011: quickstart guide to using the Torque/Maui cluster job queueing system
- March 9th, 2011: automatic laptop backup/sync service, new collaboration systems launched
- May 20th, 2010: Scratch disks are now available on all macomp and mablad compute cluster systems
- March 11th, 2010: Introducing job queueing on the Fünf Gruppe compute cluster
- October 16th, 2008: Introducing the Fünf Gruppe compute cluster
- June 18th, 2008: German City compute farm now expanded to 22 machines
- February 7th, 2008: new applications on the Linux apps server, unclutter your desktop
- November 13th, 2007: aragon and cathedral now general access computers, networked Linux Matlab installation upgraded to R2007a
- September 14th, 2007: Problems with sending outgoing mail for UNIX & Linux users
- July 23rd, 2007: SCAN available full-time over the summer vacation, closure of Imperial's Usenet news server
- May 15th, 2007: Temporary SCAN suspension, closure of the Maths Physics computer room, new research computing facilities
- January 14th, 2005: Exchange mail server upgrade, spam filtering with pine and various other enhancements
Research Computing Manager,
Department of Mathematics
last updated: 2.03.2020