University Data Center
Last Updated: November 27, 2012 @ 2:33 pm
Next Review Date: 01/01/2014
Service Manager: Lisa G Wright
Governance Group: IT Architecture & Infrastructure
Document Status: Published
- Co-location service availability: 24 hours a day, 365 days a year (excepting defined maintenance windows and events outside the control of ITS)
- Data Center availability: 99.98%
- Network availability at present: 99.9%
- Network availability (after the LACP bonding standard is adopted for the facility as a whole): 99.96%
This document defines the service level agreement for the University Data Center (UDC).
The UDC provides state-of-the-art co-location facilities for campus departments and researchers at the University of Texas at Austin. It also houses systems running enterprise services critical to students, faculty, and staff. This secure facility provides:
- Server co-location space: Raised floor room equipped with server cabinets, network switches, network connections, and power distribution equipment.
- Campus and Internet connectivity.
- IT facilities to support system administrators in setting up and supporting systems.
- Systems monitoring and notification at the customer's request.
- Physical security and restricted access.
- Information security in accordance with university standards.
- Electrical and mechanical infrastructure designed and built to be concurrently maintainable: engineered for zero downtime.
- Management of climate control, fire suppression, and power systems.
- Consultation and assessment for academic and administrative units that want to move equipment into the UDC.
The raised floor space in the UDC provides an environmentally controlled facility for housing servers and related IT equipment. The West Hall raised floor area is completely equipped with:
- Server cabinets.
- Top-of-rack switches for redundant connections to the campus network.
- Power strips and power distribution equipment for redundant connections to power supply.
- Patch cables for connecting servers to top-of-rack switches.
Because the West Hall is fully equipped with cabinets, all IT equipment located in the West Hall must be 19-inch, four-post rack mountable (as outlined in the "Supported Computer Environment" section of this document). UDC staff determine server locations based on power and cooling management and customer business needs.
The East Hall of the UDC will be built out in the future and may have different specifications.
Power and Backup Power
Power is provided from an A and B feed to every server for redundancy. Each server equipped with dual power supplies is connected redundantly to the power source. The redundant power connections allow servers to retain power even during maintenance and unplanned events. Maintenance will be performed on only one side of the electrical system at a time.
Primary power is provided by the University of Texas at Austin power plant, which is backed up by City of Austin power. The UDC is equipped with uninterruptible power supplies (UPSs) that take over in the event of a loss of power while the backup generators come online. Backup generators are connected to the A and B feeds and have at least 24-hour fuel supplies.
Servers that do not have dual power supplies connected redundantly to the power source will go down during scheduled electrical maintenance and unplanned events on the side to which they are connected.
The network equipment connects co-located servers to the campus network, providing off-campus bandwidth and Internet connectivity. Connections are available at two speeds:
- Standard connection: 1 gigabit per second (Gbps) Ethernet (up to four per device; additional connections at extra cost).
- Limited availability: 10 Gbps Ethernet (extra cost).
Network connectivity is provided from A and B sources to every server for redundancy. Each server equipped with dual network interface cards (NICs) is connected redundantly to both the A and B sides of the network via the top of rack switches. The redundant network connection allows properly configured servers to retain network connectivity during most regularly scheduled maintenance and unplanned events. Maintenance will be performed on only one side of the network at a time when possible.
Under the following conditions, servers can go down during scheduled maintenance and unplanned events on the side to which they are connected:
- If they do not have dual NICs connected redundantly to the network.
- If they are redundantly connected but not properly configured for Link Aggregation Control Protocol (LACP) bonding.
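For illustration only, a dual-NIC Linux server might satisfy the bonding requirement with an 802.3ad (LACP) bond along these lines. The interface names, addresses, and the use of netplan are assumptions, not UDC-provided configuration, and the matching port-channel on the A- and B-side top-of-rack switches must be set up by UDC network staff:

```yaml
# /etc/netplan/01-bond.yaml -- hypothetical example, not a UDC-provided file
network:
  version: 2
  ethernets:
    eno1: {}        # NIC cabled to the A-side top-of-rack switch
    eno2: {}        # NIC cabled to the B-side top-of-rack switch
  bonds:
    bond0:
      interfaces: [eno1, eno2]
      parameters:
        mode: 802.3ad              # LACP bonding, as this SLA requires
        lacp-rate: fast
        mii-monitor-interval: 100  # link check interval, in ms
      addresses: [10.0.0.25/24]    # placeholder address
```

With a bond in this mode, the server keeps network connectivity when either the A or B side is taken down for maintenance, which is the behavior the redundancy guarantee above depends on.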
Cooling is provided by a redundant cooling system that maintains the raised floor space at a temperature between 70 and 74 degrees Fahrenheit and relative humidity between 35 and 60 percent.
The UDC is equipped with facilities for system administrators:
- A visitor work area and break room outside of the raised floor space, equipped with workstations, network ports, wireless access, and power outlets.
- A server build room equipped with server cabinets, an A/B network, and a UPS, where staff can install, build, and configure systems prior to their installation in cabinets in the raised floor space. The server build room has only a single power source. Systems are not considered "in production" when they are in the server build room. Systems that are staged in the server build room must be installed in the raised floor space within 14 calendar days from staging. Space in the server build area is subject to availability and should be scheduled in advance with UDC staff.
The data center has a state-of-the-art security system and stringent security processes. Physical security for the UDC is governed by the University Data Center Security Policy.
Intrusion detection and data exfiltration detection services are provided by the Information Security Office (ISO). After-hours and weekend coverage are included. "In computer terminology, Exfiltration refers to the unauthorized release of data from within a computer system. This includes copying the data out through covert network channels or the copying of data to unauthorized media." (Source: http://en.wikipedia.org/wiki/Exfiltration)
Remote Server Access
The UDC has been designed with an "out of band" network (OOBN) that provides system console access to system administrators from remote locations. All servers are required to have a Remote Management port (port types vary by manufacturer, but all should be Ethernet capable).
The OOBN is configured such that all management networks will be in private address space. Access to the OOBN will be provided in general through the campus VPN system. The OOBN network is "best effort," and no production services should attempt to use this network for regular service.
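As a quick sanity check, a system administrator can confirm that an assigned management address actually falls in private address space before cabling a remote management port to the OOBN. This sketch uses Python's standard ipaddress module; the sample addresses are placeholders, not actual UDC assignments:

```python
import ipaddress

def is_private_mgmt_address(addr: str) -> bool:
    """Return True if addr lies in private address space,
    as required for out-of-band management interfaces."""
    return ipaddress.ip_address(addr).is_private

# Hypothetical OOBN assignment in RFC 1918 space:
print(is_private_mgmt_address("10.160.1.25"))   # True
# A publicly routable address would fail the check:
print(is_private_mgmt_address("8.8.8.8"))       # False
```

Because the OOBN is reachable only through the campus VPN, a failed check here usually means the management port was cabled or addressed on the production network by mistake.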
UDC is responsible for creating a Disaster Facilities Plan consisting of a customer list, customer contacts, power and rack requirements, key facility related contacts, and escalation and communication paths related to the equipment in its facilities. Customers will be responsible for developing their own disaster recovery plans for servers and services.
Ownership of Data/Data Management
See the Glossary for a definition of terms in quotes.
Information Technology Services runs the University Data Center (UDC) as an "Information Technology Resources Facility." It houses "Information Technology Resources" and "Information Systems" for colleges, schools and units (customers) within the university. The facility includes a number of "Access Controls" to help ensure the physical and information security of systems.
The UDC staff includes "Network Custodians" to assist in the implementation of network controls within the facility as directed by the owner and data steward. Customers may employ "System Administrators" directly to manage their Information Technology Resources and Systems within the UDC, or customers may contract with ITS to provide System Administrators. In the event that a customer contracts with ITS for System Administration, ITS also assumes a joint "Custodian" role with the customer. The customer maintains responsibility as the "Data Steward" and "Owner" in all cases.
Ownership of Data: All data stored, collected, and maintained on servers or other equipment placed at the UDC on behalf of a college, school, or unit shall remain the property of that college, school, or unit, and Data Steward and Data Owner responsibilities reside with the customer. The customer understands and agrees that any subpoenas, court orders, requests under the Texas Public Information Act, or other third-party demands for access to the customer's data that ITS receives must be directed to the college, school, or unit within one business day of receipt by ITS. ITS is not authorized to and shall not release a college, school, or unit's data, or permit a third party to access it, without the express permission of the college, school, or unit. ITS may not access or use a college, school, or unit's data for any reason not specifically authorized in this agreement without the express permission of the college, school, or unit.
Security of Data: The Data Owner for the college, school or unit is responsible for identifying the category of data stored on Information Technology Resources and Systems and for assigning a level of protection appropriate to that category, in compliance with the Minimum Security Standards for Systems and related policies and standards.
The University Data Center can be used by students, faculty, and staff.
Supported Computing Environment
The UDC maintains minimum standards for equipment that can be accommodated in its facility. These minimum system standards preserve power and cooling requirements for use by all departments and allow the most efficient use of campus resources. UDC server cabinets can accommodate systems that conform to the following specifications:
- Rack Support: 19-inch, four-post rack-mountable equipment
- Rails: Order rails with NO cable management
- Power Supply: Dual 208/220 volt power supplies for redundant power connectivity
- Power Cords: Order C13 (limited C19 capability)
- Dual Network Interface Cards (NICs) specifically for redundant network connectivity
- Remote Management: Ethernet-capable remote server management port, for example, Dell Remote Access Controller (DRAC) or Integrated Lights Out Manager (ILOM)
Systems that are rack mountable and meet Minimum Security Standards for Systems but do not have dual NICs, dual power supplies, or remote management capabilities can be installed with a corresponding effect on redundancy and remote management capabilities (refer to "Power and Backup Power," "Network," and "Remote Server Access" in this document for an explanation of the impacts). These configurations will be discussed on a case-by-case basis and documented. It may also be possible to modify existing systems to add power supplies, NICs and remote management ports.
All ancillary equipment should be proximate to service equipment and ideally will be located within the same rack. Examples of ancillary equipment include:
- Load balancers
- Tape backup systems (internal or external)
- External storage/fibre channel connections
- IP-based KVM (keyboard, video, mouse) switches
- IP-based serial console servers
- Network switches
Ancillary equipment will be discussed on a case-by-case basis and documented.
- Maintenance of the data center chilled water and environmental systems.
- Monitoring and control of climate conditions in the data center.
- Responding to climate-related alerts resulting from variations from climate conditions that exceed system thresholds (for example, excessive temperatures, hot spots, etc.).
Fire Detection and Suppression
- Early warning detection system.
- Individually zoned, double-interlocked pre-action fire suppression system.
- Dry pipe with water suppression in the event of a fire.
- Maintenance of all sensors and alarms.
- Maintenance of the fire suppression system.
- Monitoring of, and response to, all related alerts and alarms.
- Compliance with relevant certification and fire code requirements.
- Maintenance of the uninterruptible power supply system.
- Maintenance of the back-up generators and related systems.
- Connection to the electrical source.
- Grounds and building maintenance.
The university maintains building insurance to cover the cost of facilities and equipment should an insurable event occur, subject to standard deductibles.
All requests for technical support will be logged using the ITS centralized ticketing system.
Tier 1 Support
All requests for technical support will be logged using the ITS centralized ticketing system to enable us to appropriately assign and track the progress of your request. Staff will coordinate with customers to complete all tasks. Customers will be informed by e-mail from the ticketing system when requests have been completed.
- For service requests that do not require immediate attention, send an e-mail message to UDCemail@example.com. This will create a ticket in Footprints and will follow the standard ITS SLA initial response time of 4 hours.
- For any issue requiring immediate attention (less than 4 hours), such as the reboot of a server or other troubleshooting activity, call 471-0007. The operators are on duty 24/7 to support you and will respond within 20 minutes. You may be asked to follow up with an e-mail to UDCfirstname.lastname@example.org. Customers must provide a contact method when they contact the UDC.
|Service||Description||Hours|
|Remote Hands||Typically simple tasks that can be performed in a short timeframe and might otherwise require on-site visits by system administrators.||8 a.m. to 5 p.m. M-F|
|Networking||Any network configuration change or recabling.||8 a.m. to 5 p.m. M-F; after hours: 7 to 10 a.m. first Saturday of the month and 5 to 8 p.m. third Thursday of the month|
|Smart Hands||Involves handling hardware or performing invasive tasks in the back of a cabinet. Invasive tasks require advance notification to other cabinet occupants and must be supported by ITS staff.||7 a.m. to 7 p.m. M-F; 7 to 10 a.m. first Saturday of the month|
To schedule support in an after-hours window, open a ticket by e-mailing UDCemail@example.com at least five days in advance of the window. A window will be closed and staff released if no activities are logged five days before it. Customers may conduct their own maintenance not requiring ITS staff at any time.
Urgent Support After Hours
Urgent requests should be communicated by telephone to 512-471-0007 and followed up with e-mail to UDCfirstname.lastname@example.org. If the UDC Operators cannot resolve the issue, they will escalate to appropriate tier 2 staff.
Monitoring and Notification
Monitoring software in the UDC will monitor publicly available ports on co-located systems as defined by customers. Examples of ports that can be monitored are 80 (HTTP) and 22 (SSH). Customers will be notified in the event of a failed response according to agreed-upon escalation paths.
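The check performed by the monitoring software amounts to a TCP connection attempt against each customer-defined port. A minimal sketch of such a probe follows; the hostnames and the 5-second timeout are assumptions for illustration, not the UDC's actual monitoring configuration:

```python
import socket

def check_port(host: str, port: int, timeout: float = 5.0) -> bool:
    """Attempt a TCP connection to host:port.
    True means the monitored port answered within the timeout;
    False triggers notification along the agreed escalation path."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage against a co-located web server:
# check_port("myserver.utexas.edu", 80)
# check_port("myserver.utexas.edu", 22)
```

A probe like this only confirms that the port accepts connections; customers who need application-level health checks (for example, an HTTP 200 response) should discuss that with UDC staff when defining the monitored ports.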
Recurring maintenance on monitoring software is scheduled for the first and third Thursday of each month, from noon to 2 p.m. If maintenance activities are scheduled to occur, the UDC will notify customers. Support for critical outages in the Zenoss environment is 24x7x365 "best-effort" via a documented escalation path.
Moving to the Data Center
UDC staff will provide a process, consulting, and move assistance for customers who wish to move equipment into the data center. If the equipment exceeds what ITS staff can move with existing vehicles, the customer may be asked to bear the cost of moving it to the facility. ITS can provide recommendations for vendors to perform this work.
For existing customers being relocated to UDC-C from another ITS operated facility, the cost of a move vendor is borne by ITS.
ITS will notify customers about both scheduled and unscheduled maintenance, service availability, and service delivery issues using the ITS Services Status page. Services may not be available during maintenance periods.
To the maximum extent possible, installation of service, application, and security updates will be performed during scheduled maintenance.
|System||Schedule during ramp-up period (Begins at building occupancy)||Schedule in production (September 1, 2011)|
|Network||Follows the Backbone, External Connectivity and Services Maintenance (impact is campus wide) schedule outlined in the Networks for Departments SLA.||Monthly or Quarterly, on only one side of the network at a time. (Estimated)|
|Power and cooling infrastructure||None||Quarterly, on only one side of the infrastructure at a time. (Estimated)|
Unscheduled maintenance tasks that require service downtime will be announced as soon as possible on the ITS Services Status page.
ITS will maintain a mailing list of customer contacts who will be notified of planned maintenance and unplanned events. Customers must notify ITS of any changes to contact information as part of providing escalation path information. Contact lists will be reviewed periodically.
Access to Facilities
Access to facilities is governed by the UDC Security Policy, and access by unauthorized personnel is prohibited.
|Visitor work space||Business hours||Required after hours||No|
|Server build room||Business hours||Required after hours||No|
|Raised floor||24 hours||Always required (see Tier 1 support)||Yes|
Normal service availability is defined in the following table:
|Co-location service||24 hours a day, 365 days a year (excepting defined maintenance windows and events outside the control of ITS)|
|Remote Hands||24 hours a day, 365 days a year|
|Smart Hands||7 a.m. to 7 p.m., Monday - Friday, on demand. Can be scheduled in advance all other hours.|
In addition to the services provided by ITS, subscribers (users) of the service and identified owners/administrators agree to certain important responsibilities. All parties agree to be aware of and adhere to the university's Acceptable Use Policy.
Customers agree to:
- Provide vendor name, model number, and specifications for equipment to be co-located.
- Follow documented communications and ticketing process.
- Order systems that are consistent with the supported system standards defined in the Supported Computing Environment section. When considering systems that may not conform to the standard, customer agrees to consult with ITS prior to purchasing.
- Properly configure systems to use the redundant network equipment provided in the facility.
- Develop disaster recovery plans for servers and services.
- Identify staff who are authorized for remote and on-site (escorted) access to the facility and systems co-located therein.
- Define an escalation path outlining who should be contacted and when in the event of problems with systems that are monitored by UDC staff.
- Abide by security procedures that control access to the facility.
- Maintain systems according to the Minimum Security Standards for Systems.
- Manage the hardware lifecycle of systems co-located in the UDC.
Cost of Service
Cost information for this service can be found on the University Data Center Web site.
Access Controls
Access controls are the means by which the ability to use, create, modify, view, etc., is explicitly enabled or restricted in some way (usually through physical and system-based controls).
Custodian
Guardian or caretaker; the holder of data, the agent charged with implementing the controls specified by the owner. The custodian is responsible for the processing and storage of information. The custodians of information resources, including entities providing outsourced information resources services to the university, must:
- Implement the controls specified by the owner(s).
- Provide physical and procedural safeguards for the information resources.
- Assist owners in evaluating the cost-effectiveness of controls and monitoring.
- Implement the monitoring techniques and procedures for detecting, reporting, and investigating incidents.
Data Steward
University representatives, such as faculty, staff, or researchers, who are tasked with managing administrative and/or research data owned by the university. Such data is to be managed by a data steward as a university resource and asset. The data steward has the responsibility of ensuring that the appropriate steps are taken to protect the data and that respective policies and guidelines are being properly implemented. Data Stewards may delegate the implementation of university policies and guidelines to professionally trained campus or departmental IT custodians.
Information Technology Resources
Any and all computer printouts, online display devices, mass storage media, and all computer-related activities involving any device capable of receiving e-mail, browsing web sites, or otherwise capable of receiving, storing, managing, or transmitting data including, but not limited to, mainframes, servers, personal computers, notebook computers, hand-held computers, PDAs, pagers, distributed processing systems, network attached and computer controlled medical and laboratory equipment (that is, embedded technology), telecommunication resources, network environments, telephones, fax machines, printers, and service bureaus. Additionally, it is the procedures, equipment, facilities, software, and data that are designed, built, operated, and maintained to create, collect, record, process, store, retrieve, display, and transmit information.
Information Technology Resources Facilities
Any location that houses information technology resource equipment (includes servers, hubs, switches, and routers). Facilities are usually dedicated rooms or mechanical/wiring closets in the buildings.
Information Systems
An interconnected set of information resources under the same direct management control that shares common functionality. An Information System normally includes hardware, software, information, data, applications, communications, and people.
Network Custodian
Network manager or analyst; the holder of network configuration data, the agent charged with implementing the network controls and services specified by the owner or the university. This custodian is responsible for the transfer of information. These custodians, including entities providing outsourced information resources services to the university, must:
- Implement the network controls specified by the owner or the university.
- Provide physical and procedural safeguards for the network infrastructure.
- Assist owners in evaluating the cost-effectiveness of controls and monitoring.
- Implement the monitoring techniques and procedures for detecting, reporting, and investigating or troubleshooting network incidents.
Owner
The authoritative head of the respective college, school, or unit. The owner is responsible for the function that is supported by the resource or for carrying out the program that uses the resources. The owner of a collection of information is the person responsible for the business results of that system or the business use of the information. Where appropriate, ownership may be shared by managers of different departments. The owner or his designated representatives are responsible for and authorized to:
- Approve access and formally assign custody of an information resources asset.
- Determine the asset's value.
- Specify and establish data control requirements that provide security, and convey them to users and custodians.
- Specify appropriate controls, based on risk assessment, to protect the state's information resources from unauthorized modification, deletion, or disclosure. Controls shall extend to information resources outsourced by the university.
- Confirm that controls are in place to ensure the accuracy, authenticity, and integrity of data.
- Confirm compliance with applicable controls.
- Assign custody of information resources assets and provide appropriate authority to implement security controls and procedures.
- Review access lists based on documented security risk management decisions.
System Administrator
Person responsible for the effective operation and maintenance of Information Technology Resources, including implementation of standard procedures and controls, to enforce the university's security policy.