Server (computing)
A server is a computer that provides information to other computers, called "clients", on a computer network.[1] This architecture is called the client–server model. Servers can provide various functionalities, often called "services", such as sharing data or resources among multiple clients or performing computations for a client. A single server can serve multiple clients, and a single client can use multiple servers. A client process may run on the same device or may connect over a network to a server on a different device.[2] Typical servers are database servers, file servers, mail servers, print servers, web servers, game servers, and application servers.[3]
Client–server systems are most frequently implemented by (and often identified with) the request–response model: a client sends a request to the server, which performs some action and sends a response back to the client, typically with a result or acknowledgment. Designating a computer as "server-class hardware" implies that it is specialized for running servers. This often implies that it is more powerful and reliable than standard personal computers, but alternatively, large computing clusters may be composed of many relatively simple, replaceable server components.
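The request–response exchange can be illustrated with a short sketch. The following Python example is a minimal illustration, not the implementation of any particular server; the loopback address, port 5000, and message contents are arbitrary choices for the example. One process plays both roles: a background thread acts as the server, accepting a single connection and answering its request, while the main thread acts as the client.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 5000  # hypothetical address for local testing

    # Server side: bind and listen first, so the client cannot connect too early.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()

    def handle_one_client():
        conn, _addr = srv.accept()            # wait for a client to connect
        with conn:
            request = conn.recv(1024)         # read the client's request
            conn.sendall(b"ACK: " + request)  # send back a response

    threading.Thread(target=handle_one_client, daemon=True).start()

    # Client side: send a request and wait for the response.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello server")          # the request
        print(cli.recv(1024))                 # the response: b'ACK: hello server'

    srv.close()

A production server would instead accept connections in a loop and typically serve many clients concurrently, which is part of why server-class hardware emphasizes capacity and reliability.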
On the Internet, the dominant operating systems among servers are UNIX-like open-source distributions, such as those based on Linux and FreeBSD,[19] with Windows Server also having a significant share. Proprietary operating systems such as z/OS and macOS Server are also deployed, but in much smaller numbers. Servers running Linux are commonly used as web servers or database servers, while Windows Server is often used in networks composed largely of Windows clients.
Specialist server-oriented operating systems have traditionally had features such as an optional or absent graphical user interface; the ability to reconfigure and update both hardware and software to some extent without restarting; advanced backup facilities to permit regular and frequent online backups of critical data; transparent data transfer between different volumes or devices; flexible and advanced networking capabilities; automation capabilities such as daemons in UNIX and services in Windows; and tight system security, with advanced user, resource, data, and memory protection.
In practice, many desktop and server operating systems today share similar code bases, differing mostly in configuration.
Energy consumption
In 2010, data centers (servers, cooling, and other electrical infrastructure) were responsible for 1.1–1.5% of electrical energy consumption worldwide and 1.7–2.2% in the United States.[21] One estimate is that the total energy consumption of information and communications technology saves more than five times its own carbon footprint in the rest of the economy by increasing efficiency.[22]
Global energy consumption is rising with the growing demand for data and bandwidth. The Natural Resources Defense Council (NRDC) states that data centers in the United States used 91 billion kilowatt-hours (kWh) of electricity in 2013, and data centers have been estimated to account for about 3% of global electricity usage.
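As a rough sanity check, the NRDC figure can be compared with total US electricity use; the 4,100 billion kWh total below is an outside ballpark assumption for illustration, not a figure from this article.

    # Rough sanity check of the NRDC figure. The US total is an assumed
    # ballpark (~4,100 billion kWh in 2013), not a figure from this article.
    data_center_kwh = 91e9   # NRDC: US data centers, 2013
    us_total_kwh = 4.1e12    # assumed total US electricity use, 2013

    share = data_center_kwh / us_total_kwh
    print(f"{share:.1%}")    # ~2.2%, in line with the 1.7-2.2%
                             # US range cited above for 2010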
Environmental groups have focused on the carbon emissions of data centers, which account for about 200 million metric tons of carbon dioxide per year.