Despite my lack of technical knowledge, I know enough to see the major problems with the way we do computing on a global scale.
On the user's level it all seems fine - everything you own and manage is probably already in the cloud of whatever storage service you use, leaving your hard drive considerably less stuffed than it was ten years ago, when we had to back everything up to CDs just to make room for a new game or movie.
However, our data centres are getting bigger and bigger, stuffed with more and more servers that draw more and more energy from the power plants. The current solution to the storage problem is simply to build more centres. That has worked well enough so far, but it won't for much longer, and we're in dire need of new solutions.
Between the TV, Xbox, PlayStation, phone, tablet, laptop and desktop computer, humanity simply produces far more data than it can effectively store. So it deletes most of it, rather than using it for research and progress.
However, even the "most important" data that ends up in the cloud will soon outgrow our capacity to store and process it. Even if we build more centres and pack them full of servers, they will require so much energy to operate that we'd have to black out our cities just to keep our computers running.
Right now, storing, moving and processing our data consumes about as much energy as the whole of Japan.
During HP Discover 2014, Hewlett Packard made it clear that it plans to change that with its newest project - The Machine.
*Note: The name was thought up by engineers, so have pity on them.
The new system relies on three revolutionary sub-systems to deliver on its bold promises:
1. Rather than using generalised cores that process a variety of data types simultaneously, The Machine uses clusters of specialised cores, each handling one specific type of information. That way each core can be designed and optimised for a single task, which increases its efficiency and speed (a toy sketch of this idea appears after this list).
2. The energy-hungry and slow copper wiring is replaced with silicon photonics connectors. That allows much faster, much cheaper transfer of information between the internal components of the system, and it would also need far less cooling power to keep in check than standard copper.
3. Memristors take the lead in memory chips. We already know HP Labs is not too good with names, so let's skip to what "memristors" actually are. Rather than trying to make things smaller and smaller, which is the current direction of computing advancement, HP has set off towards a completely different milestone.
The memristor - short for "memory resistor" - is a component capable of both data processing and storage. It retains its state even with the power off, and its best feature yet is three-dimensional stacking.
Stacking memory in three dimensions will massively increase the memory density per unit of volume in future generations of systems.
The memristor is designed to achieve RAM-like speed while storing data permanently, the way flash memory does. A toy simulation below shows the underlying idea.
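To make that less abstract, here is a minimal sketch of the linear drift memristor model that HP Labs researchers published in 2008. The parameter values are made up for illustration and have nothing to do with The Machine itself; the point is only that the device's resistance depends on the current that has flowed through it in the past, and that this state survives when the drive voltage goes to zero.

```python
# A toy simulation of the linear drift memristor model (Strukov et al.,
# HP Labs, 2008). Parameter values are illustrative, not from the talk.
import math

R_ON, R_OFF = 100.0, 16_000.0   # ohms: fully doped / undoped resistance
D = 10e-9                        # m: device thickness
MU_V = 1e-14                     # m^2 s^-1 V^-1: dopant mobility
DT = 1e-5                        # s: integration step

w = 0.1 * D                      # initial width of the doped region

def memristance(w):
    """Resistance depends on the internal state w, i.e. on past current."""
    x = w / D
    return R_ON * x + R_OFF * (1.0 - x)

for step in range(200_000):      # simulate 2 s, i.e. two drive cycles
    t = step * DT
    v = 1.0 * math.sin(2 * math.pi * 1.0 * t)   # 1 V, 1 Hz sine drive
    i = v / memristance(w)
    w += MU_V * (R_ON / D) * i * DT             # dopant drift moves the state
    w = min(max(w, 0.0), D)                     # clamp to the physical device

print(f"final memristance: {memristance(w):.0f} ohms")
# Cut the power and w stays where it is: the stored resistance value
# survives, which is why memristors can double as non-volatile storage.
```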
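And going back to point 1, here is the promised sketch of what "specialised cores" means, purely as a concept. The core names and tasks are hypothetical, and ordinary Python functions stand in for hardware units; this is not how The Machine actually dispatches work, it just shows the routing idea.

```python
# A minimal sketch of the "specialised cores" idea: route each kind of
# work to a unit built just for it, instead of one general-purpose core
# doing everything. Core names and tasks here are hypothetical.
from typing import Callable, Dict

def video_core(payload: bytes) -> str:
    return f"decoded {len(payload)} bytes of video"   # stand-in for fixed-function logic

def crypto_core(payload: bytes) -> str:
    return f"hashed {len(payload)} bytes"             # stand-in for a crypto block

SPECIALISED: Dict[str, Callable[[bytes], str]] = {
    "video": video_core,
    "crypto": crypto_core,
}

def dispatch(kind: str, payload: bytes) -> str:
    """Send work to the core designed for it; fall back to a slow general path."""
    core = SPECIALISED.get(kind)
    return core(payload) if core else f"general core handled {kind}"

print(dispatch("video", b"\x00" * 1024))
print(dispatch("text", b"hello"))
```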
These three components, along with numerous others still in development over at HP Labs, combine into a computer that HP says is six times more powerful than current-generation servers while requiring eighty times less energy to run - roughly 480 times the performance per watt, if both figures hold.
Furthermore, this flexible architecture allows The Machine to be integrated into any type of computing device, from phones and tablets to data centre servers and supercomputers.
And since my technical knowledge is at its limit trying to translate the awesomeness of this project, take a look at the full presentation from HP Discover 2014.