News and opinion in Distributed Systems
While the path to cloud computing varies, many enterprises, as well as consumers, were already on their way before they really knew what cloud computing meant.
Sixty-eight percent of information workers who built an app on their own said they completed the work in less than a week.
From more affordable virtualization to cloud everything to gigabit wireless to NAS appliances, small businesses have a lot to watch for in 2012.
(PhysOrg.com) Researchers have a startlingly upbeat idea for data center managers coping with packed rooms, Internet traffic bursts, and the looming cost of reconfiguring data center designs. The researchers find that data centers can bounce wireless data signals off the ceiling, enhancing data transmission speeds by 30 percent.
Developers, developers, developers, developers
Fifty-six percent of companies favor a cloud computing model to gain greater control over their information.
Less and less of today's computing is done on desktop computers; cloud computing, in which operations are carried out on a network of shared, remote servers, is expected to rise as the demand for computing power increases. This raises some crucial questions about security: Can we, for instance, perform computations on data stored in 'the cloud' without letting anyone else see our information? Research carried out at the Weizmann Institute and MIT is moving us closer to the ability to work on data while it is still encrypted, giving an encrypted result that can later be securely deciphered.
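The idea of computing on data without decrypting it can be illustrated with a much simpler, long-known building block (not the Weizmann/MIT scheme itself): textbook unpadded RSA is multiplicatively homomorphic, meaning the product of two ciphertexts decrypts to the product of the plaintexts. The sketch below uses deliberately tiny, insecure parameters purely to show the principle that a server can combine encrypted values it cannot read.

```python
# Toy demonstration of computing on encrypted data.
# Textbook (unpadded) RSA satisfies E(a) * E(b) mod n = E(a * b),
# so a server can multiply ciphertexts without seeing a or b.
# This is NOT the fully homomorphic scheme described in the
# research above, and unpadded RSA with tiny primes is insecure;
# it only illustrates the homomorphic principle.

p, q = 61, 53                       # small primes (demo only)
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# The "cloud" operates on ciphertexts only:
c_product = (ca * cb) % n

print(decrypt(c_product))           # recovers a * b = 42
```

Fully homomorphic encryption, as pursued in the research above, extends this to both addition and multiplication, which is enough to evaluate arbitrary computations on encrypted inputs.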
Forrester Research analyst Sarah Rotman Epps puts aside the tablet talk to discuss five computing form factors she sees possibly gaining momentum in the future.
Independa has unveiled its Artemis system of sensors to allow for remote monitoring of seniors' vital data in the cloud.
Hubspan predicted TCO would continue to be the primary driver of cloud adoption due to scalable costs.
With FedRAMP, federal agencies will be able to evaluate and monitor cloud providers to ensure their services meet minimum security standards.
A new tool developed by Antonio J. Peña, a doctoral student at the Universitat Politècnica de València and the Universitat Jaume I de Castelló, was one of the highlights of the SuperComputing 2011 exhibition held in Seattle. The tool provides remote access to graphics accelerators in a high-performance computing cluster, allowing the hundreds or thousands of nodes that make up a cluster to share the graphics accelerators installed in it, with the resulting savings in energy and maintenance, the researcher notes.