Energy efficiency and high performance must be balanced to meet today's enormous data storage and processing requirements. Striking that balance also demands close coordination between companies like Dell and Intel. If Intel produces processors with specifications that Dell isn't ready to build into its hardware, Intel takes the loss. Because Intel sits at the back end of the supply chain, with end-product manufacturers like Dell as its customers, it needs Dell's approval before committing to production. Intel and Dell therefore have to decide jointly what balance of energy efficiency and high performance is appropriate.
Now that TGen has broken new ground in genomic research by using Dell's storage, cloud, and high-performance computing solutions, the institute discusses what comes next for it and for personalized medicine.
The Translational Genomics Research Institute wanted to save lives, but its efforts were hobbled by immense computing challenges related to collecting, processing, sharing, and storing enormous amounts of data.
At the GigaOM Structure conference, a startup announced an approach to optimizing storage for cloud and virtualization that shows there's still a lot of thinking to be done about how storage joins the virtual world.
We always hear about "big" data, but a real issue in cloud storage is not just bigness but persistence. A large data model is less complicated than a big application repository that somehow needs to be accessed. Hadoop's send-the-program-to-the-data model may be the answer.
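The send-the-program-to-the-data idea can be illustrated with a minimal map/reduce sketch: rather than pulling every byte to a central client, each storage node runs the computation locally over its own shard, and only the small intermediate results travel. This is a conceptual Python sketch, not Hadoop's actual API; the shard contents and function names are illustrative.

```python
from collections import Counter

def map_phase(shard):
    """Runs where the shard lives: emit word counts for one local shard."""
    return Counter(shard.split())

def reduce_phase(partials):
    """Only the small per-shard counters travel back to the reducer."""
    total = Counter()
    for p in partials:
        total.update(p)
    return total

# Hypothetical data already resident on three storage nodes.
shards = [
    "storage moves to the cloud",
    "the cloud stores big data",
    "big data needs persistence",
]

partials = [map_phase(s) for s in shards]  # computation goes to the data
counts = reduce_phase(partials)            # small results come back
```

The point of the pattern is bandwidth: shipping a few kilobytes of program and a few kilobytes of partial results is far cheaper than shipping terabytes of persistent data to wherever the program happens to run.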
EMC's Project Lightning has matured into a product set, and it's important less because it adds new storage technology or management capabilities than because it may package the state of the art in a way more businesses can deploy.