@Marif: I agree! This is a kind of trial-and-error method of finding solutions to problems. I understand what you and Sunita are driving at here. It is better to come up with a solution immediately. Even though there is a big possibility that it won't solve the problem, at least something is being done; at least you are being productive. What you then need to do is fix or correct the errors so that you can come up with a better, more effective solution. It is better to act right away instead of wasting hours upon hours thinking of solutions you're not sure will work out!
You are right, SunitaT. I would also prefer to spend the initial time applying an immediate fix to the problem, even if it is not a proper solution, so that if the error is critical it will not affect other areas as well. After the initial resolution or patch, an in-depth analysis should be done to prevent any recurrence. If an End of Day process has stopped because of an error, the first priority should be to get it completed before the day starts, so that it does not halt the overall process. After that, the malfunction can be identified and rectified.
To solve a problem, the main issue has to be scrutinized deeply. Given a time frame of, let's say, one day, one would rather spend 18 hours finding the root cause of the problem in order to solve it exhaustively. The methodology of trying options may magnify the problem even further, at least according to my analysis.
I think that's a bit over the top, but yes, you have a point. Data that I might think is garbage might be valuable to someone else. But my take, after some projects centered on getting storage and data under control, is that if you haven't touched it in a year and it takes 10 minutes to re-create, then there's no reason for it to be taking up space. It's also doubtful that you need 10 copies of the same document spread across your network. There are many little tricks to cleaning up data stores.
Now that TGen has broken new ground in genomic research by using Dell's storage, cloud, and high-performance computing solutions, the company discusses what will come next for it and for personalized medicine.
The Translational Genomics Research Institute wanted to save lives, but its efforts were hobbled by immense computing challenges related to collecting, processing, sharing, and storing enormous amounts of data.
Office and personal productivity tools come in a first-class and coach flavor set, but what makes the difference is primarily little things that most users won't encounter. What's the big issue in using something other than Office, and can you get around it?
We really don't want an "Internet of Everything," but even building an Internet of Everything that is useful means setting some ground rules to ensure there's value in the process and that costs and risks are minimized.
Google's Chrome OS has a lot of potential value and a lot of recent press, but it still needs something to make it more than a thin client. It needs cloud integration, it needs extended APIs via web services, and it needs to suck it up and support a hard drive.
On a recent African trip I saw examples of the value of the cloud in developing nations, for educational and community development programs. We could build on this, but not only in developing economies, because these same programs are often under-supported even in first-world countries.
VMware's debate with Cisco on SDN might finally create a fusion between an SDN view that's all about software and another that's all about network equipment. That would be good for every enterprise considering the cloud and SDN.
Wearing a bulky, oversized watch is good training for the next phase in wristwatches: the Internet-enabled, connected watch. Why the smartphone-tethered connected watch makes sense, plus Ivan demos an entirely new concept for the "smart watch."
Cloud storage costs are determined primarily by the rate at which files are changed and the possibility of concurrent access/update. If you can structure your storage use to optimize these factors you can cut costs, perhaps to zero.
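The claim above can be illustrated with a toy cost model. Everything here is an assumption for illustration: the prices and the `monthly_cost` function are made up and do not reflect any real provider's rates, but they show why rarely changed data in a cheap tier drives costs toward zero.

```python
# Toy cloud-storage cost model (illustrative only; all prices are
# hypothetical assumptions, not any real provider's rates).

def monthly_cost(gb_stored, change_rate, storage_price=0.02, op_price=0.05):
    """Rough monthly cost estimate for one dataset.

    gb_stored     -- data volume in GB
    change_rate   -- fraction of the data rewritten per month (0.0 to 1.0)
    storage_price -- assumed $/GB-month for the storage tier
    op_price      -- assumed $/GB of rewritten (changed) data
    """
    # At-rest cost plus a churn cost proportional to how much data changes.
    return gb_stored * storage_price + gb_stored * change_rate * op_price

# Frequently rewritten data on a standard tier:
hot = monthly_cost(1000, change_rate=0.5)
# The same volume, never changed, parked on a cheaper archive-style tier:
cold = monthly_cost(1000, change_rate=0.0, storage_price=0.004)
```

Under these assumed numbers, the rarely changed dataset costs roughly a tenth of the churning one, which is the structuring opportunity the blurb describes.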
The Internet has evolved into a machine for drumming up a chorus of "Happy Birthday" messages, from family, friends, friends of friends who you added on Facebook, random people that you circled on G+, and increasingly, automated bots. Enough already.