Ahan. Processes and tasks no doubt differ between pharma and FS, but there is certainly a correlation of processes that holds for any organization: for example, a purchase order is raised in pharma where an account-opening request would be raised in FS, and the subsequent authorizations are more or less the same. Therefore, there shouldn't be a problem in benchmarking the networking and processing of one sector against the other.
@waqasaltaf: I recently interviewed an expert in low-latency networking. He mentioned that some of the networking and processing techniques being pioneered in financial services are now heading to pharmaceutical companies.
The main emphasis of the financial services industry is on real-time processing and efficiency, and this need goes well beyond the requirements of other industries. One industry I can think of that requires considerable real-time processing is aviation. FMCGs and healthcare also require high-speed processing, but it is not comparable to the requirements of financial services.
Now that TGen has broken new ground in genomic research using Dell's storage, cloud, and high-performance computing solutions, the institute discusses what comes next for it and for personalized medicine.
The Translational Genomics Research Institute wanted to save lives, but its efforts were hobbled by immense computing challenges related to collecting, processing, sharing, and storing enormous amounts of data.
At the GigaOM Structure conference, a startup announced a storage-optimization approach for cloud and virtualization that shows there's still a lot of thinking to be done on how storage joins the virtual world.
We always hear about "big" data, but a real issue in cloud storage is not just size but also persistence. A large data model is less complicated than a big application repository that somehow needs to be accessed. Hadoop's send-program-to-data model may be the answer.
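The send-program-to-data idea can be illustrated with a minimal sketch (this is plain Python, not Hadoop code; the partitions and the `word_count` function are hypothetical): rather than shipping large data partitions across the network to one compute node, a small function is shipped to each node where the data already lives, and only the small per-partition results travel back.

```python
def send_program_to_data(partitions, program):
    """Run `program` locally on each data partition; ship back only the results.

    In a real Hadoop cluster each call would execute on the node holding that
    partition; here the loop simply stands in for that distribution.
    """
    return [program(part) for part in partitions]


# Hypothetical example: three partitions of log words on three data nodes.
partitions = [
    ["storage", "cloud", "storage"],
    ["cloud", "hadoop"],
    ["storage"],
]

# The "program" is tiny (count words); the partitions could be huge.
local_counts = send_program_to_data(partitions, len)
total = sum(local_counts)  # only small integers crossed the "network"
```

The payoff is that network traffic scales with the size of the results, not the size of the data.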
EMC's Project Lightning has matured into a product set. It is important not so much because it adds new features or capabilities in storage technology and management, but because it may package the state of the art in a way more businesses can deploy.