December 04, 2006
To maximize the intended benefits of
service-oriented architecture (SOA) implementations, organizations need
to develop a well-articulated SOA quality strategy to promote trust and
reuse, according to new research from analyst firm Hurwitz &
Associates. The report, titled "Executive Survey: SOA Implementation
Satisfaction," authored by Carol Baroudi and Dr. Fern Halper, confirms
that the top drivers for SOA adoption include the expectation of greater
reuse of existing and newly built Web services, business flexibility,
and ease and speed of integration -- with nearly 90 percent of
respondents pointing to service reuse as their number one concern.
"The Hurwitz research suggests that organizations that implement an SOA and have a quality plan in place are more likely to be completely satisfied with the quality of their SOA," said report co-author Carol Baroudi of Hurwitz & Associates. "Creating a viable SOA is contingent on the ability to ensure quality and gain trust. Without a rigorously enforced, well-articulated quality strategy, companies will be hard-pressed to scale their SOA implementations and realize the intended benefits."
In September and October of 2006, Hurwitz & Associates surveyed ninety-nine IT executives from North American and UK companies with more than 250 employees that had expressed an interest in SOA or Web services. Approximately two-thirds of respondents (66 percent) had begun their SOA journey. The research notes that nearly half (47 percent) of the respondents who had implemented SOA cited some dissatisfaction with their reuse, attributing it to a lack of planning and business goals and to a lack of understanding of what services are available for reuse. Lack of governance and lack of standards also contributed to dissatisfaction with SOA implementations. Respondents who had implemented an SOA and had a quality plan in place were more likely to be completely satisfied with the quality of their SOA.
The independent research was sponsored by Mindreef Inc., a provider of SOAPscope and SOAPscope Server solutions for Web services testing and SOA quality. Teams at Wachovia, Fidelity National Financial, Charles Schwab, IBM, Valero Energy, and more than 3,000 customers at over 1,200 organizations worldwide use Mindreef products to build, test, and maintain Web services and SOAs.
"We firmly believe that SOA Quality is achieved through continual optimization of all components within an SOA environment to ensure maximum adoption, business agility and service reuse," said Frank Grossman, co-founder and president of Mindreef. "The research underscores the importance of SOA Quality as a key component of reaping the intended benefits of an SOA -- it's the strategy needed to achieve maximum business benefit -- and that companies will be more satisfied if they implement a well-articulated plan to drive pervasive quality throughout their SOA initiatives, to obtain the agility and reuse that yield a significant and immediate ROI."
Although the SOA registry and repository are instrumental in promoting reuse, nearly 50 percent of respondents stated that they have no registry or repository solution in place or use an in-house solution. According to Hurwitz & Associates, the functionality of both is critical to promoting reuse, and companies should look for a standards-based product to provide these functions. The research also suggests that companies with a formal solution for both a registry and a repository are more likely to have their expectations for reuse met than those with no solution.
The Hurwitz & Associates analysis further notes that organizations will benefit from a better understanding of the nature of service reuse in an SOA, which opens up new possibilities through dynamic combinations of components -- in an environment where not all combinations will ever be tested. With the testing of SOA environments "taking a quantum leap in complexity," the trust needed for SOA to attain widespread use will be built from the successful use of well-designed and well-tested services.
"Hurwitz & Associates believes that creating a viable SOA initiative is contingent upon the ability to ensure software quality and gain the trust of all constituents," said report co-author Dr. Fern Halper of Hurwitz & Associates. "Organizations keen on reusing software components need to ensure they have been designed for reuse, tested in an SOA environment, described and published in an easily searchable registry/repository, and are compliant with the policies, procedures and regulations that govern their use."
For more information or to download a free copy of the report, visit www.mindreef.com/report.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that gives end-users the ability to aggregate heterogeneous resources and apply them to large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.