November 05, 2012
SANTA CLARA, Calif., Nov. 5 – As big data, high-performance computing (HPC) and cloud-computing applications push demand for real-time access to zettabytes of data, Intel Corporation announced today its next-generation data center solid-state drive (SSD), the Intel Solid-State Drive DC S3700 Series, designed to remove storage bottlenecks and maximize multi-core CPU performance. The Intel SSD DC S3700 Series delivers fast, consistent performance and low latencies along with strong data protection and high endurance to help IT personnel support today’s most demanding data center applications.
“Today’s data explosion creates unique storage challenges for data center professionals,” said Rob Crooke, Intel vice president and general manager for the Intel Non-Volatile Memory (NVM) Solutions Group. “High latencies and slow storage I/O can cripple data centers’ ability to deliver exciting big data or cloud-computing applications with fast, low latency data access. Intel’s next-generation Intel SSD DC S3700 Series breaks through SSD limitations for the data center on all fronts – fast, consistent performance, strong data protection and high endurance -- so IT professionals can deliver on their most demanding technology initiatives.”
The key to the Intel SSD DC S3700 Series’ consistently fast performance is a tight distribution of input/output operations per second (IOPS) with low maximum latencies. The Intel SSD DC S3700 feeds I/O-starved applications with 4KB random read performance of up to 75,000 IOPS and 4KB random write performance of up to 36,000 IOPS. With a typical sequential write latency of 65 microseconds and quality of service (QoS) of less than 500 microseconds 99.9 percent of the time, the Intel SSD DC S3700 ensures quick and consistent application response times.
This accelerated storage performance gives parallel multithreaded computing increased storage throughput to keep multicore CPUs more active. This reduces lapses in response time for end users for a smoother computing experience. For IT/data center professionals who must also worry about data protection and maximum security, the Intel SSD DC S3700 Series offers full end-to-end data protection and 256-bit Advanced Encryption Standard (AES) capability. To further improve reliability, the Intel SSD DC S3700 incorporates an array of surplus flash memory used for data redundancy to minimize potential data loss.
The drive incorporates Intel High Endurance Technology (HET) to deliver single-level cell (SLC) SSD-like endurance in more cost-effective multi-level cell (MLC) technology. By combining SSD NAND management techniques with NAND silicon enhancements, HET enables the Intel SSD DC S3700 Series to sustain 10 full drive writes per day over the 5-year life of the drive. This is the equivalent of recording more than 186 years of HD video over the life of the highest-capacity 800GB drive.
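The endurance claim above can be sanity-checked with simple arithmetic. The figures come from the press release; the HD video bitrate is an assumption (roughly 2.5 MB/s, about 20 Mbps), chosen only to show the 186-year figure is in the right ballpark:

```python
# Sanity check of the endurance claim: 10 full drive writes per day
# for 5 years on the 800GB model (figures from the press release).
capacity_gb = 800
drive_writes_per_day = 10
years = 5

total_written_gb = capacity_gb * drive_writes_per_day * 365 * years
print(f"Total data written: {total_written_gb / 1e6:.1f} PB")  # 14.6 PB

# Assumed HD video bitrate of ~2.5 MB/s (~20 Mbps); not stated by Intel.
hd_rate_mb_s = 2.5
seconds_of_video = total_written_gb * 1000 / hd_rate_mb_s
years_of_video = seconds_of_video / (365 * 24 * 3600)
print(f"Years of HD video: {years_of_video:.0f}")
```

At that assumed bitrate the total works out to roughly 185 years of footage, consistent with the "more than 186 years" claim at a slightly lower bitrate.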
In addition, the SSD improves overall power efficiency. The Intel SSD DC S3700 Series reduces typical active power consumption to 6 watts and idle power to 650 milliwatts, which reduces heat and therefore lowers both energy and cooling costs.
The Intel SSD DC S3700 Series is a 6 gigabit-per-second (Gbps) SATA drive with sequential transfer rates of up to 500 megabytes per second (MB/s) for reads and 460 MB/s for writes. It delivers up to 2x the read and 15x the write performance of the previous-generation Intel SSD 710 Series. Samples of the product are now available for data center customers to begin quality and validation cycles. General production availability is expected to begin by the end of the year, with volume production in the first quarter of 2013.
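Those sequential rates sit close to the practical ceiling of the interface. A quick sketch, relying only on the well-known fact that SATA uses 8b/10b line encoding (8 payload bits per 10 transmitted bits):

```python
# Why ~500 MB/s reads are near the SATA 6 Gbps limit:
# 8b/10b encoding means only 8 of every 10 line bits carry data.
line_rate_gbps = 6.0
encoding_efficiency = 8 / 10

max_payload_mb_s = line_rate_gbps * 1e9 * encoding_efficiency / 8 / 1e6
print(f"Theoretical payload ceiling: {max_payload_mb_s:.0f} MB/s")  # 600 MB/s

utilization = 500 / max_payload_mb_s
print(f"Quoted 500 MB/s read rate is ~{utilization:.0%} of the link ceiling")
```

Protocol overhead (framing, command handling) consumes part of the remaining headroom, so 500 MB/s is close to what a 6 Gbps SATA device can realistically deliver.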
The product comes in a 2.5-inch form factor for all capacities (100, 200, 400 and 800 gigabytes (GB)) and in a 1.8-inch form factor in 200GB and 400GB capacities. The recommended channel pricing (MSRP) for the 2.5-inch Intel SSD DC S3700 Series is as follows: $235 for the 100GB capacity; $470 for 200GB; $940 for 400GB; and $1,880 for 800GB, based on 1,000-unit quantities. The 1.8-inch drive MSRP is $495 for the 200GB capacity and $965 for 400GB. Prices include a 5-year limited warranty.
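The quoted MSRPs scale linearly with capacity, which a quick per-gigabyte calculation makes plain (prices as listed above):

```python
# Per-gigabyte pricing from the quoted 1,000-unit MSRPs.
msrp_25in = {100: 235, 200: 470, 400: 940, 800: 1880}  # 2.5-inch models
msrp_18in = {200: 495, 400: 965}                        # 1.8-inch models

for gb, usd in msrp_25in.items():
    print(f"2.5in {gb}GB: ${usd / gb:.2f}/GB")  # $2.35/GB at every capacity
for gb, usd in msrp_18in.items():
    print(f"1.8in {gb}GB: ${usd / gb:.3f}/GB")
```

Every 2.5-inch capacity lands at exactly $2.35/GB, while the 1.8-inch models carry a small form-factor premium.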
Intel is a world leader in computing innovation. The company designs and builds the essential technologies that serve as the foundation for the world’s computing devices.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private-sector industry least likely to adopt public cloud services for data storage is financial services. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services – and doing so at great cost.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.