Apache Hadoop* Community Spotlight: Apache HDFS*

Unfortunately, this PDF is available only as a download.

Konstantin Shvachko, Project Management Committee member for the Apache Hadoop* framework and founder of AltoScale, demystifies the Apache Hadoop* Distributed File System (HDFS*) and discusses where its development is headed. HDFS is the primary distributed storage component used by applications built on the open-source Apache Hadoop project. In this overview, an expert from the Apache Hadoop open-source community explains the four design principles that drive development, how HDFS works, why it is so well suited to handling large unstructured data sets, and where the software is headed. Part of the Intel® IT Center’s Hadoop Community Spotlight series. Also listen to the podcast of the interview.