Changes between Version 26 and Version 27 of GEC21Agenda/EveningDemoSession


Timestamp: 10/20/14 11:07:55 (10 years ago)
Author: xuan.liu@mail.umkc.edu
Comment:

  • GEC21Agenda/EveningDemoSession

    v26 v27  
    143 143  * Derek Meyer, dmeyer@cs.wisc.edu, Wisconsin Wireless and NetworkinG Systems (WiNGS) Laboratory
    144 144
    145 ==== Middleware for Hadoop-in-a-Hybrid-Cloud ====
     145 ==== Hadoop-in-a-Hybrid-Cloud ====
    146 146
    147 !MapReduce is a programming model for processing and generating large data sets, and Hadoop, a !MapReduce implementation, is a good tool for handling Big Data. Cloud computing, with its ubiquity and its low-cost, on-demand, dynamic resource provisioning, is a promising environment for processing big data. However, running Hadoop in the cloud takes time and requires technical knowledge from users. The hybrid cloud adds to these requirements, because it is necessary to evaluate the resources in the private cloud and, if needed, obtain and prepare on-demand resources in the public cloud. Moreover, the simultaneous management of private and public domains requires an appropriate model that combines performance with minimal cost. We propose an architecture to orchestrate Hadoop applications in hybrid clouds. The core of the model consists of a web portal for submissions, an orchestration engine, and an execution services factory. Through these three components it is possible to automate the preparation of a cross-domain cluster: provisioning the files involved, managing the execution of the application, and making the results available to the user. In this demo, we will show the web portal interface and how to use this portal to create VMs and initialize them as Hadoop workers.
     147 !MapReduce is a programming model for processing and generating large data sets, and Hadoop, a !MapReduce implementation, is a good tool for handling Big Data. Cloud computing, with its ubiquity and its low-cost, on-demand, dynamic resource provisioning, is a promising environment for processing big data. However, running Hadoop in the cloud takes time and requires technical knowledge from users. The hybrid cloud adds to these requirements, because it is necessary to evaluate the resources in the private cloud and, if needed, obtain and prepare on-demand resources in the public cloud. Moreover, the simultaneous management of private and public domains requires an appropriate model that combines performance with minimal cost. We propose an architecture to orchestrate Hadoop applications in hybrid clouds, which here combine a private cloud at Unicamp and UMKC with GENI as the public cloud. The core of the model consists of a web portal for submissions, an orchestration engine, and an execution services factory. Through these three components it is possible to automate the preparation of a cross-domain cluster: provisioning the files involved, managing the execution of the application, and making the results available to the user. In this demo, we will show the web portal interface and how to use this portal to create VMs and initialize them as Hadoop workers.
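
The last step the abstract mentions, creating VMs and initializing them as Hadoop workers, essentially means pointing each new node at the cluster's master and starting its worker daemons. Below is a minimal sketch of that step, assuming a Hadoop 2.x layout; the HADOOP_HOME path, master hostname, and HDFS port are illustrative placeholders, not details taken from the demo.

{{{#!python
#!/usr/bin/env python
"""Hypothetical sketch: turn a freshly created VM into a Hadoop worker by
pointing it at an existing master and starting the worker daemons.
Paths, hostnames, and ports are placeholders (assumptions), not demo details."""
import subprocess

HADOOP_HOME = "/opt/hadoop"                # assumed install location on the new VM
MASTER_HOST = "hadoop-master.example.org"  # placeholder for the cluster's master node

CORE_SITE = """<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://{master}:9000</value>
  </property>
</configuration>
""".format(master=MASTER_HOST)

YARN_SITE = """<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>{master}</value>
  </property>
</configuration>
""".format(master=MASTER_HOST)

def init_worker():
    # Write the two config files that tie this VM to the existing cluster.
    with open(HADOOP_HOME + "/etc/hadoop/core-site.xml", "w") as f:
        f.write(CORE_SITE)
    with open(HADOOP_HOME + "/etc/hadoop/yarn-site.xml", "w") as f:
        f.write(YARN_SITE)
    # Start the HDFS DataNode and YARN NodeManager (Hadoop 2.x daemon scripts).
    subprocess.check_call([HADOOP_HOME + "/sbin/hadoop-daemon.sh", "start", "datanode"])
    subprocess.check_call([HADOOP_HOME + "/sbin/yarn-daemon.sh", "start", "nodemanager"])

if __name__ == "__main__":
    init_worker()
}}}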
    148 148
    149 149 Participants: