Custom Query (1408 matches)

Results (52 - 54 of 1408)

#237: ViSE GEC6 Demo (Resolution: fixed; Owner: NIDHI; Reporter: David Irwin)
Description
  1. Brief demo description (A few sentences suitable for a GEC6 attendee information page).

We will be presenting ViSE's user web portal, which is integrated with Orca and its remote Clearinghouse run at RENCI in Chapel Hill, NC. The portal provides a simple interface for users to request access to ViSE nodes and their sensors from the GENI Clearinghouse.

  2. List of equipment that will need AC connections (e.g. laptop, switch hardware prototype, monitor). (Just put in the number of connections needed if your demo description already lists the equipment)

We will need at least 4 A/C connections and an Internet connection. We also need a projector; we will bring our own portable one.

  3. Number of wired network connections (include required bandwidth if significant)

1 wired connection.

  4. Number of wireless network connections (include required bandwidth if significant)

1 wireless connection.

  5. Projector (y/n) (Bring your own projectors if feasible)

No. We will bring our own projector.

  6. Number of posters

1 poster describing ViSE.

  7. Number of static addresses

None required.

  8. Description of any special requests (e.g. VLANs to I2 backbone)

No special requests.

#137: ViSE GEC5 Demonstration (Resolution: fixed; Owner: hdempsey@geni.net; Reporter: David Irwin)
Description

The ViSE project is demonstrating the integration of virtualized sensors with the latest release of the Orca control framework as well as an initial integration with the DOME-GENI testbed. ViSE and DOME will both be deployed on geni.cs.umass.edu, using the same instance of the Orca control framework/clearinghouse. ViSE will demonstrate its sensor controller remotely (on geni.cs.umass.edu) to show the multiplexing of a sensor (in this case a PTZ camera) between two competing experiments. Additionally, we will demonstrate our progress toward radar virtualization.

The demonstration will require two A/C outlets, an Internet connection (to interact with geni.cs.umass.edu), and a projector screen. We will supply the projector.

#49: ViSE GEC4 Demonstration (Resolution: fixed; Owner: David Irwin; Reporter: David Irwin)
Description

The ViSE project will demonstrate sensor control using the Orca control framework, sensor scheduling, and our progress toward sensor virtualization. The specific demonstration described below may be changed or simplified depending on our progress near the end of March. The primary demonstration is #1 below; #2 and #3 will depend on our progress at that point.

Requirements: We will require 2 A/C outlets to power our two sensors and two laptops (we will bring our own power strip). We will need table space for our equipment, and tack space if a poster is required for GEC4. We will also need wireless access for #3 below. We would like to be placed next to the Orca/BEN project from RENCI/Duke. We would prefer a large monitor to plug our laptop(s) into, as we will not be transporting a monitor ourselves, but this is not required.

  1. We will bring a Pan-Tilt-Zoom (PTZ) video camera and a DavisPro weather station to GEC4 to act as example sensors (our radars are too large to transport to Miami). We will conduct the demonstration on a single laptop with the PTZ video camera connected via Ethernet. The laptop will be a "GENI in a bottle": it will run 4 VMware virtual machines, one hosting a GENI Aggregate Manager (an Orca site authority) and GENI Clearinghouse (an Orca broker), two hosting GENI Experiments, and one acting as the GENI component that the Aggregate Manager controls. The GENI component VM will itself run an instance of the Xen virtual machine monitor. The Aggregate Manager will be empowered to create slivers as Xen virtual machines on the GENI component (since we have only a single component in the demonstration, these slivers correspond to a slice). The Experiments will communicate with the Clearinghouse and Aggregate Manager over SOAP.

Importantly, the GENI component VM will control access to the PTZ camera by attaching and detaching virtual network interfaces to and from experiment VMs. Each experiment will request a slice composed of a single Xen VM sliver with a reserved proportion of CPU, memory, bandwidth, etc. The experiments will then compete for control of, and access to, the PTZ camera by requesting a lease for it from the Clearinghouse and directing the Aggregate Manager to attach it (in the form of a virtual network interface) to their sliver; only a single Experiment can control the camera at a time, so the Clearinghouse must schedule access to it accordingly (a sketch of this handoff appears after the list below). We will use the default Orca web portal to display the process, and the PTZ camera's own web portal, viewed from both experiments, to show the status of the camera.

  2. We will also show our progress on true sensor virtualization in the Xen virtual machine monitor. In the case of the camera, "virtualization" takes the form of permitting full access to the camera by one, and only one, VM through its virtual network interface. We are currently integrating virtualized sensing devices into Xen's device driver framework, and will show our progress toward "virtualizing" a DavisPro weather station that physically connects over USB and appears inside the VM as a virtual serial port. Our initial goal along this thread is to have the DavisPro software run inside a Xen VM on top of a virtual serial driver that "passes through" requests to the physical device (see the serial sketch after this list). This is the first step toward our end-of-year milestones for sensor slivering.
  3. While the above demonstrations will be local, we will also have our testbed in Massachusetts, running the Orca software, available remotely, as per our milestone in February. We also hope to demonstrate the basic capabilities of our weather radar remotely, using a standard reflectivity map.
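
The exclusive-lease handoff in demonstration #1 can be summarized with a small sketch. The following Python is illustrative only: the class and method names (Clearinghouse, AggregateManager, request_lease, attach_camera_vif) are our own placeholders, not Orca's actual API, which is Java-based and invoked over SOAP.

    # Hypothetical sketch of the exclusive camera lease from demonstration #1.
    # All names are placeholders, not Orca's real interfaces.

    class Clearinghouse:
        """Grants at most one active lease on the camera at a time."""
        def __init__(self):
            self.holder = None

        def request_lease(self, experiment):
            if self.holder is None:
                self.holder = experiment
                return True        # lease granted
            return False           # camera busy; the experiment must wait

        def release_lease(self, experiment):
            if self.holder is experiment:
                self.holder = None

    class AggregateManager:
        """Attaches or detaches the camera's virtual NIC on a sliver (a Xen VM)."""
        def attach_camera_vif(self, sliver):
            print("attaching camera vif to", sliver)

        def detach_camera_vif(self, sliver):
            print("detaching camera vif from", sliver)

    ch, am = Clearinghouse(), AggregateManager()
    assert ch.request_lease("experiment-A")        # A wins the lease
    am.attach_camera_vif("sliver-A")
    assert not ch.request_lease("experiment-B")    # B is denied while A holds it
    ch.release_lease("experiment-A")
    am.detach_camera_vif("sliver-A")
    assert ch.request_lease("experiment-B")        # B acquires it after release
    am.attach_camera_vif("sliver-B")

Serializing camera ownership at the Clearinghouse, rather than at the component, is what lets the attach/detach at the Aggregate Manager stay a simple mechanical step.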
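The guest-side view of demonstration #2 can be sketched as well. Assuming the passthrough driver exposes the weather station inside the Xen VM as a serial device, the guest software would poll it roughly as follows, using the pyserial package; the device path /dev/ttyS1, the 19200 baud rate, and the LOOP polling command are assumptions about the DavisPro console, not verified values.

    # Guest-side sketch for demonstration #2: reading the DavisPro station
    # through the virtual serial port exposed by the passthrough driver.
    # /dev/ttyS1, 19200 baud, and the "LOOP" command are assumptions.
    import serial  # pyserial

    port = serial.Serial("/dev/ttyS1", baudrate=19200, timeout=2)
    port.write(b"\n")           # wake the console
    port.readline()             # discard the wake-up response
    port.write(b"LOOP 1\n")     # request a single sensor record
    record = port.read(99)      # read one reply packet
    print("read %d bytes of sensor data" % len(record))
    port.close()

Because the virtual serial driver simply forwards these reads and writes to the physical device, the unmodified DavisPro software should run inside the VM without knowing it is virtualized.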