[[PageOutline]]
= University of Florida (UFL) ExoGENI Confirmation Tests =
For details about the tests in this page, see the '''[wiki:GENIRacksHome/ExogeniRacks/SiteConfirmationTests ExoGENI Confirmation Tests]''' page.[[BR]]
For site status see the '''[wiki:GENIRacksHome/ExogeniRacks/ConfirmationTestStatus ExoGENI New Site Confirmation Tests Status]''' page.
__Note:__ Omni nicknames for the site aggregates used in these tests are:
{{{
ufl-eg=urn:publicid:IDN+exogeni.net:uflvmsite+authority+am,https://ufl-hn.exogeni.net:11443/orca/xmlrpc
ufl-eg-of=urn:publicid:IDN+openflow:foam:ufl-hn.exogeni.net+authority+am,https://ufl-hn.exogeni.net:3626/foam/gapi/2
eg-sm=urn:publicid:IDN+exogeni.net+authority+am,https://geni.renci.org:11443/orca/xmlrpc
}}}
== EG-CT-1 - Access to New Site VM resources ==
First, determine the {{{orca_version}}} running at this aggregate:
{{{
$ omni.py -a ufl-eg getversion
09:25:44 INFO omni: Downloaded latest `agg_nick_cache` from 'http://trac.gpolab.bbn.com/gcf/raw-attachment/wiki/Omni/agg_nick_cache' and copied to '/home/lnevers/.gcf/agg_nick_cache'.
09:25:44 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
09:25:44 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
09:25:44 INFO omni: Setting option useSliceMembers based on omni_config setting
09:25:44 INFO omni: Using control framework portal
09:25:44 INFO omni: Member Authority is https://ch.geni.net/MA (from config)
09:25:44 INFO omni: Slice Authority is https://ch.geni.net/SA (from config)
09:25:44 INFO omni: Substituting AM nickname ufl-eg with URL https://ufl-hn.exogeni.net:11443/orca/xmlrpc, URN urn:publicid:IDN+exogeni.net:uflvmsite+authority+am
09:25:45 INFO omni: AM ufl-eg URN: urn:publicid:IDN+exogeni.net:uflvmsite+authority+am (url: https://ufl-hn.exogeni.net:11443/orca/xmlrpc) has version:
09:25:45 INFO omni: { 'geni_ad_rspec_versions': [ { 'extensions': [ 'http://hpn.east.isi.edu/rspec/ext/stitch/0.1/stitch-schema.xsd',
'http://www.protogeni.net/resources/rspec/ext/emulab/1/ptop_extension.xsd'],
'namespace': 'http://www.geni.net/resources/rspec/3',
'schema': 'http://www.geni.net/resources/rspec/3/ad.xsd',
'type': 'GENI',
'version': '3'}],
'geni_am_type': 'orca',
'geni_api': 2,
'geni_api_versions': { '2': 'https://ufl-hn.exogeni.net:11443/orca/xmlrpc/geni'},
'geni_request_rspec_versions': [ { 'extensions': [ 'http://www.geni.net/resources/rspec/ext/shared-vlan/1',
'http://www.geni.net/resources/rspec/ext/postBootScript/1'],
'namespace': 'http://www.geni.net/resources/rspec/3',
'schema': 'http://www.geni.net/resources/rspec/3/request.xsd',
'type': 'GENI',
'version': '3'}],
'orca_version': 'ORCA Dungeness: v.4.0-SNAPSHOT.build-5996'}
09:25:45 INFO omni: ------------------------------------------------------
09:25:45 INFO omni: Completed getversion:
Args: getversion
Result Summary:
Got version for ufl-eg
09:25:45 INFO omni: ======================================================
}}}
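When comparing racks it helps to reduce the {{{orca_version}}} string to its build number. A minimal shell sketch, with the version string copied from the getversion output above:

```shell
# Illustrative only: strip everything up to and including "build-"
# from the orca_version string reported by getversion.
ver='ORCA Dungeness: v.4.0-SNAPSHOT.build-5996'
build=${ver##*build-}
echo "$build"   # prints 5996
```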
Comparing with other racks, the following versions were found:
|| Rack || Version ||
|| BBN || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-6124' ||
|| RENCI || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-5996' ||
|| UFL || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-5949' ||
|| ExoSM || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-5996' ||
|| FIU || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-6124' ||
|| UH || 'ORCA Dungeness: v.4.0-SNAPSHOT.build-6241' ||
Created a slice:
{{{
$ omni.py createslice EG-CT-1
12:07:19 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
12:07:19 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
12:07:19 INFO omni: Using control framework portal
12:07:21 INFO omni: Created slice with Name EG-CT-1, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-1, Expiration 2013-11-29 17:07:20
12:07:21 INFO omni: ------------------------------------------------------------
12:07:21 INFO omni: Completed createslice:
Args: createslice EG-CT-1
Result Summary: Created slice with Name EG-CT-1, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-1, Expiration 2013-11-29 17:07:20
12:07:21 INFO omni: ============================================================
}}}
Created a sliver with four VMs using the RSpec [http://groups.geni.net/geni/browser/trunk/GENIRacks/ExoGENI/Spiral5/RSpecs/ConfirmationTests/UFL/EG-CT-1-ufl.rspec EG-CT-1-ufl.rspec]
{{{
$ omni.py createsliver -a ufl-eg EG-CT-1 EG-CT-1-ufl.rspec
12:07:21 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
12:07:21 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
12:07:21 INFO omni: Using control framework portal
12:07:21 INFO omni: Substituting AM nickname ufl-eg with URL https://ufl-hn.exogeni.net:11443/orca/xmlrpc, URN urn:publicid:IDN+exogeni.net:uflvmsite+authority+am
12:07:23 INFO omni: Slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-1 expires on 2013-11-29 17:07:20 UTC
12:07:23 INFO omni: Creating sliver(s) from rspec file EG-CT-1-ufl.rspec for slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-1
12:07:28 INFO omni: Got return from CreateSliver for slice EG-CT-1 at https://ufl-hn.exogeni.net:11443/orca/xmlrpc:
12:07:28 INFO omni:
12:07:28 INFO omni:
12:07:28 INFO omni:
12:07:28 INFO omni: ------------------------------------------------------------
12:07:28 INFO omni: Completed createsliver:
Args: createsliver EG-CT-1 EG-CT-1-ufl.rspec
Result Summary: Got Reserved resources RSpec from exogeni-net-uflvmsite
12:07:28 INFO omni: ============================================================
}}}
When the sliver was ready, determined the login information:
{{{
$ readyToLogin.py -a ufl-eg EG-CT-1
<....>
================================================================================
LOGIN INFO for AM: https://ufl-hn.exogeni.net:11443/orca/xmlrpc
================================================================================
For more login info, see the section entitled:
'Providing a private key to ssh' in 'readyToLogin.py -h'
VM-4's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-4 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.29
VM-1's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-1 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.20
VM-2's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-2 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.24
VM-3's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-3 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.21
}}}
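The ssh command lines can be pulled out of saved readyToLogin.py output with a simple pattern match. A sketch, using a fragment of the output above inlined as sample data (the {{{/tmp}}} path is illustrative):

```shell
# Sketch: extract the ssh login lines from saved readyToLogin.py output.
cat > /tmp/login-info.txt <<'EOF'
VM-1's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-1 using:
  ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.20
EOF
logins=$(grep -o 'ssh -i [^ ]* [^ ]*@[0-9.]*' /tmp/login-info.txt)
echo "$logins"
```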
=== Measurements ===
Log in to two of the nodes and collect iperf and ping statistics. All measurements are collected over 60 seconds:
''Collected: 2013-11-22''
'''Iperf ExoGENI VM-2 to VM-1 (TCP) - TCP window size: 16.0 KB '''
__One Client__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 18.9 GBytes 2.70 Gbits/sec
}}}
__Five Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 4] 0.0-60.0 sec 7.76 GBytes 1.11 Gbits/sec
[ 5] 0.0-60.0 sec 7.73 GBytes 1.11 Gbits/sec
[ 6] 0.0-60.0 sec 7.56 GBytes 1.08 Gbits/sec
[ 3] 0.0-60.0 sec 7.60 GBytes 1.09 Gbits/sec
[ 7] 0.0-60.0 sec 7.81 GBytes 1.12 Gbits/sec
[SUM] 0.0-60.0 sec 38.5 GBytes 5.51 Gbits/sec
}}}
__Ten Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 4] 0.0-60.0 sec 4.13 GBytes 591 Mbits/sec
[ 5] 0.0-60.0 sec 3.98 GBytes 569 Mbits/sec
[ 6] 0.0-60.0 sec 4.03 GBytes 577 Mbits/sec
[ 8] 0.0-60.0 sec 4.13 GBytes 592 Mbits/sec
[ 9] 0.0-60.0 sec 4.00 GBytes 573 Mbits/sec
[ 11] 0.0-60.0 sec 4.08 GBytes 585 Mbits/sec
[ 3] 0.0-60.0 sec 4.01 GBytes 574 Mbits/sec
[ 7] 0.0-60.0 sec 4.03 GBytes 577 Mbits/sec
[ 10] 0.0-60.0 sec 4.14 GBytes 592 Mbits/sec
[ 12] 0.0-60.0 sec 3.98 GBytes 570 Mbits/sec
[SUM] 0.0-60.0 sec 40.5 GBytes 5.80 Gbits/sec
}}}
''Collected: 2014-03-31''
'''Iperf ExoGENI VM-2 to VM-1 (UDP) - 1470 byte datagrams & UDP buffer size: 136 KByte '''
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 7.16 MBytes 1000 Kbits/sec
[ 3] Sent 5104 datagrams
[ 3] Server Report:
[ 3] 0.0-60.2 sec 7.16 MBytes 998 Kbits/sec 13.541 ms 0/ 5104 (0%)
}}}
'''Ping ExoGENI VM-2 to the VM-1 '''
{{{
60 packets transmitted, 60 received, 0% packet loss, time 58998ms
rtt min/avg/max/mdev = 0.269/10.396/598.553/76.571 ms
}}}
Note: Re-ran the UDP test on 04/18/2014; the result was 810 Mbits/sec.
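The iperf and ping invocations themselves are not captured in the logs. The sketch below lists a plausible command set (assumed iperf 2.x syntax; the destination address placeholder is illustrative), plus a sanity check of the reported UDP datagram count:

```shell
# Assumed commands (not from the logs): on VM-1 run "iperf -s" (TCP)
# or "iperf -s -u" (UDP); then on VM-2:
#   iperf -c <VM-1 addr> -t 60          # one client
#   iperf -c <VM-1 addr> -t 60 -P 5     # five parallel clients
#   iperf -c <VM-1 addr> -t 60 -P 10    # ten parallel clients
#   iperf -c <VM-1 addr> -u -t 60       # UDP at the 1 Mbit/s default
#   ping -c 60 <VM-1 addr>
# Sanity check: 60 s at 1 Mbit/s with 1470-byte datagrams is about
# 10^6 * 60 / (1470 * 8) datagrams, close to "Sent 5104 datagrams".
datagrams=$(( 1000000 * 60 / (1470 * 8) ))
echo "$datagrams"   # prints 5102
```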
== EG-CT-2 - Access to New Site bare metal and VM resources ==
Create a slice:
{{{
$ omni.py createslice EG-CT-2
09:49:04 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
09:49:04 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
09:49:04 INFO omni: Using control framework portal
09:49:05 INFO omni: Created slice with Name EG-CT-2, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-2, Expiration 2013-12-09 13:27:57
09:49:05 INFO omni: ------------------------------------------------------------
09:49:05 INFO omni: Completed createslice:
Args: createslice EG-CT-2
Result Summary: Created slice with Name EG-CT-2, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-2, Expiration 2013-12-09 13:27:57
09:49:05 INFO omni: ============================================================
}}}
Create a sliver with one VM and one bare metal node using the RSpec [http://groups.geni.net/geni/browser/trunk/GENIRacks/ExoGENI/Spiral5/RSpecs/ConfirmationTests/UFL/EG-CT-2-ufl.rspec EG-CT-2-ufl.rspec]. Note: bare metal nodes are only available via the ExoSM.
{{{
$ omni.py createsliver EG-CT-2 -a eg-sm ./EG-CT-2-ufl.rspec
09:58:21 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
09:58:21 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
09:58:21 INFO omni: Using control framework portal
09:58:21 INFO omni: Substituting AM nickname eg-sm with URL https://geni.renci.org:11443/orca/xmlrpc, URN urn:publicid:IDN+exogeni.net+authority+am
09:58:23 INFO omni: Slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-2 expires on 2013-12-09 13:27:57 UTC
09:58:23 INFO omni: Creating sliver(s) from rspec file ./EG-CT-2-ufl.rspec for slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-2
09:58:30 INFO omni: Got return from CreateSliver for slice EG-CT-2 at https://geni.renci.org:11443/orca/xmlrpc:
09:58:30 INFO omni:
09:58:30 INFO omni:
09:58:30 INFO omni:
09:58:30 INFO omni: ------------------------------------------------------------
09:58:30 INFO omni: Completed createsliver:
Args: createsliver EG-CT-2 ./EG-CT-2-ufl.rspec
Result Summary: Got Reserved resources RSpec from exogeni-net
09:58:30 INFO omni: ============================================================
}}}
When the sliver is ready, check for login information:
{{{
$ readyToLogin.py EG-CT-2 -a eg-sm
<...>
================================================================================
LOGIN INFO for AM: https://geni.renci.org:11443/orca/xmlrpc
================================================================================
For more login info, see the section entitled:
'Providing a private key to ssh' in 'readyToLogin.py -h'
BM-1's geni_status is: ready (am_status:ready)
User lnevers logs in to BM-1 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.125
VM-1's geni_status is: ready (am_status:ready)
User lnevers logs in to VM-1 using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.32
}}}
=== Measurements ===
''Collected: 2013-12-02''
'''Iperf ExoGENI BM-1 to VM-1 (TCP) - TCP window size: 23.2 KByte (default) '''
__One Client__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 33.6 GBytes 4.82 Gbits/sec
}}}
__Five Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 7] 0.0-60.0 sec 9.05 GBytes 1.30 Gbits/sec
[ 6] 0.0-60.0 sec 8.85 GBytes 1.27 Gbits/sec
[ 3] 0.0-60.0 sec 9.04 GBytes 1.29 Gbits/sec
[ 4] 0.0-60.0 sec 9.00 GBytes 1.29 Gbits/sec
[ 5] 0.0-60.0 sec 8.94 GBytes 1.28 Gbits/sec
[SUM] 0.0-60.0 sec 44.9 GBytes 6.42 Gbits/sec
}}}
__Ten Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 8] 0.0-60.0 sec 4.68 GBytes 671 Mbits/sec
[ 7] 0.0-60.0 sec 4.64 GBytes 664 Mbits/sec
[ 5] 0.0-60.0 sec 4.61 GBytes 659 Mbits/sec
[ 9] 0.0-60.0 sec 4.60 GBytes 659 Mbits/sec
[ 6] 0.0-60.0 sec 4.64 GBytes 664 Mbits/sec
[ 10] 0.0-60.0 sec 4.63 GBytes 663 Mbits/sec
[ 12] 0.0-60.0 sec 4.63 GBytes 663 Mbits/sec
[ 11] 0.0-60.0 sec 4.63 GBytes 662 Mbits/sec
[ 4] 0.0-60.0 sec 4.68 GBytes 670 Mbits/sec
[ 3] 0.0-60.0 sec 4.59 GBytes 657 Mbits/sec
[SUM] 0.0-60.0 sec 46.3 GBytes 6.63 Gbits/sec
}}}
''Collected: 2014-03-31''
'''Iperf ExoGENI BM-1 to VM-1 (UDP) - UDP buffer size: 224 KByte (default) '''
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 21.5 MBytes 3.00 Mbits/sec
[ 3] Sent 15308 datagrams
[ 3] Server Report:
[ 3] 0.0-60.9 sec 21.5 MBytes 2.96 Mbits/sec 1.884 ms 626/15308 (4.1%)
[ 3] 0.0-60.9 sec 626 datagrams received out-of-order
}}}
'''Ping from ExoGENI BM-1 to VM-1 '''
{{{
60 packets transmitted, 60 received, 0% packet loss, time 58999ms
rtt min/avg/max/mdev = 0.179/0.284/0.434/0.056 ms
}}}
''Collected: 2013-12-02''
'''Iperf ExoGENI VM-1 to BM-1 (TCP) - TCP window size: 16.0 KB '''
__One Client__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 45.5 GBytes 6.51 Gbits/sec
}}}
__Five Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 4] 0.0-60.0 sec 9.14 GBytes 1.31 Gbits/sec
[ 5] 0.0-60.0 sec 9.00 GBytes 1.29 Gbits/sec
[ 6] 0.0-60.0 sec 9.44 GBytes 1.35 Gbits/sec
[ 3] 0.0-60.0 sec 9.15 GBytes 1.31 Gbits/sec
[ 7] 0.0-60.0 sec 9.46 GBytes 1.35 Gbits/sec
[SUM] 0.0-60.0 sec 46.2 GBytes 6.61 Gbits/sec
}}}
__Ten Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 5] 0.0-60.0 sec 5.24 GBytes 750 Mbits/sec
[ 6] 0.0-60.0 sec 5.14 GBytes 736 Mbits/sec
[ 7] 0.0-60.0 sec 5.35 GBytes 765 Mbits/sec
[ 3] 0.0-60.0 sec 5.05 GBytes 722 Mbits/sec
[ 8] 0.0-60.0 sec 5.23 GBytes 749 Mbits/sec
[ 9] 0.0-60.0 sec 5.14 GBytes 736 Mbits/sec
[ 10] 0.0-60.0 sec 5.30 GBytes 758 Mbits/sec
[ 11] 0.0-60.0 sec 5.29 GBytes 757 Mbits/sec
[ 4] 0.0-60.0 sec 5.22 GBytes 747 Mbits/sec
[ 12] 0.0-60.0 sec 5.32 GBytes 761 Mbits/sec
[SUM] 0.0-60.0 sec 52.3 GBytes 7.48 Gbits/sec
}}}
''Collected: 2014-03-31''
'''Iperf ExoGENI VM-1 to BM-1 (UDP) - UDP buffer size: 122 KByte (default) '''
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 21.5 MBytes 3.00 Mbits/sec
[ 3] Sent 15308 datagrams
[ 3] Server Report:
[ 3] 0.0-61.1 sec 21.3 MBytes 2.92 Mbits/sec 2.663 ms 120/15307 (0.78%)
[ 3] 0.0-61.1 sec 325 datagrams received out-of-order
}}}
'''Ping from ExoGENI VM-1 to BM-1 '''
{{{
60 packets transmitted, 60 received, 0% packet loss, time 58999ms
rtt min/avg/max/mdev = 0.201/0.310/0.426/0.057 ms
}}}
Note: Re-ran the UDP test on 04/18/2014; the result was 810 Mbits/sec.
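Ping packs min/avg/max/mdev into a single summary field; a small sketch of extracting the average RTT, with the line copied from the ping output above:

```shell
# Split ping's rtt summary on '=' and '/'; with that split the
# average RTT lands in field 6.
line='rtt min/avg/max/mdev = 0.201/0.310/0.426/0.057 ms'
avg=$(echo "$line" | awk -F'[=/]' '{print $6}')
echo "$avg"   # prints 0.310
```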
== EG-CT-3 - Multiple sites experiment ==
The GPO and UFL racks are used in this experiment. First create a slice:
{{{
$ omni.py createslice EG-CT-3-ufl
10:10:08 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
10:10:08 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
10:10:08 INFO omni: Setting option useSliceMembers based on omni_config setting
10:10:08 INFO omni: Using control framework portal
10:10:08 INFO omni: Member Authority is https://ch.geni.net/MA (from config)
10:10:08 INFO omni: Slice Authority is https://ch.geni.net/SA (from config)
10:10:08 INFO omni: Slice EG-CT-3-ufl already existed - returning existing slice
10:10:08 INFO omni: Created slice with Name EG-CT-3-ufl, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-3-ufl, Expiration 2014-04-03 13:29:14
10:10:08 INFO omni: ------------------------------------------------------
10:10:08 INFO omni: Completed createslice:
Args: createslice EG-CT-3-ufl
Result Summary: Created slice with Name EG-CT-3-ufl, URN urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-3-ufl, Expiration 2014-04-03 13:29:14
10:10:08 INFO omni: ======================================================
}}}
Then create a sliver via the ExoSM that includes both GPO and UFL VMs, using the RSpec [http://groups.geni.net/geni/browser/trunk/GENIRacks/ExoGENI/Spiral5/RSpecs/ConfirmationTests/UFL/EG-CT-3-ufl.rspec EG-CT-3-ufl.rspec]
{{{
$ omni.py createsliver EG-CT-3-ufl -a eg-sm ./EG-CT-3-ufl.rspec
10:12:22 INFO omni: Loading agg_nick_cache file '/home/lnevers/.gcf/agg_nick_cache'
10:12:22 INFO omni: Loading config file /home/lnevers/.gcf/omni_config
10:12:22 INFO omni: Setting option useSliceMembers based on omni_config setting
10:12:22 INFO omni: Using control framework portal
10:12:22 INFO omni: Member Authority is https://ch.geni.net/MA (from config)
10:12:22 INFO omni: Slice Authority is https://ch.geni.net/SA (from config)
10:12:22 INFO omni: Substituting AM nickname eg-sm with URL https://geni.renci.org:11443/orca/xmlrpc, URN urn:publicid:IDN+exogeni.net+authority+am
10:12:22 INFO omni: Slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-3-ufl expires on 2014-04-03 13:29:14 UTC
10:12:22 INFO omni: Creating sliver(s) from rspec file ./EG-CT-3-ufl.rspec for slice urn:publicid:IDN+ch.geni.net:ln-prj+slice+EG-CT-3-ufl
10:12:57 INFO omni: Got return from CreateSliver for slice EG-CT-3-ufl at eg-sm:
10:12:57 INFO omni:
10:12:57 INFO omni:
10:12:57 INFO omni:
<tag-stripped stitching extension of the manifest RSpec (l2sc ethernet hops, VLAN range 1-4096) not reproduced>
10:12:57 INFO omni: ------------------------------------------------------
10:12:57 INFO omni: Completed createsliver:
Args: createsliver EG-CT-3-ufl ./EG-CT-3-ufl.rspec
Result Summary: Got Reserved resources RSpec from exogeni-net
10:12:57 INFO omni: ======================================================
}}}
Determine login information for allocated nodes:
{{{
$ readyToLogin.py EG-CT-3-ufl -a eg-sm
...
gpo's geni_status is: ready (am_status:ready)
User lnevers logs in to gpo using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@192.1.242.47
ufl's geni_status is: ready (am_status:ready)
User lnevers logs in to ufl using:
ssh -i /home/lnevers/.ssh/geni_cert_portal_key lnevers@128.227.10.30
}}}
=== Measurements ===
''Collected: 2014-03-27''
'''Iperf ExoGENI GPO VM to UFL VM (TCP) - TCP window size: 16.0 KB '''
__One Client__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 196 MBytes 27.5 Mbits/sec
}}}
__Five Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 4] 0.0-60.0 sec 199 MBytes 27.9 Mbits/sec
[ 5] 0.0-60.0 sec 198 MBytes 27.6 Mbits/sec
[ 3] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 7] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 6] 0.0-60.0 sec 198 MBytes 27.7 Mbits/sec
[SUM] 0.0-60.0 sec 993 MBytes 139 Mbits/sec
}}}
__Ten Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 8] 0.0-60.0 sec 198 MBytes 27.7 Mbits/sec
[ 4] 0.0-60.0 sec 200 MBytes 27.9 Mbits/sec
[ 12] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 5] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 9] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 10] 0.0-60.0 sec 196 MBytes 27.4 Mbits/sec
[ 11] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 6] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 7] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[SUM] 0.0-60.0 sec 1.94 GBytes 278 Mbits/sec
}}}
''Collected: 2014-03-31''
'''Iperf ExoGENI GPO VM to UFL VM (UDP) - 1470 byte datagrams & UDP buffer size: 122 KByte '''
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 28.6 MBytes 4.00 Mbits/sec
[ 3] Sent 20410 datagrams
[ 3] Server Report:
[ 3] 0.0-60.0 sec 21.1 MBytes 2.95 Mbits/sec 0.044 ms 5371/20410 (26%)
[ 3] 0.0-60.0 sec 388 datagrams received out-of-order
}}}
'''Ping from GPO VM to UFL VM '''
{{{
60 packets transmitted, 60 received, 0% packet loss, time 59089ms
rtt min/avg/max/mdev = 34.190/34.412/34.814/0.202 ms
}}}
''Collected: 2014-03-27''
'''Iperf ExoGENI UFL VM to GPO VM (TCP) - TCP window size: 16.0 KB '''
__One Client__
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 196 MBytes 27.5 Mbits/sec
}}}
__Five Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 7] 0.0-60.0 sec 181 MBytes 25.3 Mbits/sec
[ 6] 0.0-60.0 sec 185 MBytes 25.8 Mbits/sec
[ 4] 0.0-60.0 sec 173 MBytes 24.1 Mbits/sec
[ 3] 0.0-60.0 sec 179 MBytes 25.1 Mbits/sec
[ 5] 0.0-60.0 sec 202 MBytes 28.2 Mbits/sec
[SUM] 0.0-60.0 sec 920 MBytes 129 Mbits/sec
}}}
__Ten Clients__
{{{
[ ID] Interval Transfer Bandwidth
[ 4] 0.0-60.0 sec 111 MBytes 15.6 Mbits/sec
[ 7] 0.0-60.0 sec 128 MBytes 17.9 Mbits/sec
[ 8] 0.0-60.0 sec 199 MBytes 27.8 Mbits/sec
[ 3] 0.0-60.0 sec 197 MBytes 27.5 Mbits/sec
[ 5] 0.0-60.0 sec 109 MBytes 15.3 Mbits/sec
[ 9] 0.0-60.0 sec 111 MBytes 15.5 Mbits/sec
[ 12] 0.0-60.0 sec 197 MBytes 27.5 Mbits/sec
[ 6] 0.0-60.0 sec 117 MBytes 16.4 Mbits/sec
[ 10] 0.0-60.0 sec 122 MBytes 17.0 Mbits/sec
[ 11] 0.0-60.1 sec 148 MBytes 20.7 Mbits/sec
[SUM] 0.0-60.1 sec 1.40 GBytes 201 Mbits/sec
}}}
''Collected: 2014-03-31''
'''Iperf ExoGENI UFL VM to GPO VM (UDP) - 1470 byte datagrams & UDP buffer size: 122 KByte '''
{{{
[ ID] Interval Transfer Bandwidth
[ 3] 0.0-60.0 sec 28.6 MBytes 4.00 Mbits/sec
[ 3] Sent 20410 datagrams
[ 3] Server Report:
[ 3] 0.0-62.1 sec 23.3 MBytes 3.15 Mbits/sec 2.771 ms 3757/20409 (18%)
[ 3] 0.0-62.1 sec 601 datagrams received out-of-order
}}}
'''Ping from UFL VM to GPO VM '''
{{{
60 packets transmitted, 60 received, 0% packet loss, time 59086ms
rtt min/avg/max/mdev = 34.218/34.403/34.696/0.214 ms
}}}
Note: Re-ran the UDP test on 04/18/2014; the result was 801 Mbits/sec.
== EG-CT-4 - Multiple sites !OpenFlow experiment and interoperability ==
No meso-scale resources are available; see the status of the UFL Stitching !OpenFlow test [wiki:GeniNetworkStitchingConfirmationTestStatus/UFL#IG-ST-6NewSiteOpenFlowtopology IG-ST-6].
== EG-CT-5 - Experiment Monitoring ==
Reviewed the content of the GMOC monitoring page for [https://gmoc-db.grnoc.iu.edu/protected-openid/index.pl?method=aggregates aggregates] and found the UFL ExoGENI rack [https://gmoc-db.grnoc.iu.edu/protected-openid/index.pl?method=aggregates&search=ufl aggregates], but no data could be viewed in the GMOC GUI.
[[Image(EG-CT-5-ufl-aggregate.jpg)]]
Verified that the UFL compute resources aggregate shows up in the list of aggregates and provides Aggregate Name, Type, Last Update, Version, POP, and Organization:
[[Image(EG-CT-5-ufl-aggregate-detail.jpg)]]
Active slivers:
[[Image(EG-CT-5-ufl-sliver.jpg, 50%)]]
Not reported at this time.
List of resources:
[[Image(EG-CT-5-ufl-resources.jpg)]]
Aggregate measurements:
[[Image(EG-CT-5-ufl-measure.jpg)]]
!OpenFlow FOAM Aggregate:
[[Image(EG-CT-5-ufl-foam.jpg)]]
!OpenFlow Slivers:
[[Image(EG-CT-5-ufl-OFsliver.jpg)]]
FOAM aggregate resources:
[[Image(EG-CT-5-ufl-OFresources.jpg)]]
FOAM Aggregate measurements:
[[Image(EG-CT-5-ufl-OFmeasure.jpg)]]
FOAM Sliver Statistics:
[[Image(EG-CT-5-ufl-OFmeasureSliver.jpg)]]
== EG-CT-6 - Administrative Tests ==
Administrator accounts on an ExoGENI rack are documented at https://wiki.exogeni.net/doku.php?id=public:operators:start, with https://wiki.exogeni.net/doku.php?id=public:operators:start#authentication_authorization providing insight into account creation and usage.
Using the requested account, accessed the rack head node and verified root access and group membership:
{{{
LNM:~$ ssh ufl-hn.exogeni.net
lnevers@ufl-hn.exogeni.net's password:
Last login: Wed Jul 16 16:57:06 2014 from 128.89.254.124
|-----------------------------------------------------------------|
| ____ ____ ____ ____ ____ ____ ____ |
| ||E |||x |||o |||G |||E |||N |||I || |
| ||__|||__|||__|||__|||__|||__|||__|| |
| |/__\|/__\|/__\|/__\|/__\|/__\|/__\| |
| |
|-----------------------------------------------------------------|
[lnevers@ufl-hn ~]$ sudo whoami
[sudo] password for lnevers:
root
[lnevers@ufl-hn ~]$ id
uid=2107(lnevers) gid=2000(nonrenci) groups=2000(nonrenci),2500(fiuadmins),2501(uhadmins),2507(ufladmins),9510(bbnadmins)
[lnevers@ufl-hn ~]$
}}}
From the head node, verified login and administrative access to each of the worker nodes that supply VMs. For each worker node, the following was executed:
{{{
[lnevers@ufl-hn ~]$ for i in 1 2 3 4 5 6 7 8 9 10; do ssh -t ufl-w$i "sudo whoami; uname -r"; done
lnevers@ufl-w1's password:
Could not chdir to home directory /home/lnevers: No such file or directory
We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:
#1) Respect the privacy of others.
#2) Think before you type.
#3) With great power comes great responsibility.
[sudo] password for lnevers:
root
2.6.32-358.el6.x86_64
Connection to ufl-w1 closed.
<...identical password prompt, sudo lecture, "root", and kernel "2.6.32-358.el6.x86_64" output for ufl-w2 through ufl-w8 omitted...>
ssh: connect to host ufl-w9 port 22: No route to host
ssh: connect to host ufl-w10 port 22: No route to host
}}}
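A variant of the loop above (a sketch, not the command actually run) generates the host list instead of typing it out and bounds ssh's connect time, so unreachable workers such as ufl-w9 and ufl-w10 fail fast rather than hang; the ssh line is commented out so the sketch runs anywhere:

```shell
# Generate ufl-w1 .. ufl-w10 instead of listing them by hand.
hosts=$(seq -f 'ufl-w%g' 1 10)
for h in $hosts; do
  echo "checking $h"
  # ssh -o ConnectTimeout=5 -t "$h" "sudo whoami; uname -r" \
  #   || echo "$h unreachable"
done
```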
Connect to the management switch:
{{{
[lnevers@ufl-hn ~]$ ssh ufl-8052.ufl.xo
Enter radius password:
IBM Networking Operating System RackSwitch G8052.
ufl-8052.ufl.xo>ena
Enable privilege granted.
ufl-8052.ufl.xo#show version
System Information at 15:39:08 Sun Jul 13, 2014
Time zone: America/US/Eastern
Daylight Savings Time Status: Enabled
IBM Networking Operating System RackSwitch G8052
Switch has been up for 290 days, 19 hours, 18 minutes and 39 seconds.
Last boot: 18:01:56 Sat Sep 28, 2013 (power cycle)
MAC address: 74:99:75:cf:c9:00 IP (If 1) address: 192.168.49.50
Hardware Revision: 0
Board Revision: 2
Switch Serial No: Y010CM34F053
Hardware Part No: BAC-00069-00 Spare Part No: BAC-00069-00
Manufacturing date: 13/15
MTM: 7309HC1
ESN: MM08358
Software Version 7.2.2.0 (FLASH image2), active configuration.
Temperature Top: 31 C
Temperature Bottom: 38 C
Temperature Fan Ctrl 88: 34 C
Temperature Fan Ctrl 8a: 67 C
Temperature Fan Ctrl 8c: 67 C
Temperature Phy 0x01: 71 C
Temperature Phy 0x09: 68 C
Temperature Phy 0x11: 73 C
Temperature Phy 0x21: 64 C
Temperature Phy 0x29: 71 C
Temperature Phy 0x31: 65 C
Warning at 55 C and Recover at 80 C
Fan 1 in Module 1: Not Installed
Fan 2 in Module 1: Not Installed
Fan 3 in Module 2: RPM= 7736 PWM= 25 ( 9%) Back-To-Front [J]
Fan 4 in Module 2: RPM= 3176 PWM= 25 ( 9%) Back-To-Front [J]
Fan 5 in Module 3: RPM= 7356 PWM= 25 ( 9%) Back-To-Front [J]
Fan 6 in Module 3: RPM= 3167 PWM= 25 ( 9%) Back-To-Front [J]
Fan 7 in Module 4: RPM= 7356 PWM= 25 ( 9%) Back-To-Front [J]
Fan 8 in Module 4: RPM= 3337 PWM= 25 ( 9%) Back-To-Front [J]
System Fan Airflow: Back-To-Front
Power Supply 1: OK
Power Supply 2: OK
Power Faults: ()
Fan Faults: ()
Service Faults: ()
ufl-8052.ufl.xo#
ufl-8052.ufl.xo#show interface status
------------------------------------------------------------------
Alias Port Speed Duplex Flow Ctrl Link Name
------- ---- ----- -------- --TX-----RX-- ------ ------
1 1 1000 full no no up 1
2 2 1000 full no no up 2
3 3 1000 full no no up 3
4 4 1000 full no no up 4
5 5 1000 full no no up 5
6 6 1000 full no no up 6
7 7 1000 full no no up 7
8 8 1000 full no no up 8
9 9 1000 full no no up 9
10 10 1000 full no no up 10
11 11 1000 full no no up 11
12 12 1000 full no no up 12
13 13 1000 full no no up 13
14 14 1000 full no no up 14
15 15 1000 full no no up 15
16 16 100 half no no up 16
17 17 1000 full no no up Headnode eth4
18 18 1000 full no no up Headnode eth5
19 19 1000 full no no up Headnode eth1
20 20 1000 full no no up Headnode eth0
21 21 1000 full no no up 21
22 22 1000 full no no up 22
23 23 1000 full no no up 23
24 24 1000 full no no up 24
25 25 1000 full no no up 25
26 26 1000 full no no up 26
27 27 1000 full no no up 27
28 28 1000 full no no up 28
29 29 1000 full no no up 29
30 30 1000 full no no up 30
31 31 any any no no down 31
32 32 any any no no down 32
33 33 1000 full no no up 33
34 34 any any no no down 34
35 35 1000 full no no up 35
36 36 any any no no down 36
37 37 1000 full no no up 37
38 38 1000 full no no up 38
39 39 1000 full no no up 39
40 40 1000 full no no up 40
41 41 1000 full no no up 41
42 42 1000 full no no up 42
43 43 1000 full no no up 43
44 44 1000 full no no up 44
45 45 any any no no down 45
46 46 100 full no no up 46
47 47 1000 full no no up 47
48 48 100 full no no up 48
XGE1 49 10000 full no no down XGE1
XGE2 50 10000 full no no down XGE2
XGE3 51 10000 full no no down XGE3
XGE4 52 10000 full no no up XGE4
ufl-8052.ufl.xo# show running-config
Current configuration:
!
version "7.2.2"
switch-type "IBM Networking Operating System RackSwitch G8052"
!
system timezone 145
! America/US/Eastern
system daylight
!
ssh enable
!
access https enable
!
errdisable recovery
no system dhcp
hostname "ufl-8052.ufl.xo"
!
!
no access http enable
no access telnet enable
access user administrator-password "xxx"
!
...many lines of output not shown.....
}}}
Connect to the !OpenFlow switch:
{{{
[lnevers@ufl-hn ~]$ ssh ufl-8264.ufl.xo
Enter radius password:
IBM Networking Operating System RackSwitch G8264.
ufl-8264.ufl.xo>show version
System Information at 13:42:30 Wed Jul 16, 2014
Time zone: America/US/Eastern
Daylight Savings Time Status: Enabled
IBM Networking Operating System RackSwitch G8264
Switch has been up for 287 days, 21 hours, 14 minutes and 25 seconds.
Last boot: 11:30:58 Tue Oct 1, 2013 (power cycle)
MAC address: 74:99:75:d7:69:00 IP (If 126) address: 192.168.110.4
Management Port MAC Address: 74:99:75:d7:69:fe
Management Port IP Address (if 128): 0.0.0.0
Hardware Revision: 0
Hardware Part No: BAC-00065-00
Switch Serial No: Y010CM352232
Manufacturing date: 13/18
MTM Value: 7309-HC3
ESN: 23A8693
Software Version 7.6.1.0 (FLASH image1), active configuration.
Temperature Mother Top: 38 C
Temperature Mother Bottom: 29 C
Temperature Daughter Top: 35 C
Temperature Daughter Bottom: 27 C
Warning at 75 C and Recover at 90 C
Fan 1 in Module 1: RPM= 9800 PWM= 15( 5%) Back-To-Front
Fan 2 in Module 1: RPM= 3157 PWM= 15( 5%) Back-To-Front
Fan 3 in Module 2: RPM= 7438 PWM= 15( 5%) Back-To-Front
Fan 4 in Module 2: RPM= 4703 PWM= 15( 5%) Back-To-Front
Fan 5 in Module 3: RPM= 7438 PWM= 15( 5%) Back-To-Front
Fan 6 in Module 3: RPM= 3202 PWM= 15( 5%) Back-To-Front
Fan 7 in Module 4: RPM= 7267 PWM= 15( 5%) Back-To-Front
Fan 8 in Module 4: RPM= 3229 PWM= 15( 5%) Back-To-Front
System Fan Airflow: Back-To-Front
Power Supply 1: OK
Power Supply 2: OK
Power Faults: ()
Fan Faults: ()
Service Faults: ()
ufl-8264.ufl.xo>
ufl-8264.ufl.xo>show interface status
-----------------------------------------------------------------------
Alias Port Speed Duplex Flow Ctrl Link Description
------- ---- ----- -------- --TX-----RX-- ------ -------------
1 1 40000 full no no down 1
5 5 40000 full no no down 5
9 9 40000 full no no down 9
13 13 40000 full no no down 13
17 17 10000 full no no up 17
18 18 10000 full no no up 18
19 19 10000 full no no up 19
20 20 10000 full no no up 20
21 21 10000 full no no up 21
22 22 10000 full no no up 22
23 23 10000 full no no up 23
24 24 10000 full no no up 24
25 25 10000 full no no down 25
26 26 10000 full no no down 26
27 27 1G/10G full no no down 27
28 28 1G/10G full no no down 28
29 29 1G/10G full no no down 29
30 30 1G/10G full no no down 30
31 31 1G/10G full no no down 31
32 32 1G/10G full no no down 32
33 33 1G/10G full no no down 33
34 34 1G/10G full no no down 34
35 35 1000 full no no up iSCSI storage/date plane
36 36 1000 full no no up iSCSI storage/date plane
37 37 10000 full no no up 37
38 38 10000 full no no up 38
39 39 1G/10G full no no down 39
40 40 1G/10G full no no down 40
41 41 10000 full no no up 41
42 42 10000 full no no up 42
43 43 10000 full no no up 43
44 44 10000 full no no up 44
45 45 10000 full no no up 45
46 46 10000 full no no up 46
47 47 10000 full no no up 47
48 48 10000 full no no up 48
49 49 10000 full no no down 49
50 50 10000 full no no down 50
51 51 1G/10G full no no down 51
52 52 1G/10G full no no down 52
53 53 1G/10G full no no down 53
54 54 1G/10G full no no down 54
55 55 1G/10G full no no down 55
56 56 1G/10G full no no down 56
57 57 1G/10G full no no down 57
58 58 1G/10G full no no down 58
59 59 1G/10G full no no down 59
60 60 1G/10G full no no down 60
61 61 1G/10G full no no down 61
62 62 1G/10G full no no down 62
63 63 10000 full no no up 63
64 64 10000 full no no up 64
MGT 65 1000 full yes yes up MGT
ufl-8264.ufl.xo>
ufl-8264.ufl.xo>show openflow
Protocol Version: 1
Openflow State: Enabled
FDB Table Priority: 1000
FDB Table FDB-timeout: Disabled
Openflow Instance ID: 1
state: enabled , buffering: enabled
retry 4, emergency time-out 30
echo req interval 30, echo reply time-out 15
min-flow-timeout : use controller provided values.
max flows acl : Maximum Available
max flows unicast fdb : Maximum Available
max flows multicast fdb : Maximum Available
emergency feature: disabled
dpid: 0x0001749975d76900
ports : 41-60,64
Controller Id: 1
Active Controller
IP Address: 192.168.110.10, port: 6633, Data-Port
Openflow instance 2 is currently disabled
Openflow instance 3 is currently disabled
Openflow instance 4 is currently disabled
Openflow Edge ports : None
Openflow Management ports : None
ufl-8264.ufl.xo#show running-config
Current configuration:
!
version "7.6.1"
switch-type "IBM Networking Operating System RackSwitch G8264"
iscli-new
!
system timezone 145
! America/US/Eastern
system daylight
!
ssh enable
!
!
openflow enable
!
no system bootp
no system dhcp
no system default-ip
hostname "ufl-8264.ufl.xo"
!
!
access snmp read-only
no access http enable
no access telnet enable
access user administrator-password "xxx"
!
!
interface port 17
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 18
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 19
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 20
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 21
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 22
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 23
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 24
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 25
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 26
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 27
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 28
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 29
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 30
switchport mode trunk
switchport trunk allowed vlan 1009,4000
switchport trunk native vlan 4000
spanning-tree portfast
exit
!
interface port 35
description "iSCSI storage/date plane"
switchport access vlan 1009
spanning-tree portfast
exit
!
interface port 36
description "iSCSI storage/date plane"
switchport access vlan 1009
spanning-tree portfast
exit
!
interface port 41
no learning
flood-blocking
exit
!
....many lines of output not shown.....
}}}
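The `show openflow` output reports the datapath id as a raw hex value (`dpid: 0x0001749975d76900`), while FlowVisor's `listDevices` prints the same id in colon-separated form (`00:01:74:99:75:d7:69:00`). A small sketch of the conversion, useful when cross-checking the two outputs by eye:

```python
# Convert the raw dpid printed by the switch's `show openflow` into the
# colon-separated form that FlowVisor's listDevices uses, so the two
# transcripts can be compared directly.
def dpid_to_colon(dpid_hex):
    digits = dpid_hex.lower()
    if digits.startswith("0x"):
        digits = digits[2:]
    # a dpid is 64 bits; left-pad to 16 hex digits before pairing
    digits = digits.zfill(16)
    return ":".join(digits[i:i + 2] for i in range(0, 16, 2))

print(dpid_to_colon("0x0001749975d76900"))  # -> 00:01:74:99:75:d7:69:00
```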
Verify the ownership and paths of the FOAM and !FlowVisor configuration files:
{{{
[lnevers@ufl-hn ~]$ ls -l /etc/foam.passwd /etc/flowvisor.passwd /etc/flowvisor/fvpasswd /opt/foam/etc/foampasswd
lrwxrwxrwx 1 root flowvisor 21 Sep 26 2013 /etc/flowvisor/fvpasswd -> /etc/flowvisor.passwd
-r--r----- 1 flowvisor ufladmins 25 Sep 26 2013 /etc/flowvisor.passwd
-r--r----- 1 root ufladmins 25 Sep 26 2013 /etc/foam.passwd
lrwxrwxrwx 1 root root 16 Sep 26 2013 /opt/foam/etc/foampasswd -> /etc/foam.passwd
[lnevers@ufl-hn ~]$
}}}
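The listing above shows both password files with mode `-r--r-----` (0440), i.e. not world-readable. A minimal sketch of checking that programmatically; it is demonstrated on a temporary file, since the real `/etc/foam.passwd` requires head-node access:

```python
import os
import stat
import tempfile

# Return True if the "other" read bit is set on the file's mode.
def world_readable(path):
    return bool(os.stat(path).st_mode & stat.S_IROTH)

# Demo on a throwaway file set to the mode seen in the listing (0440).
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
os.chmod(path, 0o440)
result = world_readable(path)
os.remove(path)
print(result)  # expect False for a correctly protected passwd file
```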
Check the FOAM version and the FOAM configuration settings site.admin.email, geni.site-tag, and email.from on the UFL head node:
{{{
[lnevers@ufl-hn ~]$ foamctl admin:get-version --passwd-file=/etc/foam.passwd
{
"version": "0.12.3"
}
[lnevers@ufl-hn ~]$
[lnevers@ufl-hn ~]$ foamctl config:get-value --key="site.admin.email" --passwd-file=/opt/foam/etc/foampasswd
{
"value": null
}
[lnevers@ufl-hn ~]$
[lnevers@ufl-hn ~]$ foamctl config:get-value --key="geni.site-tag" --passwd-file=/opt/foam/etc/foampasswd
{
"value": "ufl-hn.exogeni.net"
}
[lnevers@ufl-hn ~]$
[lnevers@ufl-hn ~]$ foamctl config:get-value --key="email.from" --passwd-file=/opt/foam/etc/foampasswd
{
"value": "Chris Griffin "
}
[lnevers@ufl-hn ~]$
[lnevers@ufl-hn ~]$ foamctl config:get-value --key="geni.approval.approve-on-creation" --passwd-file=/opt/foam/etc/foampasswd
{
"value": 2
}
[lnevers@ufl-hn ~]$
}}}
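Each `foamctl config:get-value` call above returns a small JSON object. A quick sketch that re-checks the captured replies and flags any key whose value is unset; the key names are the real FOAM configuration keys, and the JSON strings are copied from the transcript (note that site.admin.email is null at this site):

```python
import json

# JSON replies captured from `foamctl config:get-value` above; this dict
# just re-bundles each reply under the key that was queried.
replies = {
    "site.admin.email": '{"value": null}',
    "geni.site-tag": '{"value": "ufl-hn.exogeni.net"}',
    "email.from": '{"value": "Chris Griffin "}',
}

# Any key whose value came back null has not been configured.
unset = [key for key, raw in replies.items()
         if json.loads(raw)["value"] is None]
print("unset keys:", unset)  # -> unset keys: ['site.admin.email']
```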
Show FOAM sliver details:
{{{
[lnevers@ufl-hn ~]$ foamctl geni:list-slivers --passwd-file=/opt/foam/etc/foampasswd
{
"slivers": [
{
"status": "approved",
"sliver_urn": "urn:publicid:IDN+ch.geni.net:gpoamcanary+slice+sitemon:4fb9e124-2215-47ec-b996-b75a295c433d",
"creation": "2013-11-19 16:44:49.657258+00:00",
"pend_reason": [],
"expiration": "2014-06-05 00:00:00+00:00",
"deleted": "False",
"user": "urn:publicid:IDN+ch.geni.net+user+asydne01",
"slice_urn": "urn:publicid:IDN+ch.geni.net:gpoamcanary+slice+sitemon",
"enabled": true,
"email": "asydney@bbn.com",
"flowvisor_slice": "4fb9e124-2215-47ec-b996-b75a295c433d",
"desc": "sitemon OpenFlow resources at UFL",
"ref": null,
"id": 20,
"uuid": "4fb9e124-2215-47ec-b996-b75a295c433d"
},
{
"status": "approved",
"sliver_urn": "urn:publicid:IDN+ch.geni.net:gpo-infra+slice+gpoI15:0a5193e8-91f3-47a5-9aa8-c5e8955cbf69",
"creation": "2013-11-25 20:40:31.615186+00:00",
"pend_reason": [],
"expiration": "2014-05-15 23:00:00+00:00",
"deleted": "False",
"user": "urn:publicid:IDN+ch.geni.net+user+jbs",
"slice_urn": "urn:publicid:IDN+ch.geni.net:gpo-infra+slice+gpoI15",
"enabled": true,
"email": "jbs@bbn.com",
"flowvisor_slice": "0a5193e8-91f3-47a5-9aa8-c5e8955cbf69",
"desc": "gpoI15 Florida ExoGENI OpenFlow resources.",
"ref": null,
"id": 21,
"uuid": "0a5193e8-91f3-47a5-9aa8-c5e8955cbf69"
},
{
"status": "approved",
"sliver_urn": "urn:publicid:IDN+ch.geni.net:gpo-infra+slice+gpoI16:771948c8-484a-4035-937e-5769861f5a41",
"creation": "2013-11-25 20:41:31.629525+00:00",
"pend_reason": [],
"expiration": "2014-05-15 23:00:00+00:00",
"deleted": "False",
"user": "urn:publicid:IDN+ch.geni.net+user+jbs",
"slice_urn": "urn:publicid:IDN+ch.geni.net:gpo-infra+slice+gpoI16",
"enabled": true,
"email": "jbs@bbn.com",
"flowvisor_slice": "771948c8-484a-4035-937e-5769861f5a41",
"desc": "gpoI16 Florida ExoGENI OpenFlow resources.",
"ref": null,
"id": 22,
"uuid": "771948c8-484a-4035-937e-5769861f5a41"
}
]
}
[lnevers@ufl-hn ~]$
}}}
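The `geni:list-slivers` reply is plain JSON, so sliver expirations can be pulled out mechanically rather than read by eye. A sketch over a trimmed sample of the output above (only the fields used here are kept):

```python
import json
from datetime import datetime

# Trimmed sample of the `foamctl geni:list-slivers` reply shown above.
slivers_json = """
{"slivers": [
  {"slice_urn": "urn:publicid:IDN+ch.geni.net:gpoamcanary+slice+sitemon",
   "status": "approved", "expiration": "2014-06-05 00:00:00+00:00"},
  {"slice_urn": "urn:publicid:IDN+ch.geni.net:gpo-infra+slice+gpoI15",
   "status": "approved", "expiration": "2014-05-15 23:00:00+00:00"}
]}
"""

rows = []
for sliver in json.loads(slivers_json)["slivers"]:
    # fromisoformat (Python 3.7+) accepts the space-separated timestamp
    # with a "+00:00" offset that FOAM emits.
    expires = datetime.fromisoformat(sliver["expiration"])
    # the slice name is the last "+"-delimited component of the URN
    name = sliver["slice_urn"].rsplit("+", 1)[-1]
    rows.append((name, sliver["status"], str(expires.date())))

for row in rows:
    print(*row)
```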
Check the !FlowVisor version, the list of devices, the details for a device, the list of active slices, and the details for one of the slices on the UFL head node:
{{{
[lnevers@ufl-hn ~]$ /opt/flowvisor/bin/fvctl --passwd-file=/etc/flowvisor/fvpasswd ping hello
Got reply:
PONG(fvadmin): FV version=flowvisor-0.8.1::hello
[lnevers@ufl-hn ~]$ /opt/flowvisor/bin/fvctl --passwd-file=/etc/flowvisor/fvpasswd listDevices
Device 0: 00:01:74:99:75:d7:69:00
[lnevers@ufl-hn ~]$ /opt/flowvisor/bin/fvctl --passwd-file=/etc/flowvisor/fvpasswd getDeviceInfo 00:01:74:99:75:d7:69:00
nPorts=21
portList=41,42,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,64,50
dpid=00:01:74:99:75:d7:69:00
remote=/192.168.110.10:6633-->/192.168.110.4:2909
portNames=41(41),42(42),43(43),44(44),45(45),46(46),47(47),48(48),49(49),51(51),52(52),53(53),54(54),55(55),56(56),57(57),58(58),59(59),60(60),64(64),50(50)
[lnevers@ufl-hn ~]$ /opt/flowvisor/bin/fvctl --passwd-file=/etc/flowvisor/fvpasswd listSlices
Slice 0: 4fb9e124-2215-47ec-b996-b75a295c433d
Slice 1: 0a5193e8-91f3-47a5-9aa8-c5e8955cbf69
Slice 2: 771948c8-484a-4035-937e-5769861f5a41
Slice 3: fvadmin
Slice 4: orca-1411
[lnevers@ufl-hn ~]$ /opt/flowvisor/bin/fvctl --passwd-file=/etc/flowvisor/fvpasswd getSliceInfo 4fb9e124-2215-47ec-b996-b75a295c433d
Got reply:
connection_1=00:01:74:99:75:d7:69:00-->/128.227.10.5:21931-->hafmet.gpolab.bbn.com/192.1.249.178:31750
contact_email=asydney@bbn.com
controller_hostname=hafmet.gpolab.bbn.com
controller_port=31750
creator=fvadmin
}}}
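Each FOAM sliver's `flowvisor_slice` uuid should appear in FlowVisor's `listSlices` output. A sketch of that cross-check, with both sets copied from the transcripts above (`fvadmin` and `orca-1411` are FlowVisor's own administrative slices, not FOAM slivers):

```python
# flowvisor_slice uuids from the FOAM geni:list-slivers output above
foam_flowvisor_slices = {
    "4fb9e124-2215-47ec-b996-b75a295c433d",
    "0a5193e8-91f3-47a5-9aa8-c5e8955cbf69",
    "771948c8-484a-4035-937e-5769861f5a41",
}

# slices reported by `fvctl listSlices` above
flowvisor_slices = {
    "4fb9e124-2215-47ec-b996-b75a295c433d",
    "0a5193e8-91f3-47a5-9aa8-c5e8955cbf69",
    "771948c8-484a-4035-937e-5769861f5a41",
    "fvadmin",
    "orca-1411",
}

# every FOAM sliver should have a matching FlowVisor slice
missing = foam_flowvisor_slices - flowvisor_slices
print("missing from FlowVisor:", sorted(missing))  # expect an empty list
```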
Verified that alerts for the compute resource Aggregate Manager and the FOAM Aggregate Manager are being reported to the [http://monitor.gpolab.bbn.com/nagios/cgi-bin/status.cgi GPO Tango GENI Nagios monitoring] system, and that all alerts are known issues.
[[Image(UFL-OF-nagios.jpg)]]
----
{{{
#!html
Email help@geni.net for GENI support or email me with feedback on this page!
}}}