HSM System
HSM System Overview
The HSM system consists of the GHI file system and HPSS, and provides an HSM data domain for the batch servers, work servers, and the Grid system.
The GHI file system serves as an interface between GPFS and HPSS and offers operability similar to a regular file system via a POSIX-based API.
Data written to the GHI file system is eventually moved to HPSS.
Unlike the GPFS disk domain offered in the Disk Storage System, it does not provide CIFS service.
For details of GHI and HPSS, please visit the Reference page.
GHI File System Domain
GHI File System Structure
The GHI file system defines the available domains assigned to each workgroup.
Users can access these domains from the work or batch servers and read and write in the directories listed below.
In the new HSM system, each workgroup and sub-group has been assigned new domains different from the old ones.
Domain (name of file system) | workgroup | directory
---|---|---
GHI Domain#1 (/ghi/fs01) | Belle | /hsm/belle
 | Belle2 | /hsm/belle2
GHI Domain#2 (/ghi/fs02) | T2K | /hsm/t2k
 | HAD | /hsm/had
 | MLF | /hsm/mlf
 | ILC | /hsm/ilc
 | CMB | /hsm/cmb
 | Atlas | /hsm/atlas
 | BESS | /hsm/bess
 | Central | /hsm/ce
 | PS | /hsm/ps
 | The OLD data (2000-2005), read-only | /hsm/old
GHI Domain#3 (/ghi/fs03) | Belle2 | /ghi/fs03/belle2
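The /hsm directory names above are symbolic links to the real GHI directories (see the path-length restriction later on this page). A quick way to check, sketched here for the Belle directory:
$> readlink /hsm/belle
/ghi/fs01/orig_root_fs01/belle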
Cooperation between GHI and HPSS
GHI and HPSS cooperate and function together. Please be aware of how GHI handles data.
- Files on a GHI file system domain exist on GHI disk at first, and are then copied to the HPSS domain.
- Each workgroup is assigned a certain amount of GHI space for use below the workgroup's directory.
- Files are copied to the HPSS tape volumes predetermined for each workgroup.
- The ghitapequota command shows the number of tape volumes inserted into HPSS.
- Frequently used files are held on GHI disk.
- Less frequently used files, and files smaller than 8 MB, are migrated to HPSS tapes and purged from GHI disk.
- If you access a file that exists only on HPSS tape, it is copied from HPSS tape back to GHI disk by the GHI stage function (see the sketch after this list).
- This function enables end users to use files without being aware of their actual location.
- GHI/HPSS is designed to store large files (larger than 256 MB).
- Please store smaller files in the Magnetic Disk Storage.
- If performance matters, please store files larger than 8 GB.
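As a minimal sketch of this flow (the path is hypothetical), you can check a file's location with the ghils command described below, read the file to trigger a GHI stage, and check again:
$> ghils /ghi/fs02/path/to/file
H /ghi/fs02/path/to/file
$> od /ghi/fs02/path/to/file | tail -n 1
$> ghils /ghi/fs02/path/to/file
B /ghi/fs02/path/to/file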
HPSS Features
The data saved in each workgroup's directory will be stored on HPSS tape media.
HPSS assigns a unique "class of service" (COS) number and a Family ID to each workgroup and sub-group.
HPSS manages the number of tape media for each workgroup and sub-group with the COS and Family ID.
Supported HPSS tape media
HPSS utilizes IBM 3592 tapes, which have several sub-types.
The specifications of each tape type are shown in the table below:
tape spec. | 3592-JE (GEN6) | 3592-JD (GEN5A) | 3592-JD (GEN5) | 3592-JC (GEN5) | 3592-JC (GEN4) | 3592-JB (GEN4) | 3592-JB (GEN3)
---|---|---|---|---|---|---|---
used period in KEK site | 2020 - | 2020 - | 2016 - | 2016 - | 2012 - 2016 | 2012 - 2016 | 2009 - 2011
non-compressed capacity [GB/vol] | 20000 | 15000 | 10000 | 7000 | 4000 | 1600 | 1000
max speed [MB/sec] | 400 | - | - | - | 250 | 200 | 160
Utilization of the HPSS tape media
HPSS manages the available tape types and volumes according to COS and Family ID. See the COS list for the COSs and Family IDs, their directories, and media types.
Also see Current tape quota status, or run the ghitapequota2 command, for the current number of volumes available for you.
Utilization of GHI file system
The following utilities are offered to check the utilization of the GHI file system.
ghils command
The ghils command works only on the GHI file system; if it is executed on a GPFS domain, it results in an error.
The ghils command is similar to the UNIX "ls" command, but adds location information on the GHI file system.
The response of ghils may take some time if the specified file exists in HPSS.
<Synopsis>: ghils [-a] [-l] [-n] [-R] [-u] <GHI file | GHI directory>
-a -- Include hidden files/directories, such as names which begin with a '.'.
-l -- <ell> Long format, i.e., include UNIX details similar to 'ls -l'.
-n -- Like option '-l' except that UserID and GroupID will be numeric.
-R -- Recursively list sub-directories.
-u -- Produce unsorted listing.
Note that a slash (/) is required at the end when specifying a directory.
$> ghils /ghi/fs02/test/
H /ghi/fs02/test/hpss_ghi_ls.10
B /ghi/fs02/test/hpss_ghi_ls.11
G /ghi/fs02/test/hpss_ghi_ls.12
- G: The file exists only on the GHI disk.
- B: The file exists both on the GHI disk and the HPSS.
- BP: The file exists both on the GHI disk and the HPSS. This file is not GHI-purged.
- H: The file exists only on the HPSS and not on the GHI disk.
- HP: The file exists only on the HPSS and not on the GHI disk. This file is not GHI-purged.
ghitapequota2 command
Note: the ghitapequota command is also available.
ghitapequota2 command shows the usage of tape cartridges owned by each workgroup.
The status output is refreshed hourly.
Tape type is described as follows:
- JC:3592-JC (GEN5)
- JD:3592-JD (GEN5) /3592-JD (GEN5A)
- JE:3592-JE (GEN6)
US/TQ and US/MS are defined as follows:
- US/TQ = Used Size / Tape Quota
(TQ = JC TapeQuota × 7 TB + JD TapeQuota × 15 TB + JE TapeQuota × 20 TB)
- US/MS = Used Size / Max Size
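As a worked example with hypothetical quotas: a group with a JC quota of 5 volumes, a JD quota of 10, and a JE quota of 2 has TQ = 5 × 7 TB + 10 × 15 TB + 2 × 20 TB = 225 TB; if that group has written 45 TB, US/TQ = 45 / 225 = 20%.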
<Synopsis>: ghitapequota2 [-g group-name]
with -g: print information about the group given as the argument.
without -g: print information about all groups.
$> ghitapequota2
Fri Oct 16 16:10:03 JST 2020
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
Name COS Famly Assign TapeJC TapeJD TapeJE Used Free Max US/TQ US/MS
ID ID Total Qota Used Free Qota Used Free Qota Used Free Size [GB] Size [GB] Size [GB] [%] [%]
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
COS 28 - 12 0 6 6 0 0 0 0 0 0 890 69798 70688 - 1.26
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
COS 29 - 24 0 24 0 0 0 0 0 0 0 170416 8617 179033 - 95.19
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle2_fs03 31 31000 - 0 0 - 40 1 - 0 0 - 0 0 0 0.00 -
belle2_fs03_grid_storm 31 31001 - 0 0 - 240 248 - 0 0 - 1690352 0 1690352 70.43 -
belle2_fs03_grid_ops 31 31002 - 1 1 - 0 0 - 0 0 - 0 6999 6999 0.00 -
COS 31 - 281 1 1 0 280 249 31 0 0 0 1690353 471999 2162352 - 78.17
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_kagra_grid_storm 51 51001 - 5 1 - 0 0 - 0 0 - 34 0 34 0.10 -
COS 51 - 5 5 1 4 0 0 0 0 0 0 34 28000 28034 - 0.12
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_sbrc 52 52000 - 0 0 - 40 6 - 0 0 - 81718 0 81718 20.43 -
COS 52 - 40 0 0 0 40 6 34 0 0 0 81718 510000 591718 - 13.81
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
mlf 61 61000 - 5 2 - 4 2 - 0 0 - 22520 4799 27319 30.03 -
mlf_irods 61 61001 - 30 15 - 76 35 - 0 0 - 494793 8695 503488 51.01 -
mlf_deeme 61 61002 - 0 3 - 5 0 - 0 0 - 14538 6618 21156 29.08 -
COS 61 - 120 35 20 15 85 37 48 0 0 0 531852 845113 1376965 - 38.62
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
cmb 62 62000 - 0 0 - 9 9 - 0 0 - 77876 0 77876 86.53 -
cmb_pb 62 62001 - 0 0 - 30 4 - 0 0 - 22030 0 22030 7.34 -
COS 62 - 39 0 0 0 39 13 26 0 0 0 99907 390000 489907 - 20.39
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ilc_grid 64 64000 - 17 0 - 33 22 - 0 0 - 201677 0 201677 44.92 -
ilc_grid_dpm 64 64001 - 0 5 - 7 0 - 0 0 - 34555 0 34555 49.36 -
COS 64 - 57 17 5 12 40 22 18 0 0 0 236233 354000 590233 - 40.02
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_naregi 65 65000 - 0 0 - 2 1 - 0 0 - 1078 0 1078 5.39 -
COS 65 - 2 0 0 0 2 1 1 0 0 0 1078 15000 16078 - 6.70
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
t2k_JB_beam 66 66001 - 5 4 - 14 13 - 0 0 - 130097 0 130097 74.34 -
COS 66 - 19 5 4 1 14 13 1 0 0 0 130097 22000 152097 - 85.54
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
test_ibm_se 67 67000 - 0 0 - 1 1 - 0 0 - 274 0 274 2.74 -
test_ibm_sysimage_bk1 67 67001 - 0 0 - 1 1 - 0 0 - 8194 0 8194 81.94 -
test_ibm_sysimage_bk2 67 67002 - 0 0 - 1 1 - 0 0 - 462 0 462 4.62 -
orhpan_file 67 75000 - 0 0 - 1 1 - 0 0 - 2 0 2 0.02 -
COS 67 - 6 0 0 0 4 6 0 0 0 0 9916 0 9916 - 100.00
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
t2k_beam 68 68001 - 10 2 - 8 4 - 0 0 - 86173 911 87084 57.45 -
t2k_nd280 68 68002 - 15 18 - 8 4 - 0 0 - 118249 26309 144558 63.92 -
t2k_irods 68 68003 - 4 6 - 3 0 - 0 0 - 20598 6913 27511 35.51 -
COS 68 - 48 29 26 3 19 8 11 0 0 0 225020 220135 445155 - 50.55
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
nu_wagasci 69 69001 - 0 0 - 5 7 - 0 0 - 228742 0 228742 457.48 -
nu_ninja 69 69002 - 0 0 - 1 1 - 0 0 - 1815 0 1815 18.15 -
COS 69 - 11 0 0 0 6 8 3 0 0 0 230557 45000 275557 - 83.67
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
had_koto 70 70000 - 0 0 - 263 208 - 0 0 - 2097726 0 2097726 79.76 -
COS 70 - 263 0 0 0 263 208 55 0 0 0 2097726 825000 2922726 - 71.77
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
had 71 71000 - 0 1 - 1 0 - 0 0 - 0 0 0 0.00 -
had_sks 71 71001 - 10 9 - 28 20 - 0 0 - 242663 0 242663 69.33 -
had_knucl 71 71002 - 5 6 - 15 26 - 5 1 - 321183 0 321183 112.70 -
had_trek 71 71003 - 5 3 - 0 0 - 0 0 - 15363 0 15363 43.89 -
had_g-2 71 71004 - 0 0 - 5 5 - 0 0 - 37517 0 37517 75.03 -
had_muon 71 71005 - 0 1 - 15 13 - 0 0 - 114303 0 114303 76.20 -
had_high-p 71 71006 - 0 0 - 4 4 - 0 0 - 22370 0 22370 55.93 -
COS 71 - 93 20 20 0 68 68 0 5 1 4 753402 80000 833402 - 90.40
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ilc 72 72000 - 15 11 - 0 0 - 0 0 - 54539 9972 64511 51.94 -
COS 72 - 15 15 11 4 0 0 0 0 0 0 54539 37972 92511 - 58.95
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ilc_grid_storm 73 73000 - 135 83 - 20 0 - 0 0 - 513092 7365 520457 44.81 -
COS 73 - 155 135 83 52 20 0 20 0 0 0 513092 671365 1184457 - 43.32
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
acc 74 74001 - 2 1 - 0 0 - 0 0 - 5123 0 5123 36.59 -
atlas 74 74002 - 2 1 - 0 0 - 0 0 - 0 0 0 0.00 -
ce 74 74003 - 2 4 - 0 0 - 0 0 - 7100 13269 20369 50.71 -
ce_geant4 74 74004 - 2 2 - 0 0 - 0 0 - 438 6948 7386 3.13 -
COS 74 - 8 8 8 0 0 0 0 0 0 0 12662 20217 32879 - 38.51
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
test 75 75000 - 13 6 - 0 0 - 0 0 - 26300 6999 33299 28.90 -
test_ibm_test_cksum 75 75001 - 5 3 - 0 0 - 0 0 - 14120 0 14120 40.34 -
COS 75 - 18 18 9 9 0 0 0 0 0 0 40420 69999 110419 - 36.61
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_lcg_dpm 76 76001 - 2 1 - 0 0 - 0 0 - 4188 0 4188 29.91 -
COS 76 - 2 2 1 1 0 0 0 0 0 0 4188 7000 11188 - 37.43
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_irods 77 77000 - 2 1 - 0 0 - 0 0 - 28 0 28 0.20 -
ce_irods_irods01 77 77001 - 2 2 - 0 0 - 0 0 - 157 6999 7156 1.12 -
ce_irods_irods04 77 77004 - 2 1 - 0 0 - 0 0 - 32 0 32 0.23 -
COS 77 - 6 6 4 2 0 0 0 0 0 0 218 20999 21217 - 1.03
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
bess 78 78000 - 6 2 - 0 0 - 0 0 - 13836 0 13836 32.94 -
COS 78 - 6 6 2 4 0 0 0 0 0 0 13836 28000 41836 - 33.07
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ps_klea 79 79000 - 12 10 - 0 0 - 0 0 - 101645 0 101645 121.01 -
had_koto_jc 79 79001 - 712 714 - 0 0 - 0 0 - 4722597 0 4722597 94.76 -
COS 79 - 724 724 724 0 0 0 0 0 0 0 4824243 0 4824243 - 100.00
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle_bdata1 81 81001 - 23 23 - 32 29 - 0 0 - 379741 0 379741 78.95 -
COS 81 - 55 23 23 0 32 29 3 0 0 0 379741 45000 424741 - 89.41
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle_bhsm 82 82000 - 0 0 - 135 152 - 0 0 - 1248145 74973 1323118 92.46 -
COS 82 - 152 0 0 0 135 152 0 0 0 0 1248145 74973 1323118 - 94.33
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle_grid 83 83000 - 0 0 - 2 1 - 0 0 - 0 0 0 0.00 -
belle_grid_dpm 83 83001 - 0 0 - 3 1 - 0 0 - 5444 0 5444 18.15 -
COS 83 - 5 0 0 0 5 2 3 0 0 0 5444 45000 50444 - 10.79
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle2_grid 84 84000 - 0 0 - 2 4 - 0 0 - 161 14999 15160 0.80 -
belle2_grid_dpm 84 84001 - 0 0 - 3 1 - 0 0 - 0 0 0 0.00 -
COS 84 - 5 0 0 0 5 5 0 0 0 0 161 14999 15160 - 1.06
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ibm_test_stage_1 87 87000 - 3 2 - 0 0 - 0 0 - 171 6999 7170 0.81 -
ibm_test_stage_1 87 87001 - 1 1 - 0 0 - 0 0 - 0 0 0 0.00 -
COS 87 - 4 4 4 0 0 0 0 0 0 0 177 6999 7176 - 2.47
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ibm_test_stage_2 88 88000 - 2 2 - 0 0 - 0 0 - 803 6999 7802 5.74 -
COS 88 - 2 2 2 0 0 0 0 0 0 0 803 6999 7802 - 10.29
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ibm_test_stage_3 89 89001 - 0 0 - 3 3 - 0 0 - 7 0 7 0.02 -
COS 89 - 3 0 0 0 3 3 0 0 0 0 7 0 7 - 100.00
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
old 90 90000 - 5 3 - 0 0 - 0 0 - 19714 0 19714 56.33 -
COS 90 - 5 5 3 2 0 0 0 0 0 0 19714 14000 33714 - 58.47
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle2_bdata 92 92001 - 320 320 - 700 648 - 0 0 - 8736897 0 8736897 94.56 -
COS 92 - 1020 320 320 0 700 648 52 0 0 0 8736897 780000 9516897 - 91.80
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle_bfs 93 93000 - 249 173 - 0 0 - 0 0 - 1034855 27973 1062828 59.37 -
belle_bdata2 93 93002 - 320 295 - 0 0 - 0 0 - 935658 25867 961525 41.77 -
COS 93 - 569 569 519 50 0 0 0 0 0 0 2417122 410678 2827800 - 85.48
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
belle_grid_storm 94 94001 - 110 5 - 0 0 - 0 0 - 27591 0 27591 3.58 -
belle2_grid_storm 94 94002 - 168 274 - 276 235 - 0 0 - 2692343 102414 2794757 68.40 -
belle_grid_storm_local_test 94 94003 - 3 1 - 0 1 - 0 0 - 0 0 0 0.00 -
belle2_grid_ops 94 94004 - 1 2 - 0 0 - 0 0 - 0 6999 6999 0.00 -
COS 94 - 558 282 282 0 276 236 40 0 0 0 2719934 709414 3429348 - 79.31
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ce_lcg_storm 95 95000 - 20 7 - 0 0 - 0 0 - 29819 8502 38321 21.30 -
COS 95 - 20 20 7 13 0 0 0 0 0 0 29819 99502 129321 - 23.06
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
$> ghitapequota -g ilc_grid
Fri Oct 16 17:10:02 JST 2020
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
Name COS Famly Assign TapeJC TapeJD TapeJE Used Free Max US/TQ US/MS
ID ID Total Qota Used Free Qota Used Free Qota Used Free Size [GB] Size [GB] Size [GB] [%] [%]
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ilc_grid 64 64000 - 17 0 - 33 22 - 0 0 - 201677 0 201677 44.92 -
ilc_grid_dpm 64 64001 - 0 5 - 7 0 - 0 0 - 34555 0 34555 49.36 -
COS 64 - 57 17 5 12 40 22 18 0 0 0 236233 354000 590233 - 40.02
------------------------------ --- ----- ------ ---- ---- ---- ---- ---- ---- ---- ---- ---- ---------- ---------- ---------- ------ ------
ghitapedrive command
The ghitapedrive command shows the availability of HPSS tape drives.
It may help to check the status of the tape drives when there is trouble accessing files that exist only on the HPSS domain.
$> ghitapedrive
-------------- ---------- ----------
Year Date Time NumTapeDrv NumFreeDrv
-------------- ---------- ----------
2020.0713.1817 72 72
-------------- ---------- ----------
- NumTapeDrv: Number of all available drives
- NumFreeDrv: Number of free (unoccupied) drives
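The output is easy to parse in scripts. A hedged sketch, assuming the output format shown above, that extracts the NumFreeDrv column (the data line is the only one beginning with a digit):
$> ghitapedrive | awk '/^[0-9]/ { print $3 }'
72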
hstage - batch processing to stage files
This batch processing is used to stage files which have been purged from the GHI file system.
How to use hstage
Create a list of the files you want to stage in /ghi/fs0[1-3]/hstage/requests/ of the file system that contains those files:
- If you want to stage files in /ghi/fs01, create the file list in /ghi/fs01/hstage/requests/.
- If you want to stage files in /ghi/fs02, create the file list in /ghi/fs02/hstage/requests/.
- If you want to stage files in /ghi/fs03, create the file list in /ghi/fs03/hstage/requests/.
If the list is not placed as above, stage processing will not be performed.
The file list must contain the files or directories to be staged, with full path names.
The path name of the file list itself must contain /ghi/fs0[1-3]/.
If there are any spaces in the file paths, hstage will not work correctly.
example:
Case of staging files in /ghi/fs01/path/to/
% cat /ghi/fs01/hstage/requests/sample.lst
/ghi/fs01/path/to/file.1
/ghi/fs01/path/to/file.2
/ghi/fs01/path/to/file.3
~
/ghi/fs01/path/to/file.10000
* The maximum number of entries in one file list is 10,000.
* Staging is processed in the order of file creation time.
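One way to build such a list, sketched here with hypothetical paths, is to collect full path names with the find command, keeping within the 10,000-entry limit:
% find /ghi/fs01/path/to/ -type f | head -n 10000 > /ghi/fs01/hstage/requests/sample.lst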
- The result is output in the /ghi/fs0[1-3]/hstage/results/yyyymmdd directory.
The output file name is the requested file name with ".result." and the date and time of processing appended.
example:
sample.lst.result.yyyymmdd_hhmmss
* Files will be automatically deleted after one month.
- Also, your request file is moved to the /ghi/fs0[1-3]/hstage/requests/done/yyyymmdd directory.
The date and time of processing is appended to the requested file name.
example:
sample.lst.yyyymmdd_hhmmss
* Files will be automatically deleted after one month.
- Check your results
% cat sample.lst.result.yyyymmdd_hhmmss
B /ghi/fs01/path/to/file.1
B /ghi/fs01/path/to/file.2
B /ghi/fs01/path/to/file.3
|
+-- ghils status:
G: The file is GHI resident and has not been migrated to HPSS.
B: The file is dual resident. The data exists in both GHI and HPSS.
H: The file is HPSS resident. The file data has been purged from GHI.
* The file residency indicator will be followed by 'P' if the file is pinned (or a blank ' ' if it is not pinned).
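To pick out files that did not reach the dual-resident state "B", a simple sketch against the result format above:
% grep -v '^B' sample.lst.result.yyyymmdd_hhmmss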
- To find your request files, use the find command or a similar tool.
example:
% find /ghi/fs0?/hstage/requests/done/*/ -user username
Changes from the old Common Computing System
A comparison between the HPSS in the old system and the HSM in the new Data Analysis System is given in the following table.
Item | HSM System (New) | HSM System (Old)
---|---|---
Software Name / Version | HPSS 10.3.0 update 6, GHI 4.1.0 update 4 | HPSS 8.3.0 update 22, GHI 3.2.0 update 5
Type and number of tape libraries | TS1160, 70 | TS1160, 72
Supported tape media and capacity | JC Gen5 (7 TB), JD Gen5A (15 TB), JE Gen6 (20 TB) | JC Gen5 (7 TB), JD Gen5A (15 TB), JE Gen6 (20 TB)
TIPS for using HSM system
Staging files
Files you create in the HSM file system are purged (deleted from GHI disk) a short time later.
For example, the ghils command returns the result "H" for the purged file /ghi/fs01/path/to/file ("H" means the file exists only on HPSS tape):
$> ghils /ghi/fs01/path/to/file
H /ghi/fs01/path/to/file
If you submit jobs that use files in such a condition, you can use CPU resources more effectively by staging these files in advance.
With user privileges, you can stage a file by reading at least one byte of it (the "ls" command is not suitable, as it does not read file contents).
We recommend the following command:
$> od /ghi/fs01/path/to/file | tail -n 1
After the file stage finishes, the ghils command returns the result "B" ("B" means the file exists on both GHI disk and HPSS tape):
$> ghils /ghi/fs01/path/to/file
B /ghi/fs01/path/to/file
If you need to stage hundreds of files or more, please use the hstage utility.
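Conversely, for just a handful of files, a simple loop over the recommended od command is sufficient (hypothetical paths):
$> for f in /ghi/fs01/path/to/file.1 /ghi/fs01/path/to/file.2; do od "$f" | tail -n 1; done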
The Size Restriction of the GHI file
A GHI file whose size is 0 bytes can be created, but it cannot be migrated to the HPSS area.
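A sketch, with a hypothetical path, for locating such 0-byte files with the standard find command:
$> find /ghi/fs01/path/to/ -type f -size 0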
The Length Restriction of the full path name of the GHI file
The GHI file system restricts the full path name of a GHI file to 1023 bytes or less.
A GHI file whose full path name is longer than 1023 bytes can be created, but it cannot be migrated to the HPSS area. To migrate a GHI file to the HPSS area, its full path name must be 1023 bytes or less.
The length of the full path name is counted based on the real file name, starting with "/ghi", which is the real GHI file name. Please note that the GHI file system can also be reached via symbolic links: for example, a GHI file can be accessed via the directory name /hsm/belle, which is a symbolic link to the real directory /ghi/fs01/orig_root_fs01/belle. Please refer to the GHI File System Structure section for each directory name.
- The real directory name for the belle and belle2 groups' HSM file system is /ghi/fs01/orig_root_fs01.
- The real directory name for other groups' HSM file system is /ghi/fs02/orig_root_fs02.
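A sketch for checking the byte length of a file's real full path name (hypothetical file name; readlink -f first resolves the /hsm symbolic link to the real /ghi path):
$> printf '%s' "$(readlink -f /hsm/belle/path/to/file)" | wc -c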
The Character Restriction of the path name of the GHI file
ASCII control characters (decimal 0-31) cannot be used in GHI file names.
If used, the file will be renamed by the system.
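A hedged sketch, with a hypothetical directory, for spotting names that contain control characters before the system renames them:
$> ls /ghi/fs01/path/to/ | LC_ALL=C grep '[[:cntrl:]]'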
The File Type Restriction of the HPSS file
Socket files and pipe files can be created, but they cannot be migrated to the HPSS area.
Tape usage limits
When there are not enough tape volumes remaining, the representative of each workgroup will receive a warning mail. In this case, one of the following will be suggested:
- Deletion of an unnecessary file(s)
- Additional purchase of tape cartridge(s)
Additional References
Introduction of GHI and HPSS
For details of GHI and HPSS, please visit the following pages.
GHI/HPSS Glossary
GHI migration
- "GHI migration" means file is copied to HPSS domain from GHI disk.
- A file newly written in GHI file system exists on GHI disk at first, then it is automatically copied to a HPSS disk according to a GHI policy.
GHI purge
- "GHI purge" means deleting file from a GHI disk.
- If the usage reaches the preset upper limit of the GHI file system, the least recently used file becomes a candidate for deletion from a GHI disk.
- No GHI purge is executed for never GHI-migrated files.
- GHI-purged file exists only on HPSS domain (disk or tape) until it is GHI-staged.
GHI stage
- "GHI stage" means copying file from HPSS domain to GHI disk.
- If GHI-purged file is called, it is automatically copied to a GHI disk from HPSS domain.
HPSS migration
- "HPSS migration" means copying file from from HPSS disk cache to HPSS tape media.
- Writing to HPSS disks is called GHI migration.
- A file on the HPSS disk cache is automatically copied to HPSS tape media according to a HPSS policy associated for every COS/FamilyID.
HPSS purge
- "HPSS purge" means deleting file from HPSS disk cache.
- If the usage reached the preset upper limit of the HPSS disk cache, the least recently used file becomes a candidate for deletion from HPSS disk cache.
- No HPSS purge is executed for never HPSS-migrated files.
- HPSS-purged file exists only on HPSS tape media not on HPSS disk cache.