
Hitachi Content Platform and OpenIO


#1

Some guys get their kicks with Call of Duty or PES. I get mine testing stuff that, mostly, I’ve never worked with before.

It’s a blend of fun and need (for work purposes).

I believe I first saw an article about OpenIO while looking for Object Storage related articles.

At first it looked a bit out of scope, because the S3 connection goes through a gateway, so I let it go. But the name kept popping up in more feeds, so I decided to try it and see whether I could use it as a Hitachi Content Platform external repository.

This is the process I used.

Install a small virtual environment for OpenIO with 3 data nodes and one gateway (CentOS VMs with 100 GB of storage and 2 GB of RAM each), as described here.

All settings were left at their defaults. After testing access with the aws commands provided in the example, both locally on the gateway and from another Linux client, I moved on to the real objective: testing whether we could use OpenIO as an external repository for an HCP.
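For reference, a quick smoke test of the gateway from a Linux client might look like the following sketch. The endpoint address and credentials are placeholders (the port, 6007, is the same one used for the HCP component later); substitute your own values.

```shell
# Point the aws CLI at the OpenIO S3 gateway.
# The IP address and keys below are examples, not real values.
export AWS_ACCESS_KEY_ID='<ec2 access key>'
export AWS_SECRET_ACCESS_KEY='<ec2 secret key>'
ENDPOINT='http://192.168.1.50:6007'   # your gateway IP and S3 port

# List buckets, upload a test object, then list the bucket contents.
aws --endpoint-url "$ENDPOINT" s3 ls
echo 'hello openio' > /tmp/hello.txt
aws --endpoint-url "$ENDPOINT" s3 cp /tmp/hello.txt s3://demo/hello.txt
aws --endpoint-url "$ENDPOINT" s3 ls s3://demo/
```

If these commands succeed both on the gateway itself and from a remote client, the S3 path is healthy enough to move on.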

For those who don’t know what HCP is, here is some info. It’s a very powerful product, but it can also benefit from very large external storage for content that has gone cold. For that purpose, it can connect via S3 to a compatible platform.

Please note that I’m not responsible for any mishap that may happen to your data during the operations described below. They have worked for me (twice) and serve solely to demonstrate that OpenIO does work as an HCP backend. Performance, reliability and the maximum number of files that can be stored were not the objective for now.

So, assuming you have a working HCP and a working OpenIO cluster with an S3 gateway, this is what you need to do to set it up:

Get the credentials of the OpenIO S3 portal.
In my case, I used one of the Linux boxes previously used to test S3 access to the OpenIO S3 gateway to issue “keystonerc_demo ; openstack ec2 credentials create”.
On the OpenIO S3 gateway, get the account created to access the OpenIO nodes with “swift stat demo”.
You will need it later to check whether your data is getting to the right place:
swift stat demo
      Account: AUTH_xxxxx
    Container: demo
      Objects: 122
        Bytes: 93291683
     Read ACL:
    Write ACL:
      Sync To:
     Sync Key:
 Content-Type: text/plain; charset=utf-8
  X-Timestamp: 0000000000.00000
   X-Trans-Id: txaecbf40e0bb54ea48bdb8-0058c72c16
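To avoid retyping the account by hand, the AUTH_ value can be pulled straight out of the swift stat output. A small sketch (the account string here is just the placeholder shown in the output above; on a live system you would pipe `swift stat demo` directly into the awk filter):

```shell
# Extract the "Account:" field from `swift stat demo` output.
# A captured sample stands in for the live command here.
stat_output='Account: AUTH_xxxxx
Container: demo
Objects: 122'

# Match the Account line (with or without leading spaces) and print its value.
account=$(printf '%s\n' "$stat_output" | awk '/^ *Account:/ {print $2}')
echo "$account"   # AUTH_xxxxx
```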
Login to your HCP
On Storage, start by adding a Component
Use S3 Compatible for the type of the Component
Give it a meaningful name
Provide the IP address of your OpenIO S3 gateway, uncheck HTTPS and change the HTTP port to 6007
Use the access and the secret key gathered earlier to fill in the respective fields
The demo bucket appears because it’s already present on the OpenIO cluster. Go ahead and create one just for the HCP service
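If you prefer to create the dedicated bucket from the command line rather than from the HCP dialog, the same aws CLI used earlier can do it. The endpoint, credentials and bucket name below are placeholders for illustration:

```shell
# Create a dedicated bucket for HCP on the OpenIO S3 gateway.
# Endpoint, keys and bucket name are examples; use your own.
export AWS_ACCESS_KEY_ID='<ec2 access key>'
export AWS_SECRET_ACCESS_KEY='<ec2 secret key>'
ENDPOINT='http://192.168.1.50:6007'

aws --endpoint-url "$ENDPOINT" s3 mb s3://hcp
aws --endpoint-url "$ENDPOINT" s3 ls   # the new bucket should now be listed
```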
The new component now appears on the list.
Proceed to Pools
Create a new Pool
Select S3 Compatible
Give it a name
On allocation, choose the OpenIO Component
A new Pool shows up.
Proceed to Service Plan
Create a new Service Plan
Give it a name
Choose nothing and click Next
Finish it
Add a Tier.
Since all Service Plans start with an Ingestion tier, to get data to OpenIO we need to add a second Tier to the newly created Service Plan
To speed things up, leave 0 days so data will flow immediately to the OpenIO cluster
These options are pretty much self-explanatory
Choose the OpenIO Pool and one copy of data. Understand that all data copies will reside outside HCP, so you must look after their health
The Pool now has a second Tier
At this point, you can check that HCP has contacted OpenIO and created its own bucket by issuing “openio object list hcp --oio-account AUTH_xxxx --oio-ns OPENIO”:
+------+------+----------------------------------+
| Name | Size | Hash                             |
+------+------+----------------------------------+
| .hcp | 62   | C6ED2AD82DEA47F2CFDE518E411D6DAE |
+------+------+----------------------------------+
The objective of this guide is not to show how to operate HCP’s frontend, so we skip that and jump to the end, where we can see that data uploaded to HCP already resides on the OpenIO data nodes, again by issuing “openio object list hcp --oio-account AUTH_xxxxx --oio-ns OPENIO”.

The next steps will be determining how reliable the backend is and how fast it can go.

This is a very interesting project with a devoted team.
My regards to all, but especially to Guillaume, who kept me running with his quick Slack replies.

Kind regards.

Joao


#2

Hey @jserra!

Thanks a lot for this awesome installation guide!

For those who will follow it, would you mind sharing the version of HCP you are using?
I assume you are running OpenIO 16.10.

Guillaume.


#3

Hi. HCP Version 7.2.0.26 and OpenIO 16.10.


#4

Hi @jserra,

that documentation is pretty impressive!

Thanks and see you on Slack :wink: