Taking OKD 4 for a spin (part 1)

Coming back from the holidays was a rough landing. Having forgotten everything about what a DevOps engineer does, I started with a very light activity: playing with OKD 4, the community distribution that OpenShift is derived from.

OKD 4 was announced a while ago (https://www.openshift.com/blog/okd4-is-now-generally-available) and an installer to try the product locally was made available to the community. I was used to Minishift, or to building myself a two-node cluster (one master and one worker: two Vagrant VMs and off you go with the installation playbooks), but I have to say that crc (CodeReady Containers) is very convenient.

Inside the project's GitHub repo (https://github.com/code-ready/crc) there is a link to https://cloud.redhat.com/openshift/install/crc/installer-provisioned to download and install the tool.

Once installed (make sure it ends up in your PATH; I went for a classic symbolic link /usr/local/bin/crc -> /Users/foobar/WORK/Openshift4/crc/crc), just run crc setup followed by crc start. It does not make much sense to SSH into the CoreOS instance of OKD 4 that crc installs, but for the curious, here is how it can be done:

~$ ssh -i /Users/foobar/.crc/machines/crc/id_rsa core@"$(crc ip)"
Red Hat Enterprise Linux CoreOS 45.82.202007240629-0
  Part of OpenShift 4.5, RHCOS is a Kubernetes native operating system
  managed by the Machine Config Operator (`clusteroperator/machine-config`).

WARNING: Direct SSH access to machines is not recommended; instead,
make configuration changes via `machineconfig` objects:
  https://docs.openshift.com/container-platform/4.5/architecture/architecture-rhcos.html

---
Last login: Sat Aug 22 17:50:00 2020 from 192.168.64.1
[core@crc-fd5nx-master-0 ~]$ cat /etc/redhat-release
Red Hat Enterprise Linux CoreOS release 4.5
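
For reference, the whole crc lifecycle boils down to a handful of subcommands; a sketch from my shell, output omitted:

~$ crc setup    # prepares the host (hypervisor, DNS, networking checks)
~$ crc start    # creates and boots the OKD 4 VM
~$ crc status   # reports whether the cluster is running
~$ crc stop     # shuts the VM down; its state is preserved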

Once the VM is up, the output shows how to log in from the command line with oc. In any case, the credentials can be retrieved at any time with crc console --credentials:

~$ crc console --credentials
To login as a regular user, run 'oc login -u developer -p developer https://api.crc.testing:6443'.
To login as an admin, run 'oc login -u kubeadmin -p DhjTx-8gIJC-2h2tK-eksGY https://api.crc.testing:6443'
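
From there, a quick smoke test with oc is straightforward (the kubeadmin password is the one printed above):

~$ oc login -u kubeadmin -p DhjTx-8gIJC-2h2tK-eksGY https://api.crc.testing:6443
~$ oc get nodes              # a single node: the crc VM
~$ oc get clusteroperators   # health of the cluster operators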

Running a bare crc console, instead, magically opens the OpenShift console in the browser. At first glance there are a lot of things I like compared to version 3.11.

Interesting features from a first look at the web console

  • The view split between Administrator and Developer
  • The search bar for resources and events. Very useful, even though writing big bash one-liners with oc was more fun
  • The OperatorHub section
  • The Workloads section for administrators is cross-namespace. There is no need to enter a namespace anymore to see its resources (this is very handy, troubleshooting gets faster)
  • The Storage section shows the PVs. Awesome. Storage classes are there too. Too bad the OpenEBS (https://openebs.io/) provisioner is not among the available ones
  • OK, the Compute part is really useful (I still have to try it, though). The most interesting feature of this section is the MachineAutoscaler, so goodbye manual scale-up (a pity, since scale-up via Ansible was the easiest thing on OKD; upgrades on OKD, unlike on OpenShift, were a so-called bloodbath). See the sketch after this list
  • The OAuths section in Cluster Settings. Several identity providers can be configured directly from the web console
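
Speaking of the MachineAutoscaler, here is a minimal sketch of what one looks like, assuming the target MachineSet already exists in the cluster; worker-autoscaler and my-cluster-worker-machineset are hypothetical names:

$ oc apply -f - <<'EOF'
apiVersion: autoscaling.openshift.io/v1beta1
kind: MachineAutoscaler
metadata:
  name: worker-autoscaler               # hypothetical name
  namespace: openshift-machine-api
spec:
  minReplicas: 1
  maxReplicas: 4
  scaleTargetRef:
    apiVersion: machine.openshift.io/v1beta1
    kind: MachineSet
    name: my-cluster-worker-machineset  # hypothetical MachineSet to scale
EOF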

Adding an HTPasswd identity provider

# Create the password file with htpasswd
htpasswd -b -c fooobar foo bar

Upload the file via the web console.
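
For the record, the console upload is equivalent to the standard OAuth configuration you could apply by hand; a sketch, where htpass-secret and htpasswd_provider are names of my choosing:

# store the htpasswd file in a secret in the openshift-config namespace
$ oc create secret generic htpass-secret --from-file=htpasswd=fooobar -n openshift-config
# reference the secret from the cluster OAuth resource
$ oc apply -f - <<'EOF'
apiVersion: config.openshift.io/v1
kind: OAuth
metadata:
  name: cluster
spec:
  identityProviders:
  - name: htpasswd_provider
    mappingMethod: claim
    type: HTPasswd
    htpasswd:
      fileData:
        name: htpass-secret
EOF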

Login test

$ oc login -u foo -p bar https://api.crc.testing:6443
Login successful.

You don't have any projects. You can try to create a new project, by running

    oc new-project <projectname>
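
The suggestion from the output works right away:

$ oc new-project sandbox   # any project name will do
$ oc whoami                # should print foo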

From a first look, I'd say we like OKD 4 a lot, but there are many aspects to verify and features to try.

crc is definitely very useful for developers, but also for DevOps engineers who want to take a look.

Cheers.

Docker image pruning with OpenShift Origin


To test Docker image pruning with OpenShift Origin, I set up the following experiment.

Environment:

  • Two nodes: one master + infra node and one workload node.
  • Cluster version: OpenShift Origin 3.6

First of all, I built a Docker image based on Node.js with a large file inside. file.out is 50 MB and shrinks very little under compression.
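
A file like that can be produced from random bytes, which by nature compress poorly; something along these lines would do (the exact command is an assumption):

[root@origin-master nodeapp]# dd if=/dev/urandom of=file.out bs=1M count=50   # 50 MB of incompressible data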

Structure of my Node.js app:

[root@origin-master nodeapp]# ls -lhas
total 51M
   0 drwxr-xr-x. 2 root    root    106 May 27 16:41 .
   0 drwx------. 6 vagrant vagrant 228 May 27 09:09 ..
4.0K -rw-r--r--. 1 root    root    136 May 27 16:22 Dockerfile
4.0K -rw-r--r--. 1 root    root    261 May 27 16:18 create_fake_images.sh
 50M -rw-r--r--. 1 root    root    50M May 27 16:18 file.out
4.0K -rw-r--r--. 1 root    root    265 May 27 09:09 package.json
4.0K -rw-r--r--. 1 root    root    539 May 27 15:19 server.js

The Dockerfile is very simple!

[root@origin-master nodeapp]# cat Dockerfile
FROM node

# Create app directory
WORKDIR /usr/src/app

COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 8080
CMD [ "npm", "start" ]

The following script (create_fake_images.sh) helped me generate many images with additional, different layers. Each iteration copies the 50 MB file into the image.

for i in $(seq 1 1000)
do
  # each iteration appends two new instructions to the Dockerfile, so every
  # build stacks one more unique 50 MB layer on top of the previous image
  echo "RUN echo $i" >> Dockerfile
  echo "COPY file.out ./$i.out" >> Dockerfile
  docker build -t docker-registry-default.origin.local/test/nodeapp:latest .
  docker push docker-registry-default.origin.local/test/nodeapp:latest
done

Let's see how the Docker registry folder grows in size while create_fake_images.sh runs.

1a4c7633d886: Pushing [=======>    ]  8.325 MB/52.43 MB
e36035decad2: Pushing [=======>    ]  7.768 MB/52.43 MB
bb1e26b5f124: Pushing [=======>    ]  7.768 MB/52.43 MB
13c3d2712668: Pushing [===========>] 11.67 MB/52.43 MB
dbc4876ab96c: Pushing [=======>    ]  8.327 MB/52.43 MB

[root@origin-master vagrant]# du -chs /data/origin/
782M /data/origin/
782M total
[root@origin-master vagrant]# du -chs /data/origin/
956M /data/origin/
956M total
[root@origin-master vagrant]# du -chs /data/origin/
1.2G /data/origin/
1.2G total

Describing the nodeapp image stream, we can see the latest images pushed to the registry:

[root@origin-master nodeapp]# oc describe is
Name:             nodeapp
Namespace:        test
Created:          2 hours ago
Labels:           <none>
Annotations:      <none>
Docker Pull Spec: docker-registry.default.svc:5000/test/nodeapp
Image Lookup:     local=false
Unique Images:    5
Tags:             1

latest
  pushed image
    * docker-registry.default.svc:5000/test/nodeapp@sha256:6f517f2bd667280587daebd57c456c723df4e97d72903100cad441821203e4ec
      22 seconds ago
      docker-registry.default.svc:5000/test/nodeapp@sha256:421155780252a908de9c4968c65a508c65b12b259a6248278b73dd20edee20fb
      About a minute ago
      docker-registry.default.svc:5000/test/nodeapp@sha256:b26f1d2cac95a73632891a6cfec875baed0b1a4165c381e59e0ce4d1bfc403f9
      About a minute ago
      docker-registry.default.svc:5000/test/nodeapp@sha256:6d6f63093aae64198fb1d7af2cd2a361cec991817c8e6944910cb84420a52c1b
      20 minutes ago
      docker-registry.default.svc:5000/test/nodeapp@sha256:6e82ae61a154788ff70ff3ed69cf3a088845e0c7c2d1441de4123c213a0f0116
      23 minutes ago

I then reverted the Dockerfile so that the image no longer copies the large file, and built and pushed the following Docker image:

[root@origin-master nodeapp]# cat Dockerfile
FROM node

# Create app directory
WORKDIR /usr/src/app

COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 8080
CMD [ "npm", "start" ]

Let's start pruning.


  1. Delete old deployments

[root@origin-master nodeapp]# oc adm prune deployments --orphans --keep-complete=1 --keep-failed=1 --keep-younger-than=1m
Dry run enabled - no modifications will be made. Add --confirm to remove deployments
NAMESPACE NAME
test      nodeapp2-23
test      nodeapp2-22
test      nodeapp2-21
test      nodeapp2-20
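
As the output says, this is a dry run; the actual deletion is the same command with --confirm appended:

[root@origin-master nodeapp]# oc adm prune deployments --orphans --keep-complete=1 --keep-failed=1 --keep-younger-than=1m --confirm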
  2. Prune the unused images, keeping only one tag revision
[root@origin-master nodeapp]# oc adm prune images --keep-tag-revisions=1 --keep-younger-than=1m --confirm
Deleting references from image streams to images ...
STREAM        IMAGE                                                                    TAGS
test/nodeapp  sha256:6e82ae61a154788ff70ff3ed69cf3a088845e0c7c2d1441de4123c213a0f0116  latest
test/nodeapp  sha256:421155780252a908de9c4968c65a508c65b12b259a6248278b73dd20edee20fb  latest
test/nodeapp  sha256:b26f1d2cac95a73632891a6cfec875baed0b1a4165c381e59e0ce4d1bfc403f9  latest

Deleting registry repository layer links ...
REPO          LAYER LINK
test/nodeapp  sha256:3de138bf364bf3e2684c78468d07a0e2ca786ba08f83bd1b7e3373a1e0b407e5
test/nodeapp  sha256:a75ed4d808fc563081de632debf91580a6dbe6d694971ca6888b5be4433f55cc
test/nodeapp  sha256:677e3bfb00ad311212b46be4ddb20f5b762765fef676c6d4a85e5dbcb943c4a4
test/nodeapp  sha256:29510362ff8e174edf88563f6099141b9e82efd5cd48b14aeaa74fea532e6d43
test/nodeapp  sha256:35dc109c74dfcab9f1f027ccc1404e7ef3f524d6efea226e3920959051819f2c
test/nodeapp  sha256:f3d40164b23c42356d717da92b91774402632a8df22cd5ddfb7926fc3a7292f6
test/nodeapp  sha256:fe03b24c2c84648d09331dbd72a8065b8ed550be991155edc72ad1166f1ef666
test/nodeapp  sha256:8eaf1a821a6b8a2e325cebce83311c5d3b33427140d84954f504ed59bab51109
test/nodeapp  sha256:684c30e67a270d2115cb4be1c9712193bf0b4393ffa04b3c625e76fadd6bda12
test/nodeapp  sha256:7d6ad63a8e94a938a95fa1e27106bbc7ecfa3680b3ff3a11c7bdb43ae610eb18
test/nodeapp  sha256:e1614bf81372b99eb86c74ac9c023c01077061db81c4daeaf6683e27299e48cc
test/nodeapp  sha256:a92bff6ede96a89a94fe2ca794115e1c374312d05e895698e6b1638c3b647645
test/nodeapp  sha256:2d9ccccdee639671c6cb24110e529b738535453a3ccdf68989ac31ea6894929d
test/nodeapp  sha256:1e920f3f2676bc50c60a95c9d8710cb91429ab07278616c02680b0f38af2d224
test/nodeapp  sha256:90b58c4583c3de52b6f6762c83a02d42dd18747a2541bafb483a9cdbc5e55f8b
test/nodeapp  sha256:d041de45e8895ee77b4e6ba112959e361d888b5694910b44f54fb88f0ef3fb4f

Deleting registry layer blobs ...
BLOB
sha256:3de138bf364bf3e2684c78468d07a0e2ca786ba08f83bd1b7e3373a1e0b407e5
sha256:a75ed4d808fc563081de632debf91580a6dbe6d694971ca6888b5be4433f55cc
sha256:677e3bfb00ad311212b46be4ddb20f5b762765fef676c6d4a85e5dbcb943c4a4
sha256:29510362ff8e174edf88563f6099141b9e82efd5cd48b14aeaa74fea532e6d43
sha256:35dc109c74dfcab9f1f027ccc1404e7ef3f524d6efea226e3920959051819f2c
sha256:f3d40164b23c42356d717da92b91774402632a8df22cd5ddfb7926fc3a7292f6
sha256:fe03b24c2c84648d09331dbd72a8065b8ed550be991155edc72ad1166f1ef666
sha256:8eaf1a821a6b8a2e325cebce83311c5d3b33427140d84954f504ed59bab51109
sha256:684c30e67a270d2115cb4be1c9712193bf0b4393ffa04b3c625e76fadd6bda12
sha256:7d6ad63a8e94a938a95fa1e27106bbc7ecfa3680b3ff3a11c7bdb43ae610eb18
sha256:e1614bf81372b99eb86c74ac9c023c01077061db81c4daeaf6683e27299e48cc
sha256:a92bff6ede96a89a94fe2ca794115e1c374312d05e895698e6b1638c3b647645
sha256:2d9ccccdee639671c6cb24110e529b738535453a3ccdf68989ac31ea6894929d
sha256:1e920f3f2676bc50c60a95c9d8710cb91429ab07278616c02680b0f38af2d224
sha256:90b58c4583c3de52b6f6762c83a02d42dd18747a2541bafb483a9cdbc5e55f8b
sha256:d041de45e8895ee77b4e6ba112959e361d888b5694910b44f54fb88f0ef3fb4f

Deleting registry repository manifest data ...
REPO          IMAGE
test/nodeapp  sha256:6e82ae61a154788ff70ff3ed69cf3a088845e0c7c2d1441de4123c213a0f0116
test/nodeapp  sha256:421155780252a908de9c4968c65a508c65b12b259a6248278b73dd20edee20fb
test/nodeapp  sha256:b26f1d2cac95a73632891a6cfec875baed0b1a4165c381e59e0ce4d1bfc403f9

Deleting images from server ...
IMAGE
sha256:6e82ae61a154788ff70ff3ed69cf3a088845e0c7c2d1441de4123c213a0f0116
sha256:421155780252a908de9c4968c65a508c65b12b259a6248278b73dd20edee20fb
sha256:b26f1d2cac95a73632891a6cfec875baed0b1a4165c381e59e0ce4d1bfc403f9

Check the size of the registry folder:

[root@origin-master nodeapp]# du -chs /data/origin/
608M /data/origin/
608M total

OK, from 1.6 GB we went down to 608 MB!