Conformance Testing
Scenario: Conformance testing
Conformance testing enables you to validate a version of your app/ML model using service-level objectives (SLOs). In this tutorial, you will:
- Perform conformance testing.
- Specify latency- and error-rate-based SLOs. If your version satisfies the SLOs, Iter8 will declare it the winner.
Before you begin...
This tutorial is available for the following K8s stacks: Istio and Knative.
Please choose the same K8s stack consistently throughout this tutorial. If you wish to switch K8s stacks between tutorials, start from a clean K8s cluster so that your cluster is correctly set up.
Steps 1, 2, and 3
Please follow steps 1, 2, and 3 of the quick start tutorial.
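If you need a reminder of what those steps set up, a rough sketch follows, assuming a local Kind cluster and a clone of the Iter8 repository; the exact commands, including the Iter8 install itself, are given in the quick start tutorial.
# Rough sketch of quick start steps 1-3 (see the quick start for the exact commands)
kind create cluster                                    # step 1: a local K8s cluster (Kind is one option)
git clone https://github.com/iter8-tools/iter8.git     # step 2: get the Iter8 samples used below
export ITER8=$(pwd)/iter8                              # the $ITER8 paths in this tutorial resolve against this clone
# step 3: install Iter8 and the Istio or Knative stack as described in the quick start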
4. Create app/ML model version
Deploy the bookinfo app (Istio):
kubectl apply -n bookinfo-iter8 -f $ITER8/samples/istio/conformance/bookinfo-app.yaml
Look inside productpage-v1 defined in bookinfo-app.yaml
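Before moving on, you can optionally confirm that the bookinfo pods and services are up; this quick check is not part of the original tutorial.
kubectl get pods,svc -n bookinfo-iter8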
Deploy the sample app (Knative):
kubectl apply -f $ITER8/samples/knative/conformance/baseline.yaml
Look inside baseline.yaml
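The listing is not reproduced here. As a rough sketch, a Knative Service consistent with the names used later in this tutorial (the sample-app service and its sample-app-v1 revision in the default namespace) might look like the following; the image and exact fields are illustrative assumptions, and the actual baseline.yaml ships with the Iter8 repo.
# Illustrative sketch only; see baseline.yaml in the Iter8 repo for the real manifest
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: sample-app
  namespace: default
spec:
  template:
    metadata:
      name: sample-app-v1           # revision name referenced by the experiment below
    spec:
      containers:
      - image: gcr.io/knative-samples/knative-route-demo:blue   # assumed sample image
        env:
        - name: T_VERSION
          value: "blue"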
5. Generate requests
For Istio, generate requests using Fortio as follows.
kubectl wait -n bookinfo-iter8 --for=condition=Ready pods --all
# URL_VALUE is the URL of the `bookinfo` application
URL_VALUE="http://$(kubectl -n istio-system get svc istio-ingressgateway -o jsonpath='{.spec.clusterIP}'):80/productpage"
sed "s+URL_VALUE+${URL_VALUE}+g" $ITER8/samples/istio/quickstart/fortio.yaml | kubectl apply -f -
Look inside fortio.yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: fortio
spec:
  template:
    spec:
      volumes:
      - name: shared
        emptyDir: {}
      containers:
      - name: fortio
        image: fortio/fortio
        command: [ 'fortio', 'load', '-t', '6000s', '-qps', "16", '-json', '/shared/fortiooutput.json', '-H', 'Host: bookinfo.example.com', "$(URL)" ]
        env:
        - name: URL
          value: URL_VALUE
        volumeMounts:
        - name: shared
          mountPath: /shared
      - name: busybox
        image: busybox:1.28
        command: ['sh', '-c', 'echo busybox is running! && sleep 6000']
        volumeMounts:
        - name: shared
          mountPath: /shared
      restartPolicy: Never
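To verify that Fortio is actually sending traffic, an optional check such as the following can help; it is not part of the original tutorial, and the job and container names come from the manifest above.
# follow the Fortio container logs to confirm the load is running
kubectl logs -f job/fortio -c fortio
# inspect the JSON summary Fortio writes to the shared volume
kubectl exec job/fortio -c busybox -- cat /shared/fortiooutput.json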
For Knative, generation of requests is handled automatically by the Iter8 experiment (see the metrics/collect task in the experiment below).
6. Define metrics
For Istio, please follow step 6 of the quick start tutorial.
For Knative, metrics collection is handled automatically by the Iter8 experiment.
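For orientation, the metrics referenced below (such as iter8-istio/mean-latency) are Metric resources in the iter8-istio or iter8-system namespace. A rough sketch of what one of these can look like, assuming a Prometheus backend, is shown here; the field values and the PromQL query are illustrative assumptions, and the actual definitions come from step 6 of the quick start tutorial.
# Illustrative sketch of an Iter8 metric; not the shipped definition
apiVersion: iter8.tools/v2alpha2
kind: Metric
metadata:
  name: mean-latency
  namespace: iter8-istio
spec:
  description: Mean latency
  units: milliseconds
  type: Gauge
  sampleSize: request-count                    # assumed: the metric used as sample size
  provider: prometheus
  jqExpression: ".data.result[0].value[1] | tonumber"
  urlTemplate: http://prometheus.istio-system:9090/api/v1/query   # assumed Prometheus endpoint
  params:
  - name: query   # the fixed 30s window is purely for illustration; Iter8 normally templates it
    value: >
      sum(increase(istio_request_duration_milliseconds_sum{reporter='source'}[30s]))
      /
      sum(increase(istio_requests_total{reporter='source'}[30s]))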
7. Launch experiment
Launch the Iter8 experiment that orchestrates conformance testing for the app/ML model in this tutorial. For Istio:
kubectl apply -f $ITER8/samples/istio/conformance/experiment.yaml
Look inside experiment.yaml
apiVersion: iter8.tools/v2alpha2
kind: Experiment
metadata:
  name: conformance-exp
spec:
  # target identifies the service under experimentation using its fully qualified name
  target: bookinfo-iter8/productpage
  strategy:
    # this experiment will perform a Conformance test
    testingPattern: Conformance
  criteria:
    objectives: # used for validating versions
    - metric: iter8-istio/mean-latency
      upperLimit: 100
    - metric: iter8-istio/error-rate
      upperLimit: "0.01"
    requestCount: iter8-istio/request-count
  duration: # product of fields determines length of the experiment
    intervalSeconds: 10
    iterationsPerLoop: 10
  versionInfo:
    # information about the app versions used in this experiment
    baseline:
      name: productpage-v1
      variables:
      - name: namespace # used by final action if this version is the winner
        value: bookinfo-iter8
The process automated by Iter8 during this experiment is depicted below.
For Knative:
kubectl apply -f $ITER8/samples/knative/conformance/experiment.yaml
Look inside experiment.yaml
apiVersion: iter8.tools/v2alpha2
kind: Experiment
metadata:
  name: conformance-exp
spec:
  # target identifies the knative service under experimentation using its fully qualified name
  target: default/sample-app
  strategy:
    # this experiment will perform a conformance test
    testingPattern: Conformance
    actions:
      loop:
      - task: metrics/collect
        with:
          versions:
          - name: sample-app-v1
            url: http://sample-app.default.svc.cluster.local
  criteria:
    objectives:
    - metric: iter8-system/mean-latency
      upperLimit: 50
    - metric: iter8-system/error-count
      upperLimit: 0
  duration:
    maxLoops: 10
    intervalSeconds: 1
    iterationsPerLoop: 1
  versionInfo:
    # information about app versions used in this experiment
    baseline:
      name: sample-app-v1
The process automated by Iter8 during this experiment is depicted below.
8. Observe experiment
Follow step 8 of the quick start tutorial to observe the experiment in real time. Note that the experiment in this tutorial uses a different name from the quick start one; replace the experiment name quickstart-exp with conformance-exp in your commands.
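For reference, the observation loop from the quick start, adapted to this experiment's name, might look like the following; it assumes you installed iter8ctl during the quick start.
# periodically describe the experiment as it progresses
while clear; do
  kubectl get experiment conformance-exp -o yaml | iter8ctl describe -f -
  sleep 8
done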
Understanding what happened
- You created a single version of an app/ML model.
- You generated requests for your app/ML model version.
- You created an Iter8 experiment with the conformance testing pattern. In each iteration, Iter8 observed the latency and error-rate metrics for your application, verified that the version (referred to as baseline in a conformance experiment) satisfied all the SLOs, and identified the baseline as the winner.
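If you want to confirm the outcome from the experiment object itself, a check along these lines can be used; the status field name is taken from Iter8's v2alpha2 API and should be treated as an assumption here.
# overall progress (stage, completed iterations, message)
kubectl get experiment conformance-exp
# the version Iter8 recommends for promotion, i.e., the winner (assumed field name)
kubectl get experiment conformance-exp -o jsonpath='{.status.versionRecommendedForPromotion}'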
9. Cleanup
Istio:
kubectl delete -f $ITER8/samples/istio/quickstart/fortio.yaml
kubectl delete -f $ITER8/samples/istio/conformance/experiment.yaml
kubectl delete ns bookinfo-iter8
Knative:
kubectl delete -f $ITER8/samples/knative/conformance/experiment.yaml
kubectl delete -f $ITER8/samples/knative/conformance/baseline.yaml