Open Source Identity Solution for Applications, Services, and APIs

In this post, we will look at the core concepts of Keycloak and its application integration mechanisms. You can integrate anything from frontend and mobile applications to monolithic applications and microservice architectures. Keycloak also makes it easy to export and import configuration and gives you a single view to manage everything.
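As an example of that export/import flexibility, here is a minimal sketch using the WildFly-based distribution; the realm name and file path are placeholders:

# Export a single realm to a JSON file...
bin/standalone.sh -Dkeycloak.migration.action=export \
  -Dkeycloak.migration.provider=singleFile \
  -Dkeycloak.migration.realmName=myrealm \
  -Dkeycloak.migration.file=/tmp/myrealm-export.json

# ...and import it into another Keycloak instance.
bin/standalone.sh -Dkeycloak.migration.action=import \
  -Dkeycloak.migration.provider=singleFile \
  -Dkeycloak.migration.file=/tmp/myrealm-export.json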

Why should I use Keycloak?

  • Reliable Solution

“Red Hat running on Red Hat products (Red Hat SSO)”: the entire authentication/authorization system is based on Red Hat SSO, which is the downstream version of the upstream Keycloak project. …


This post walks through a step-by-step configuration of the Strimzi operator (Apache Kafka) on OpenShift: exposing an external listener as an OpenShift route over TLS and securing the Kafka cluster with Keycloak using SASL OAUTHBEARER.

If you don’t want to do all of this configuration yourself, an easier option is OpenShift Streams for Apache Kafka: https://www.redhat.com/en/blog/introducing-red-hat-openshift-streams-apache-kafka

Let’s begin the journey of securing Kafka-kittens. If you aren’t familiar with SASL for OAuth, here is the reference:

SASL OAuth Bearer:

https://datatracker.ietf.org/doc/html/rfc7628 [Simple Authentication and Security Layer (SASL) Mechanisms for OAuth]

OAuth 2.0 Protocol Flow

Overview

We will run Apache Kafka deployed with the Strimzi operator and Keycloak deployed with the Keycloak operator on OpenShift. …
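To show roughly where this ends up, here is a minimal sketch of a Kafka custom resource whose external route listener is secured with OAuth against Keycloak; the cluster name, realm, and Keycloak hostname are placeholders:

cat <<'EOF' | oc apply -n kafka -f -
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    replicas: 3
    listeners:
      # External listener exposed as an OpenShift route over TLS,
      # authenticated against Keycloak via SASL OAUTHBEARER.
      - name: external
        port: 9094
        type: route
        tls: true
        authentication:
          type: oauth
          validIssuerUri: https://<keycloak-route>/auth/realms/kafka
          jwksEndpointUri: https://<keycloak-route>/auth/realms/kafka/protocol/openid-connect/certs
          userNameClaim: preferred_username
    storage:
      type: ephemeral
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
EOF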


In this post, we will go over how to configure a PostgreSQL database with Keycloak. The standard Keycloak distribution ships with an embedded H2 database. The process discussed here can also be used for other supported databases such as MySQL.

Prerequisites

GitHub repository: keycloak-integrations

git clone https://github.com/akoserwal/keycloak-integrations.git

Setup

Postgres Module Configuration

For the EAP/WildFly distribution, modules follow the Java convention for package structure, like a reverse domain name.

Let’s name the module com.postgres; the folder structure would look like this:

com
│   ├── postgres
│   │   └── main
│   │       ├── module.xml
│…
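A rough sketch of creating that layout in a WildFly-based Keycloak distribution and dropping in the JDBC driver; the Keycloak home, module base path, and driver version below are assumptions:

# Assumed paths and driver version; adjust for your installation.
export KEYCLOAK_HOME=/opt/keycloak
MODULE_DIR=$KEYCLOAK_HOME/modules/system/layers/keycloak/com/postgres/main
mkdir -p $MODULE_DIR
cp postgresql-42.2.5.jar $MODULE_DIR/

# module.xml declares the module name (com.postgres) and points at the driver jar.
cat > $MODULE_DIR/module.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<module xmlns="urn:jboss:module:1.3" name="com.postgres">
    <resources>
        <resource-root path="postgresql-42.2.5.jar"/>
    </resources>
    <dependencies>
        <module name="javax.api"/>
        <module name="javax.transaction.api"/>
    </dependencies>
</module>
EOF

The datasource definition in standalone.xml (or standalone-ha.xml) then references this module as its JDBC driver.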

In this post, we will see how to use Keycloak as an identity provider for your OpenShift cluster, with OpenShift acting as an identity broker. The reverse is also possible: you can use OpenShift as a provider for Keycloak, but we won’t be covering that scenario in this post. In simple terms, Keycloak users will be able to log in to the OpenShift cluster.

Figure 1.1: Keycloak (identity provider) for the OpenShift cluster

As shown in the flow diagram (Figure 1.1), once you configure the identity provider in the OpenShift instance, a new option appears on the login screen. Using Keycloak (as an OpenID provider), Keycloak users will be able to access the OpenShift cluster…
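For orientation, the OpenShift side of this typically comes down to adding an OpenID identity provider to the cluster OAuth resource. Here is a minimal sketch, assuming a Keycloak client named openshift; the secret value, hostname, and realm are placeholders:

# Store the Keycloak client secret in openshift-config (secret name is an example).
oc create secret generic keycloak-client-secret \
  --from-literal=clientSecret=<client-secret> -n openshift-config

# Register Keycloak as an OpenID identity provider on the cluster OAuth config.
cat <<'EOF' | oc apply -f -
apiVersion: config.openshift.io/v1
kind: OAuth
metadata:
  name: cluster
spec:
  identityProviders:
    - name: keycloak
      mappingMethod: claim
      type: OpenID
      openID:
        clientID: openshift
        clientSecret:
          name: keycloak-client-secret
        claims:
          preferredUsername:
            - preferred_username
        issuer: https://<keycloak-route>/auth/realms/<realm>
EOF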


In this post, we will walk through the process of designing an operator for your own product, hands-on. It is aimed at people who are familiar with the operator pattern and may have tried operator-sdk/kubebuilder. If you aren’t, please read my other post: Operator Pattern: Kubernetes/Openshift.

The main challenge for people at the beginner/intermediate level is that they have tried the getting-started example, yet writing a working operator for their own product is still a learning curve. After going through the workshop scenario, you will be able to achieve that goal.

Let’s break down the process of…
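As a reference point, the scaffolding that the rest of the process builds on usually starts with operator-sdk. A minimal sketch, where the domain, repository, group, and kind are placeholders for your own product:

# Scaffold a new Go-based operator project (names below are examples).
operator-sdk init --domain example.com --repo github.com/example/myproduct-operator
# Generate an API (custom resource) and its controller skeleton.
operator-sdk create api --group apps --version v1alpha1 --kind MyProduct --resource --controller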


In this post, we will do a step-by-step configuration of the Strimzi operator and use OpenShift routes as an external listener with SASL_SSL security.

Setup

Installation:

OLM: One-click install

Locally

git clone https://github.com/strimzi/strimzi-kafka-operator.git
cd strimzi-kafka-operator

Update the namespace in which you want to deploy the operator:

sed -i '' 's/namespace: .*/namespace: kafka/' install/cluster-operator/*RoleBinding*.yaml

Log in and create the namespace/project

oc login <cluster>
oc new-project <projectname>  # kafka as the project name

Create the Cluster Operator in the kafka namespace

kubectl apply -f install/cluster-operator -n kafka

Check the status of the operator pod

oc get pods -n kafka

Creating Kafka Cluster

Whether you have used…
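For reference, here is a minimal sketch of the Kafka custom resource used in this kind of setup, with the external route listener secured by SCRAM-SHA-512 over TLS (which clients see as SASL_SSL); the cluster name and storage settings are placeholders:

cat <<'EOF' | oc apply -n kafka -f -
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    replicas: 3
    listeners:
      # Routes always terminate TLS; SCRAM-SHA-512 on top gives clients SASL_SSL.
      - name: external
        port: 9094
        type: route
        tls: true
        authentication:
          type: scram-sha-512
    storage:
      type: ephemeral
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    # The User Operator manages KafkaUser resources (SCRAM credentials).
    userOperator: {}
EOF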


In this post, we will understand the concept of using Keycloak as an identity broker and as an identity provider. Let’s go over the basic flow before moving ahead.

Keycloak as an Identity Broker & as an Identity Provider
  • Identity Provider: an application or system that manages identity information and allows you to create and manage identities.

E.g., you can read another post that explains using GitHub as a social identity provider: Github as Identity Provider in Keycloak

  • Identity Broker: an intermediary service that connects you to identity providers.

E.g., the broker lets you authenticate or authorize using an identity provider and then use the resources linked with the broker. …


In this post, we will build a basic SPI (Service Provider Interface) event listener and publish the events to Kafka. Keycloak SPIs allow us to add or customize built-in functionality. An understanding of Kafka is required only for the part where we publish the events; if you just want to understand how to build an SPI, you can skip the Kafka part.

Setup

Code Repo & Build Steps

git clone https://github.com/akoserwal/keycloak-integrations.git
cd keycloak-spi-kafka/
# build
mvn clean install
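
Once the build succeeds, a rough sketch of deploying the jar to a WildFly-based Keycloak; the jar name and KEYCLOAK_HOME are assumptions:

# Copy the built SPI jar into the hot-deployment directory of a WildFly-based Keycloak.
cp target/keycloak-spi-kafka-*.jar $KEYCLOAK_HOME/standalone/deployments/
# Then enable the listener in the admin console under
# Realm Settings -> Events -> Config -> Event Listeners.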

WARNING: Deploying an SPI jar that contains an issue/exception can cause your Keycloak server to fail at startup. Also, if you register the SPI…

Abhishek Koserwal

#redhatter #opensource #developer #kubernetes #keycloak #golang #openshift #quarkus #spring
