Make the Kafka rules more generic. #1818

@manalibhutiyani

Description

Currently our Kafka policy rules are low-level and tied closely to the Kafka protocol, which makes them difficult and unnecessary for users to work with. Instead of requiring users to whitelist apiKey values and everything related to the produce/fetch keys, we should offer a more generic "produce topic" / "consume topic" rule and leave the Kafka protocol key handling to the cilium daemon.

Parent : #2594

Proposed structure:

// PortRuleKafka is a list of Kafka protocol constraints. All fields are
// optional, if all fields are empty or missing, the rule will match all
// Kafka messages.
type PortRuleKafka struct {
	// APIVersion is the version matched against the api version of the
	// Kafka message. If set, it has to be a string representing a positive
	// integer.
	//
	// If omitted or empty, all versions are allowed.
	//
	// +optional
	APIVersion string `json:"apiVersion,omitempty"`

	// APIKey is a case-insensitive string matched against the key of a
	// request, e.g. "produce", "fetch", "createtopic", "deletetopic", etc.
	// Reference: https://kafka.apache.org/protocol#protocol_api_keys
	//
	// If omitted or empty, all keys are allowed.
	//
	// +optional
	APIKey string `json:"apiKey,omitempty"`

	// ClientID is the client identifier as provided in the request.
	//
	// From Kafka protocol documentation:
	// This is a user supplied identifier for the client application. The
	// user can use any identifier they like and it will be used when
	// logging errors, monitoring aggregates, etc. For example, one might
	// want to monitor not just the requests per second overall, but the
	// number coming from each client application (each of which could
	// reside on multiple servers). This id acts as a logical grouping
	// across all requests from a particular client.
	//
	// If omitted or empty, all client identifiers are allowed.
	//
	// +optional
	ClientID string `json:"clientID,omitempty"`

	// Topic is the topic name contained in the message. If a Kafka request
	// contains multiple topics, then all topics must be allowed or the
	// message will be rejected.
	//
	// This constraint is ignored if the matched request message type
	// doesn't contain any topic. The maximum size of Topic is 249
	// characters as per the recent Kafka spec, and the allowed characters
	// are a-z, A-Z, 0-9, -, . and _
	// Older Kafka versions allowed topic names of up to 255 characters;
	// in Kafka 0.10 the limit was changed from 255 to 249. For
	// compatibility with older versions we still accept up to 255.
	//
	// If omitted or empty, all topics are allowed.
	//
	// +optional
	Topic string `json:"topic,omitempty"`

	// Role is a case-insensitive string and describes a group of API keys
	// necessary to perform certain higher-level Kafka operations such as
	// "produce" or "consume". A Role automatically expands into all APIKeys
	// required to perform the specified higher-level operation.
	//
	// The following values are supported:
	//  - "produce": Allow producing to the topics specified in the rule
	//  - "consume": Allow consuming from the topics specified in the rule
	//
	// This field is incompatible with the APIKey field; only one of APIKey
	// or Role may be specified.
	//
	// If omitted or empty, the field has no effect and the logic of the
	// APIKey field applies.
	//
	// +optional
	Role string `json:"role,omitempty"`

	// --------------------------------------------------------------------
	// Private fields. These fields are used internally and are not exposed
	// via the API.

	// apiKeyInt is the integer representation of the allowed API keys:
	// a single element when APIKey is set, or the expansion of Role
	apiKeyInt KafkaRole

	// apiVersionInt is the integer representation of APIVersion
	apiVersionInt *int16
}
type KafkaRole []int16
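
With this structure, a policy author could express a high-level rule purely in terms of Role and Topic, without knowing any protocol API keys. A sketch of what such a rule fragment might look like, using the json tags from the struct above (the topic name is made up for illustration, and a lowercase `role` tag is assumed):

```json
{
  "kafka": [
    {
      "role": "produce",
      "topic": "example-topic"
    }
  ]
}
```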

The user can specify either Role or APIKey, but not both. apiKeyInt, of type KafkaRole, is the private cached value: it holds a single element when APIKey is specified, or the list of allowed elements when Role is specified. These allowed elements are the expansion of all the API keys needed to support produce or consume. The rest of the logic stays the same.
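
The role-to-key expansion described above can be sketched as follows. This is an illustrative assumption, not the daemon's actual implementation: the exact set of API keys backing "produce" and "consume" (and their numeric values, taken from the Kafka protocol's API key table) would be owned by the cilium daemon.

```go
package main

import (
	"fmt"
	"strings"
)

// API key numbers from the Kafka protocol specification
// (https://kafka.apache.org/protocol#protocol_api_keys).
const (
	apiKeyProduce         int16 = 0
	apiKeyFetch           int16 = 1
	apiKeyOffsets         int16 = 2
	apiKeyMetadata        int16 = 3
	apiKeyOffsetCommit    int16 = 8
	apiKeyOffsetFetch     int16 = 9
	apiKeyFindCoordinator int16 = 10
	apiKeyJoinGroup       int16 = 11
	apiKeyHeartbeat       int16 = 12
	apiKeyLeaveGroup      int16 = 13
	apiKeySyncGroup       int16 = 14
)

// KafkaRole is the set of API keys a rule allows.
type KafkaRole []int16

// expandRole maps a high-level role onto the API keys needed to carry
// out that operation. The key sets chosen here are an assumption for
// illustration only.
func expandRole(role string) (KafkaRole, error) {
	switch strings.ToLower(role) {
	case "produce":
		return KafkaRole{apiKeyProduce, apiKeyMetadata}, nil
	case "consume":
		return KafkaRole{apiKeyFetch, apiKeyOffsets, apiKeyMetadata,
			apiKeyOffsetCommit, apiKeyOffsetFetch, apiKeyFindCoordinator,
			apiKeyJoinGroup, apiKeyHeartbeat, apiKeyLeaveGroup,
			apiKeySyncGroup}, nil
	default:
		return nil, fmt.Errorf("unknown role %q", role)
	}
}

// Matches reports whether a request's API key is covered by the rule.
func (r KafkaRole) Matches(apiKey int16) bool {
	for _, k := range r {
		if k == apiKey {
			return true
		}
	}
	return false
}

func main() {
	keys, _ := expandRole("produce")
	fmt.Println(keys.Matches(apiKeyProduce)) // true
	fmt.Println(keys.Matches(apiKeyFetch))   // false
}
```

A rule with APIKey set would instead produce a single-element KafkaRole, so the same Matches logic serves both cases.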

Metadata

Labels

area/proxy: Impacts proxy components, including DNS, Kafka, Envoy and/or XDS servers.
kind/enhancement: This would improve or streamline existing functionality.
priority/high: This is considered vital to an upcoming release.
