Securing Apache Kafka
Jun Rao
Confluent, Inc.
Outline
•  Kafka and security overview
•  Authentication
•  Identify the principal (user) associated with a connection
•  Authorization
•  What permission a principal has
•  Secure Zookeeper
•  Future stuff
What’s Apache Kafka
Distributed, high throughput pub/sub system
Kafka Usage
Security Overview
•  Supported since 0.9.0
•  Wire encryption between client and broker
•  For cross data center mirroring
•  Access control on resources such as topics
•  Enable sharing Kafka clusters
Authentication Overview
•  Brokers support multiple ports
•  plain text (no wire encryption/authentication)
•  SSL (for wire encryption/authentication)
•  SASL (for Kerberos authentication)
•  SSL + SASL (SSL for wire encryption, SASL for authentication)
•  Clients choose which port to use
•  Need to provide the required credentials through configs
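As a sketch, a broker that exposes several of these ports at once, and a client that picks one of them, might be configured as follows (host names and port numbers are illustrative assumptions):

```properties
# Broker: one listener per supported security protocol (ports are examples)
listeners=PLAINTEXT://host.name:9092,SSL://host.name:9093,SASL_SSL://host.name:9094

# Client: selects a port implicitly by declaring the matching protocol,
# plus whatever credentials that protocol requires (keystore/truststore, JAAS)
security.protocol=SSL
```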
Why is SSL useful
•  1-way authentication
•  Secure wire transfer through encryption
•  2-way authentication
•  Broker knows the identity of client
•  Easy to get started
•  Involves only the client and server
SSL handshake
Subsequent transfer over SSL
•  Data encrypted with agreed upon cipher suite
•  Encryption overhead
•  Zero-copy transfer is lost in the consumer
Performance impact with SSL
•  r3.xlarge
•  4 cores, 30GB RAM, 80GB SSD, moderate network (~90MB/s)
•  Most overhead from encryption
                       throughput (MB/s)   CPU on client   CPU on broker
producer (plaintext)          83               12%             30%
producer (SSL)                69               28%             48%
consumer (plaintext)          83                8%              2%
consumer (SSL)                69               27%             24%
Preparing SSL
1.  Generate certificate (X509) in broker key store
2.  Generate certificate authority (CA) for signing
3.  Sign broker certificate with CA
4.  Import signed certificate and CA to broker key store
5.  Import CA to client trust store
6.  2-way authentication: generate client certificate in a similar way
Configuring SSL
Client/Broker:
ssl.keystore.location = /var/private/ssl/kafka.server.keystore.jks
ssl.keystore.password = test1234
ssl.key.password = test1234
ssl.truststore.location = /var/private/ssl/kafka.server.truststore.jks
ssl.truststore.password = test1234

Broker:
listeners = SSL://host.name:port
security.inter.broker.protocol = SSL
ssl.client.auth = required

Client:
security.protocol = SSL
•  No client code change; just configuration change.
SSL Principal Name
•  By default, the distinguished name of the certificate
•  CN=host1.company.com,OU=organization unit,O=organization,L=location,ST=state,C=country
•  Can be customized through principal.builder.class
•  Has access to X509Certificate
•  Makes it convenient to set the broker and application principal names
What is SASL
•  Simple Authentication and Security Layer
•  Challenge/response protocols
•  Server issues challenge and client sends response
•  Continue until server is satisfied
•  Different mechanisms
•  Plain: cleartext username/password
•  Digest MD5
•  GSSAPI: Kerberos
•  Kafka 0.9.0 only supports Kerberos
Why Kerberos
•  Secure single sign-on
•  An organization may provide multiple services
•  Users remember just a single Kerberos password to use all services
•  More convenient when there are many users
•  Need Key Distribution Center (KDC)
•  Each service/user needs a Kerberos principal in the KDC
How Kerberos Works
•  Create service and client principals in the KDC
•  Client authenticates with the AS (Authentication Server) on startup
•  Client obtains a service ticket from the TGS (Ticket Granting Server)
•  Client authenticates with the service using the service ticket
SASL handshake
Client ↔ Broker:
1. Client opens a connection to the broker
2. Broker sends the list of supported mechanisms
3. Client sends the selected mechanism and SASL data
4. Broker evaluates and responds
5. SASL data exchange continues until the client is authenticated
Data transfer
•  SASL_PLAINTEXT
•  No wire encryption
•  SASL_SSL
•  Wire encryption over SSL
Preparing Kerberos
•  Create Kafka service principal in KDC
•  Create a keytab for the Kafka principal
•  Keytab includes the principal and the encrypted Kerberos password
•  Allows authentication without typing a password
•  Create an application principal for the client in the KDC
•  Create a keytab for the application principal
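With an MIT Kerberos KDC, these steps might look like the session below; it must run on the KDC host, and the realm, host names and keytab paths are illustrative assumptions:

```shell
# Service principal and keytab for the broker
kadmin.local -q "addprinc -randkey kafka/kafka1.hostname.com@EXAMPLE.COM"
kadmin.local -q "ktadd -k /etc/security/keytabs/kafka_server.keytab kafka/kafka1.hostname.com@EXAMPLE.COM"

# Application principal and keytab for the client
kadmin.local -q "addprinc -randkey kafka-client-1@EXAMPLE.COM"
kadmin.local -q "ktadd -k /etc/security/keytabs/kafka_client.keytab kafka-client-1@EXAMPLE.COM"
```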
Configuring Kerberos
Broker JAAS file:
KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka_server.keytab"
  principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
};

Broker JVM:
-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf

Broker config:
security.inter.broker.protocol=SASL_PLAINTEXT (or SASL_SSL)
sasl.kerberos.service.name=kafka
•  No client code change; just configuration change.
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka_client.keytab"
  principal="kafka-client-1@EXAMPLE.COM";
};

Client JAAS file (as above)

Client JVM:
-Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf

Client config:
security.protocol=SASL_PLAINTEXT (or SASL_SSL)
sasl.kerberos.service.name=kafka
Kerberos principal name
•  Kerberos principal
•  Primary[/Instance]@REALM
•  kafka/kafka1.hostname.com@EXAMPLE.COM
•  kafka-client-1@EXAMPLE.COM
•  Primary extracted as the default principal name
•  Can customize principal name through sasl.kerberos.principal.to.local.rules
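For example, rules that map any principal in the local realm to just its primary, whether or not it has an instance part, could look like the following sketch (the realm name is an assumption):

```properties
# [1:...] matches principals without an instance (kafka-client-1@EXAMPLE.COM),
# [2:...] matches principals with one (kafka/kafka1.hostname.com@EXAMPLE.COM);
# both strip the realm, and DEFAULT handles everything else
sasl.kerberos.principal.to.local.rules=RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//,RULE:[2:$1@$0](.*@EXAMPLE.COM)s/@.*//,DEFAULT
```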
Authentication Caveat
•  Authentication (SSL or SASL) happens once during socket connection
•  No re-authentication
•  If a certificate needs to be revoked, use authorization to remove permission
Authorization
•  Control which permission each authenticated principal has
•  Pluggable with a default implementation
ACL
Principal   Permission   Operation   Resource   Host
Alice       Allow        Read       Topic:T1    Host1

Alice is allowed to Read from topic T1 from Host1
Operations and Resources
•  Operations
•  Read, Write, Create, Describe, ClusterAction, All
•  Resources
•  Topic, Cluster and ConsumerGroup
Operations                                                             Resource
Read, Write, Describe (Read, Write implies Describe)                   Topic
Read                                                                   ConsumerGroup
Create, ClusterAction (communication between controller and brokers)   Cluster
SimpleAclAuthorizer
•  Out-of-the-box authorizer implementation
•  CLI tool for adding/removing ACLs
•  ACLs stored in zookeeper and propagated to brokers
asynchronously
•  ACL cache in broker for better performance.
Authorizer Flow (Client, Broker, Authorizer, Zookeeper):
1. On configure, the Authorizer reads the ACLs from Zookeeper and loads them into a cache
2. The client sends a request to the broker
3. The broker asks the Authorizer to authorize the request
4. The Authorizer checks the cache for an ACL match (or a super user) and returns Allowed/Denied
Configure broker ACL
•  authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
•  Make the Kafka principal a super user
•  Or grant ClusterAction and Read on all topics to the Kafka principal
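A minimal broker authorizer configuration along these lines might be (the principal name is an assumption; with SSL authentication the principal would be the certificate's full distinguished name):

```properties
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
# Super users bypass ACL checks; needed so brokers can talk to each other
super.users=User:kafka
```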
Configure client ACL
•  Producer
•  Grant Write on topic, Create on cluster (auto creation)
•  Or use --producer option in CLI
bin/kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Bob --producer --topic t1
•  Consumer
•  Grant Read on topic, Read on consumer group
•  Or use --consumer option in CLI
bin/kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Bob --consumer --topic t1 --group group1
Secure Zookeeper
•  Zookeeper stores
•  critical Kafka metadata
•  ACLs
•  Need to prevent untrusted users from modifying
Zookeeper Security Integration
•  ZK supports authentication through SASL
•  Kerberos or Digest MD5
•  Set zookeeper.set.acl to true on every broker
•  Configure ZK user through JAAS config file
•  Each ZK path writable by creator, readable by all
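The "Configure ZK user through JAAS" step means adding a Client section to the broker JAAS file, which authenticates the broker to Zookeeper; a sketch, reusing the keytab and principal from the earlier broker example:

```conf
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka_server.keytab"
  principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
};
```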
Migrating from non-secure to secure Kafka
•  Configure brokers with multiple ports
•  listeners=PLAINTEXT://host.name:port,SSL://host.name:port
•  Gradually migrate clients to secure port
•  When done
•  Turn off PLAINTEXT port on brokers
Migrating from non-secure to secure Zookeeper
•  http://kafka.apache.org/documentation.html#zk_authz_migration
Future work
•  More SASL mechanisms: PLAIN (cleartext password), Digest-MD5
•  Performance improvement
•  Integrate with admin api
Thank you
Jun Rao | jun@confluent.io | @junrao
Meet Confluent in booth
Confluent University ~ Kafka training ~ confluent.io/training
Download Apache Kafka & Confluent Platform: confluent.io/download
