Confluent

The document outlines a series of commands and configurations for managing Kafka and Confluent Cloud resources, including creating API keys, enabling schema registries, and configuring connectors. It also includes personal background information about an individual named Happy Wandja Paul, detailing their education and work experience in data engineering and related fields. The document emphasizes the importance of tools like Kafka, Airflow, and Spark in data management and engineering roles.

sudo service mysql stop

yald@yald:~/Documents/kafka_learn/learn-kafka-courses$ confluent api-key create --resource cloud
It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+---------+------------------------------------------------------------------+
| API Key | RWJYFDGBJQDIZQQ2                                                 |
| Secret  | IzHYuCsXCJcVsHCgd+E99GNS63mjxCjaEQJ9LoIQoPfO4aD3g4ukkvy/QREYLc28 |
+---------+------------------------------------------------------------------+

yald@yald:~/Documents/kafka_learn/learn-kafka-courses$ confluent api-key create --resource lkc-vrn9j0
It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+---------+------------------------------------------------------------------+
| API Key | PKBQZVWJ5AHTSTES                                                 |
| Secret  | 1E71n5TJQcn7bdwH3O+93LSBoLJ+HRzpJtJ3JamImQKLpm0x2jE9Gxk0vGvEL+mF |
+---------+------------------------------------------------------------------+
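
Note (my addition, not in the original notes): the REST calls further down authenticate with $CLUSTER_AUTH64, and the MySQL sink config references $CLUSTER_KEY and $CLUSTER_SECRET, none of which are defined anywhere in this file. A minimal sketch, assuming the Authorization header is plain HTTP Basic auth over key:secret:

# Assumption: $CLUSTER_AUTH64 is just base64("key:secret"); tr strips the line wrap
CLUSTER_KEY=PKBQZVWJ5AHTSTES
CLUSTER_SECRET=1E71n5TJQcn7bdwH3O+93LSBoLJ+HRzpJtJ3JamImQKLpm0x2jE9Gxk0vGvEL+mF
CLUSTER_AUTH64=$(echo -n "$CLUSTER_KEY:$CLUSTER_SECRET" | base64 | tr -d '\n')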

confluent schema-registry cluster enable --cloud AZURE --geo eu
+--------------+-----------------------------------------------------+
| Id           | lsrc-8wv72m                                         |
| Endpoint URL | https://psrc-d88yz.westeurope.azure.confluent.cloud |
+--------------+-----------------------------------------------------+

confluent api-key create --resource lsrc-8wv72m
It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+---------+------------------------------------------------------------------+
| API Key | 74M47DWNL7UQUL4E                                                 |
| Secret  | oTifGHPNFPrP2c5hBRkl9iCfwb/9sZdsgGqGLpPwIUmSHETeVUP9ewz2TJUB5ymg |
+---------+------------------------------------------------------------------+
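
A quick sanity check for the Schema Registry key (my addition; GET /subjects is the standard Schema Registry listing endpoint):

curl -s -u '74M47DWNL7UQUL4E:oTifGHPNFPrP2c5hBRkl9iCfwb/9sZdsgGqGLpPwIUmSHETeVUP9ewz2TJUB5ymg' \
https://psrc-d88yz.westeurope.azure.confluent.cloud/subjects | jq '.'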

# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers=pkc-4nmjv.francecentral.azure.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='PKBQZVWJ5AHTSTES' password='1E71n5TJQcn7bdwH3O+93LSBoLJ+HRzpJtJ3JamImQKLpm0x2jE9Gxk0vGvEL+mF';
sasl.mechanism=PLAIN
# Required for correctness in Apache Kafka clients prior to 2.6
client.dns.lookup=use_all_dns_ips

# Best practice for higher availability in Apache Kafka clients prior to 3.0
session.timeout.ms=45000

# Best practice for Kafka producer to prevent data loss
acks=all

# Required connection configs for Confluent Cloud Schema Registry
schema.registry.url=https://psrc-d88yz.westeurope.azure.confluent.cloud
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=74M47DWNL7UQUL4E:oTifGHPNFPrP2c5hBRkl9iCfwb/9sZdsgGqGLpPwIUmSHETeVUP9ewz2TJUB5ymg
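
If the block above is saved as client.properties, it can be smoke-tested with the stock console clients from the Confluent Platform distribution (a sketch; the file name is my choice):

# Produce a test record to the transactions topic using the config above
kafka-console-producer --bootstrap-server pkc-4nmjv.francecentral.azure.confluent.cloud:9092 \
--topic transactions --producer.config client.properties
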
# NB: the v3 path expects a Kafka cluster ID (lkc-...); env-57kgoz here looks like
# an environment ID carried over by mistake (compare the corrected call below)
curl --request POST \
--url 'https://pkc-6ojv2.us-west4.gcp.confluent.cloud:443/kafka/v3/clusters/env-57kgoz/topics' \
--header 'Authorization: Basic '$CLUSTER_AUTH64'' \
--header 'content-type: application/json' \
--data '{
"topic_name": "transactions",
"partitions_count": 6,
"replication_factor": 3 }'

curl --request POST \
--url 'https://pkc-4nmjv.francecentral.azure.confluent.cloud:443/kafka/v3/clusters/lkc-vrn9j0/topics' \
--header 'Authorization: Basic '$CLUSTER_AUTH64'' \
--header 'content-type: application/json' \
--data '{
"topic_name": "transactions",
"partitions_count": 6,
"replication_factor": 3 }'

curl --request GET \
--url 'https://pkc-4nmjv.francecentral.azure.confluent.cloud:443/kafka/v3/clusters/lkc-vrn9j0/topics' \
--header 'Authorization: Basic '$CLUSTER_AUTH64'' | jq '.'
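
The same listing is available through the confluent CLI (an equivalent check, not in the original):

confluent kafka topic list --cluster lkc-vrn9j0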

curl --request GET \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connector-plugins' \
--header 'Authorization: Basic '$CLOUD_AUTH64'' | jq '.'

curl --request PUT \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connectors/DatagenSourceConnector_2/config' \
--header 'Authorization: Basic '$CLOUD_AUTH64'' \
--header 'content-type: application/json' \
--data '{
"connector.class": "DatagenSource",
"name": "DatagenSourceConnector_2",
"kafka.api.key": "'$CLOUD_KEY'",
"kafka.api.secret": "'$CLOUD_SECRET'",
"kafka.topic": "transactions",
"output.data.format": "AVRO",
"quickstart": "TRANSACTIONS",
"tasks.max": "1"
}'
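
Worth noting (my gloss): a PUT to .../connectors/<name>/config is create-or-update, so it creates DatagenSourceConnector_2 if it does not exist yet and reconfigures it otherwise, which is why no separate create call appears in these notes.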

CLOUD_SECRET=IzHYuCsXCJcVsHCgd+E99GNS63mjxCjaEQJ9LoIQoPfO4aD3g4ukkvy/QREYLc28
CLOUD_KEY=RWJYFDGBJQDIZQQ2
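
These are the cloud-level key and secret created at the top of the notes; the $CLOUD_AUTH64 used in the Connect API calls is presumably derived from them the same way as the cluster sketch above (my assumption):

# Assumption: base64("key:secret") for the HTTP Basic Authorization header
CLOUD_AUTH64=$(echo -n "$CLOUD_KEY:$CLOUD_SECRET" | base64 | tr -d '\n')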

curl --request GET \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connectors/DatagenSourceConnector_2/status' \
--header 'Authorization: Basic '$CLOUD_AUTH64'' | jq '.'

curl --request DELETE \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connectors/DatagenSourceConnector_2' \
--header 'Authorization: Basic '$CLOUD_AUTH64''

curl --request PUT \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connectors/MySqlSinkConnector_2/config' \
--header 'Authorization: Basic '$CLOUD_AUTH64'' \
--header 'content-type: application/json' \
--data '{
"connector.class": "MySqlSink",
"name": "MySqlSinkConnector_2",
"topics": "transactions",
"input.data.format": "AVRO",
"input.key.format": "STRING",
"kafka.api.key": "'$CLUSTER_KEY'",
"kafka.api.secret": "'$CLUSTER_SECRET'",
"connection.host": "ec2-54-175-153-98.compute-1.amazonaws.com",
"connection.port": "3306",
"connection.user": "kc101user",
"connection.password": "kc101pw",
"db.name": "demo",
"ssl.mode": "prefer",
"pk.mode": "none",
"auto.create": "true",
"auto.evolve": "true",
"tasks.max": "1"
}'

curl --request GET \
--url 'https://api.confluent.cloud/connect/v1/environments/env-57kgoz/clusters/lkc-vrn9j0/connectors/MySqlSinkConnector_2/status' \
--header 'Authorization: Basic '$CLOUD_AUTH64'' | jq '.'

mysql -u admin -h mysql01.chmkgghlbric.eu-west-3.rds.amazonaws.com -pyXXqNr3xECF78Ft

# Load the sample customers data (the original line lost the download command in
# front of the URL; curl -s is my reconstruction)
curl -s https://raw.githubusercontent.com/confluentinc/learn-kafka-courses/main/data-pipelines/customers.sql | \
mysql -u admin -h mysql01.chmkgghlbric.eu-west-3.rds.amazonaws.com -pyXXqNr3xECF78Ft
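
To confirm the load worked (my addition; the database name demo is an assumption based on the connector config above):

mysql -u admin -h mysql01.chmkgghlbric.eu-west-3.rds.amazonaws.com -pyXXqNr3xECF78Ft \
-e 'SHOW TABLES IN demo;'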

Hello, my name is Happy Wandja Paul.

Engineering degree from Efrei Paris, master's in big data. The subjects were: machine learning, deep learning, data engineering, distributed computing, data analysis.

During this program I did two data engineering internships. The first was five months at Numberly (a digital marketing company), where I was on the data engineering team in charge of the data lake: ingestion, extraction, building pipelines, putting models into production. Then an internship as a big data consultant at Avanade (a consulting firm that is a joint venture between Accenture and Microsoft), working on subjects such as forecasting COVID data, parallelizing stored procedures with Spark, and an e-commerce product recommendation model.

These first experiences guided my career choice toward data engineering, hence my first position as a data engineer at OVHcloud (a cloud provider). The projects were about migrating our pipelines; extraction, enrichment, and structuring of the data; and maintaining the replication workflows. One of the challenging projects was the migration itself: the Airflow workflows, which were mainly Spark jobs, had to be moved to a more standardized environment, with CI/CD added and new governance rules.

I come out of these experiences with excellent knowledge of tools such as Airflow, Hadoop, NoSQL databases, Spark (Scala, Python), and Bash/shell.
After that I had an SRE experience at Criteo, where we had to ensure the reliability of the platform (the applications in our scope: the recommendation application, catalog creation (Spark, Kafka), event capture) through:
- defining and setting up the alerting system (Prometheus, Grafana), plus documentation
- improving our services
- benchmarking the recommendation system
- automation tooling: scraping configs
A broad field of technologies I worked across (budget reduction); yes, building up data expertise.

Practice: I think I answered some parts well; plenty of areas for improvement, Kafka.
Open data, paid data, building trading strategies; very much centered around tech.

A single data team? If not, what are the scopes?


Data lakehouse team, 3 people (senior).
All three work on setting up the CI/CD, the data flows, the target technologies, with clients all around.

Curious, excellent, expertise, keep learning, challenge, space to grow,


Free to learn.
