Use Python to remotely connect to a k8s pod and send commands on that pod
Question:
I have a scenario where, if I want to run a Postgres query or check a Kafka topic, my only option is to connect to a pod that has those privileges and from there either run the relevant Kafka command or, in the case of a DB query, connect and then run the query.
Is there a 'somewhat' elegant way of wrapping these fairly simple commands in a Python app? It would be something akin to piping several commands together, but I'm not sure how to make this work, as my attempts so far have failed.
My use cases are similar to:
- kubectl exec -it pod bash
- run kafka command
or
- kubectl exec -it pod bash – same pod as above
- connect to postgres db
- run query
Thanks in advance for any help; a pointer in the right direction would be fantastic, as I'm relatively new to Python.
Answers:
Brainstorming Idea: You can wrap the commands in one Python file and run it from any machine where kubectl is configured against the cluster (it does not need to be the control-plane node):

import subprocess

def run_kafka_command(pod_name, kafka_command):
    # Run an arbitrary shell command inside the pod. Note there is no
    # -t flag: allocating a TTY fails when the script itself is not
    # attached to a terminal.
    command = ["kubectl", "exec", pod_name, "--", "bash", "-c", kafka_command]
    result = subprocess.run(command, capture_output=True, text=True)
    return result.stdout

def run_postgres_query(pod_name, database, query):
    # Execute a single SQL statement via psql inside the pod.
    command = ["kubectl", "exec", pod_name, "--", "psql", "-d", database, "-c", query]
    result = subprocess.run(command, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    pod_name = "your_pod_name"
    kafka_command = "your_kafka_command"
    database = "your_database"
    query = "your_query"

    kafka_output = run_kafka_command(pod_name, kafka_command)
    print("Kafka Command Output:")
    print(kafka_output)

    postgres_output = run_postgres_query(pod_name, database, query)
    print("Postgres Query Output:")
    print(postgres_output)
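One pitfall with building the kubectl command as an interpolated shell string and shell=True is quoting: a query that itself contains quotes breaks the command. Building an argument list instead sidesteps this; a minimal sketch (pod and database names are placeholder assumptions):

```python
def build_exec_argv(pod_name, container_command):
    # argv for: kubectl exec <pod> -- bash -c '<command>'
    # As a list element, the command reaches kubectl as a single
    # argument, so embedded quotes and spaces need no local escaping.
    return ["kubectl", "exec", pod_name, "--", "bash", "-c", container_command]

# A query with embedded single quotes, passed through untouched:
query = "SELECT 'ok';"
argv = build_exec_argv("your_pod_name", f'psql -d your_database -c "{query}"')
# subprocess.run(argv, capture_output=True, text=True) would execute it.
```

If you later want to drop the kubectl dependency entirely, the official kubernetes Python client exposes the same capability in-process via kubernetes.stream.stream and CoreV1Api.connect_get_namespaced_pod_exec.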