I want to list all the tables in every database in Azure Databricks,
so I want the output to look somewhat like this:
Database | Table_name
Database1 | Table_1
Database1 | Table_2
Database1 | Table_3
Database2 | Table_1
etc..
This is what I have at the moment:
from pyspark.sql.types import *

DatabaseDF = spark.sql(f"show databases")
df = spark.sql(f"show Tables FROM {DatabaseDF}")
#df = df.select("databaseName")
#list = [x["databaseName"] for x in df.collect()]
print(DatabaseDF)
display(DatabaseDF)

df = spark.sql(f"show Tables FROM {schemaName}")
df = df.select("TableName")
list = [x["TableName"] for x in df.collect()]

## Iterate through list of schema
for x in list:
    ### INPUT Required: Change for target table
    tempTable = x
    df2 = spark.sql(f"SELECT COUNT(*) FROM {schemaName}.{tempTable}").collect()
    for x in df2:
        rowCount = x[0]
        if rowCount == 0:
            print(schemaName + "." + tempTable + " has 0 rows")
but I'm not quite getting the results I expect.
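For reference, here is a minimal sketch of the kind of loop I'm aiming for, assuming a Databricks/PySpark environment where `spark` is already defined (the `collect_tables` helper name is my own, not a Databricks API):

```python
# Sketch: collect (database, table) pairs across all databases.
# Assumes `spark` is a SparkSession (as provided by Databricks notebooks).
# `collect_tables` is a hypothetical helper name for illustration.

def collect_tables(spark):
    rows = []
    # SHOW DATABASES returns one row per database with a `databaseName` column
    for db_row in spark.sql("SHOW DATABASES").collect():
        db = db_row.databaseName
        # SHOW TABLES IN <db> returns one row per table with a `tableName` column
        for tbl_row in spark.sql(f"SHOW TABLES IN {db}").collect():
            rows.append((db, tbl_row.tableName))
    return rows
```

The same listing should also be reachable without string-building SQL via the Catalog API, i.e. `spark.catalog.listDatabases()` and `spark.catalog.listTables(db)`.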