+1 vote
I have several HDFS connections. When I write a Hive recipe, I can access the tables from all connections.
But I like to develop and test my SQL code in a Hive notebook first. In this notebook, I can see only the tables from one connection.

1 Answer

0 votes
Best answer

Good news: all tables can be accessed in a Hive notebook. You simply need to keep in mind that different connections correspond to different namespaces.

To access table foo from another connection, prefix its name with the database name (for instance "db_bar"):

  SELECT count(*) FROM db_bar.foo

Note that you'll need to remove the prefix when converting to a recipe. The database name is shown above the SQL code area («connected to db_bar (Hive)») and is distinct from the connection name.
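
For instance, using the placeholder names db_bar and foo from above, the same count written in the notebook and then in a recipe running against db_bar would look roughly like this:

  -- Hive notebook: qualify the table with its database name
  SELECT count(*) FROM db_bar.foo;

  -- Recipe connected to db_bar: the prefix is dropped
  SELECT count(*) FROM foo;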
You can also find all database names by executing the SQL query:

  show databases
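
If you then want to see which tables a given database contains (assuming, say, db_bar appears in that list), Hive's SHOW TABLES statement accepts an IN clause:

  -- list the tables of one database without switching to it
  SHOW TABLES IN db_bar;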

Note: you might also check the metastore sync: open the dataset, then go to Settings → Advanced → Metastore: Synchronize.

