
Commit 661d5df

[TST] Avoid listing datasets and tables in system tests (#216)
* [TST] Avoid listing datasets and tables in system tests

  Most tests only use a single dataset, so it doesn't make sense to list
  every dataset in the clean-up method. Listing datasets is also
  eventually consistent, so skipping the listing where it isn't needed
  makes the tests less flaky. Removes the unused _Dataset.tables() and
  _Dataset.datasets() methods.

* Remove unused randint import.

* Update changelog.
1 parent d923dc8 commit 661d5df
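The clean-up pattern this commit moves toward can be sketched as follows: delete the one dataset a test created, rather than listing every dataset in the project and filtering. This is a minimal illustration, not the project's test code; `FakeClient`, `teardown_dataset`, and the dataset name are hypothetical stand-ins for a `google.cloud.bigquery.Client` and a test fixture, used so the sketch runs without credentials or network access.

```python
class NotFound(Exception):
    """Stand-in for google.api_core.exceptions.NotFound."""


class FakeClient:
    """Hypothetical stub for google.cloud.bigquery.Client."""

    def __init__(self):
        # One dataset, as if a test fixture had just created it.
        self.datasets = {"pandas_gbq_test_1234"}

    def delete_dataset(self, dataset_id, delete_contents=False):
        if dataset_id not in self.datasets:
            raise NotFound(dataset_id)
        self.datasets.remove(dataset_id)


def teardown_dataset(client, dataset_id):
    # Targeted clean-up: no list_datasets() call, so an eventually
    # consistent listing can never make the tear-down flaky. A 404 just
    # means the dataset is already gone, which is the state we want.
    try:
        client.delete_dataset(dataset_id, delete_contents=True)
    except NotFound:
        pass


client = FakeClient()
teardown_dataset(client, "pandas_gbq_test_1234")
teardown_dataset(client, "pandas_gbq_test_1234")  # idempotent: no-op
print(client.datasets)  # → set()
```

Because the tear-down never enumerates datasets, a stale listing (one that has not yet caught up with a recent create or delete) cannot cause a spurious failure.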

File tree

3 files changed (+307, -429 lines)


docs/source/changelog.rst

Lines changed: 10 additions & 1 deletion
@@ -8,9 +8,18 @@ Changelog

 - Add :class:`pandas_gbq.Context` to cache credentials in-memory, across
   calls to ``read_gbq`` and ``to_gbq``. (:issue:`198`, :issue:`208`)
-- Fast queries now do not log above ``DEBUG`` level. (:issue:`204`).
+- Fast queries now do not log above ``DEBUG`` level. (:issue:`204`)
   With BigQuery's release of `clustering <https://cloud.google.com/bigquery/docs/clustered-tables>`__
   querying smaller samples of data is now faster and cheaper.
+- Don't load credentials from disk if reauth is ``True``. (:issue:`212`)
+  This fixes a bug where pandas-gbq could not refresh credentials if the
+  cached credentials were invalid, revoked, or expired, even when
+  ``reauth=True``.
+
+Internal changes
+~~~~~~~~~~~~~~~~
+
+- Avoid listing datasets and tables in system tests. (:issue:`215`)

 .. _changelog-0.6.1:

pandas_gbq/gbq.py

Lines changed: 0 additions & 79 deletions
@@ -1088,32 +1088,6 @@ def exists(self, dataset_id):
         except self.http_error as ex:
             self.process_http_error(ex)

-    def datasets(self):
-        """ Return a list of datasets in Google BigQuery
-
-        Parameters
-        ----------
-        None
-
-        Returns
-        -------
-        list
-            List of datasets under the specific project
-        """
-
-        dataset_list = []
-
-        try:
-            dataset_response = self.client.list_datasets()
-
-            for row in dataset_response:
-                dataset_list.append(row.dataset_id)
-
-        except self.http_error as ex:
-            self.process_http_error(ex)
-
-        return dataset_list
-
     def create(self, dataset_id):
         """ Create a dataset in Google BigQuery

@@ -1135,56 +1109,3 @@ def create(self, dataset_id):
             self.client.create_dataset(dataset)
         except self.http_error as ex:
             self.process_http_error(ex)
-
-    def delete(self, dataset_id):
-        """ Delete a dataset in Google BigQuery
-
-        Parameters
-        ----------
-        dataset : str
-            Name of dataset to be deleted
-        """
-        from google.api_core.exceptions import NotFound
-
-        if not self.exists(dataset_id):
-            raise NotFoundException(
-                "Dataset {0} does not exist".format(dataset_id)
-            )
-
-        try:
-            self.client.delete_dataset(self.client.dataset(dataset_id))
-
-        except NotFound:
-            # Ignore 404 error which may occur if dataset already deleted
-            pass
-        except self.http_error as ex:
-            self.process_http_error(ex)
-
-    def tables(self, dataset_id):
-        """ List tables in the specific dataset in Google BigQuery
-
-        Parameters
-        ----------
-        dataset : str
-            Name of dataset to list tables for
-
-        Returns
-        -------
-        list
-            List of tables under the specific dataset
-        """
-
-        table_list = []
-
-        try:
-            table_response = self.client.list_tables(
-                self.client.dataset(dataset_id)
-            )
-
-            for row in table_response:
-                table_list.append(row.table_id)
-
-        except self.http_error as ex:
-            self.process_http_error(ex)
-
-        return table_list
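For anyone who relied on the removed `_Dataset.datasets()` and `_Dataset.tables()` helpers, equivalent lookups are a few lines against the google-cloud-bigquery client. The sketch below substitutes a hypothetical `FakeBigQueryClient` stub for a real `google.cloud.bigquery.Client` so it runs offline; the stub's contents and the `dataset_ids`/`table_ids` helper names are illustrative, not part of pandas-gbq.

```python
class FakeBigQueryClient:
    """Hypothetical offline stand-in for google.cloud.bigquery.Client."""

    def __init__(self, datasets):
        # Maps dataset_id -> list of table_ids, mimicking project contents.
        self._datasets = datasets

    def list_datasets(self):
        # Real clients yield DatasetListItem objects with a .dataset_id.
        return [type("DatasetItem", (), {"dataset_id": d})()
                for d in self._datasets]

    def list_tables(self, dataset_id):
        # Real clients yield TableListItem objects with a .table_id.
        return [type("TableItem", (), {"table_id": t})()
                for t in self._datasets[dataset_id]]


def dataset_ids(client):
    # Equivalent of the removed _Dataset.datasets(): collect dataset IDs.
    return [d.dataset_id for d in client.list_datasets()]


def table_ids(client, dataset_id):
    # Equivalent of the removed _Dataset.tables(): collect table IDs.
    return [t.table_id for t in client.list_tables(dataset_id)]


client = FakeBigQueryClient({"pandas_gbq_test": ["tbl_a", "tbl_b"]})
print(dataset_ids(client))                   # → ['pandas_gbq_test']
print(table_ids(client, "pandas_gbq_test"))  # → ['tbl_a', 'tbl_b']
```

Note that, as the commit message points out, both listings are eventually consistent in BigQuery, so results immediately after a create or delete may be stale; that is precisely why the system tests stopped depending on them.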
