Have deployed the exporter using Docker Compose with --scrape.interval=15s passed to the command. The container runs with the command-line arg set as expected:
bash-4.4# ps -ef
UID PID PPID C STIME TTY TIME CMD
1000 1 0 0 03:39 ? 00:00:00 /oracledb_exporter --scrape.interval=15s
Have the metric in default-metrics.toml defined as below:
[[metric]]
context = "tablespace"
labels = [ "tablespace", "type" ]
metricsdesc = { bytes = "Generic counter metric of tablespaces bytes in Oracle.", max_bytes = "Generic counter metric of tablespaces max bytes in Oracle.", free = "Generic counter metric of tablespaces free bytes in Oracle.", used_percent = "Gauge metric showing as a percentage of how much of the tablespace has been used." }
request = '''
SELECT
  dt.tablespace_name as tablespace,
  dt.contents as type,
  dt.block_size * dtum.used_space as bytes,
  dt.block_size * dtum.tablespace_size as max_bytes,
  dt.block_size * (dtum.tablespace_size - dtum.used_space) as free,
  dtum.used_percent
FROM dba_tablespace_usage_metrics dtum, dba_tablespaces dt
WHERE dtum.tablespace_name = dt.tablespace_name
  and dt.contents != 'TEMPORARY'
union
SELECT
  dt.tablespace_name as tablespace,
  'TEMPORARY' as type,
  dt.tablespace_size - dt.free_space as bytes,
  dt.tablespace_size as max_bytes,
  dt.free_space as free,
  -- scaled to 0-100 so it matches dtum.used_percent in the first branch
  ((dt.tablespace_size - dt.free_space) / dt.tablespace_size) * 100
FROM dba_temp_free_space dt
order by tablespace
'''
scrapeinterval = "2m"
However, the tablespace metric never appears to get scraped. Other metrics without a scrapeinterval set are scraped on request as expected.
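For checking whether the metric ever surfaces: each key in metricsdesc should become its own series named <namespace>_<context>_<key>. Assuming the exporter's default oracledb namespace (an assumption, not verified here), the names to grep for in the scrape output would be:

```python
# Series names expected from the tablespace stanza,
# assuming the default "oracledb" metric namespace (assumption)
context = "tablespace"
keys = ["bytes", "max_bytes", "free", "used_percent"]
series = [f"oracledb_{context}_{k}" for k in keys]
print(series)
```

With the exporter up, something like `curl -s http://localhost:9161/metrics | grep oracledb_tablespace` (9161 being the exporter's usual listen port — again an assumption) would show whether any of these series are exposed at all.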