.reproducibleTempCacheDir shouldn't be getOption(reproducible.cachePath) -- use tempPath #372

Merged
merged 77 commits on Nov 6, 2023
Changes from all commits
Commits
77 commits
dd32e96
For objects that are saved as paths; convert with asPath
Oct 7, 2023
963d8f5
capture case of unnamed list in CacheDigest
Oct 7, 2023
7f676a3
test
Oct 7, 2023
0511533
[skip-ci]
Oct 7, 2023
aa4bd38
bugfix for failed recovery from cache
Oct 9, 2023
e47d6f3
bugfixes for inputPaths and destinationPath
eliotmcintire Oct 9, 2023
0afe5fe
need unique in different spot
eliotmcintire Oct 10, 2023
e123008
version update for commit 0afe5
ianmseddy Oct 11, 2023
bef992c
Merge branch 'fileBackendFails' into PredictiveEcology/fileBackendFails
ianmseddy Oct 11, 2023
ee16ca4
Merge remote-tracking branch 'origin/IanFBF' into fileBackendFails
achubaty Oct 13, 2023
623acea
Merge branch 'development' into fileBackendFails
achubaty Oct 13, 2023
ca36d58
.reproducibleTempCacheDir shouldn't be getOption(reproducible.cachePa…
eliotmcintire Oct 16, 2023
bc6b260
bugfix -- edge case with `doProgress`
eliotmcintire Oct 17, 2023
07cb8bb
[skip-ci] bump
eliotmcintire Oct 17, 2023
b3e849e
Merge branch 'fileBackendFails' into reproducibleTempCacheDir
Oct 17, 2023
82c9803
browsers
eliotmcintire Oct 17, 2023
f28e913
rm browsers
eliotmcintire Oct 17, 2023
d56f3ef
bugfix
eliotmcintire Oct 17, 2023
c7c7662
[skip-ci] bump 2.0.8.9009
eliotmcintire Oct 17, 2023
b0d36bf
reproducible.useDBI updates: use users's val & set to NULL
eliotmcintire Oct 18, 2023
95691b5
maskInputs backwards compatible tweaks
eliotmcintire Oct 18, 2023
0fc82ff
na.omit when userTags includes an NA
eliotmcintire Oct 18, 2023
9d17132
use terra::wrap for spatVector
eliotmcintire Oct 19, 2023
6534425
clear up messaging a bit
eliotmcintire Oct 19, 2023
570ca41
rm nested Cache formals -- it overwrote important things -- this may …
eliotmcintire Oct 19, 2023
f787ceb
Merge remote-tracking branch 'origin/development' into reproducibleTe…
Oct 19, 2023
753a53a
bump 2.0.8.9010
eliotmcintire Oct 19, 2023
a906c58
remove browser() in unwrapSpatVector()
achubaty Oct 19, 2023
0fb645c
Checksums inside preProcess --> was missing some cases
eliotmcintire Oct 20, 2023
b3a354c
option memoisePersist
eliotmcintire Oct 20, 2023
73a36a2
Deal with terra Certificate fails -- this is bad news
eliotmcintire Oct 20, 2023
18f019c
deal with Certificate failure for terra -- affects all `terra::project`
eliotmcintire Oct 21, 2023
3249f66
Bump 2.0.8.9012
eliotmcintire Oct 21, 2023
9f0a91d
not utils::unwrap lol
eliotmcintire Oct 21, 2023
b24c2e8
preProcess now handles only Google ID; messaging improved
tati-micheletti Oct 24, 2023
8081c89
preProcess now handles only Google ID; messaging improved
tati-micheletti Oct 24, 2023
0ba0d1c
minor Cache message corrections
Oct 24, 2023
a91afff
.robustDigest of `matrix` needs to keep dims
Oct 24, 2023
ffa3075
make all length 1 NA values equal in a digest
Oct 24, 2023
c61ac88
allow objSize to be NA
eliotmcintire Nov 1, 2023
f4ca180
Merge branch 'lala' into reproducibleTempCacheDir
Nov 1, 2023
9a6db36
objSize stuff
eliotmcintire Nov 1, 2023
900c947
allow headless Cache
eliotmcintire Nov 1, 2023
7539054
add preDigest to `.wrap`, with same for other fns upstream
eliotmcintire Nov 3, 2023
5f5f5c8
minor
eliotmcintire Nov 3, 2023
3d96ebe
allow new pathway through postProcessTo uses sf::gdal_utils
eliotmcintire Nov 3, 2023
d8acba7
new fns: isPolygons and isGeomType
eliotmcintire Nov 3, 2023
3dcfd9d
skip pre-crop for polygons -- polygons situation --> creating slivers
eliotmcintire Nov 3, 2023
3889670
fewer warnings in postProcessTo
eliotmcintire Nov 3, 2023
87c7edd
sometimes project makes errors that were not being caught
eliotmcintire Nov 3, 2023
15dbfc0
postProcessTo -- other edge cases for cropTo -- use convex hull
eliotmcintire Nov 3, 2023
0ead8e7
Cache messaging -- minor changes -- add more info
eliotmcintire Nov 3, 2023
001c213
Cache -- edge cases
eliotmcintire Nov 3, 2023
b389ee7
wrapSpatRaster -- new edge cases dealt with (file-backed, subset of l…
eliotmcintire Nov 3, 2023
8c21093
minmaxFn -- setMinMax if not set
eliotmcintire Nov 3, 2023
debbcf3
preProcess -- Checksums tweaks to get more edge cases correct
eliotmcintire Nov 3, 2023
87e5414
preProcess - verbose
eliotmcintire Nov 3, 2023
a4e84fb
preProcess -- Checksums tweaks to get more edge cases correct
eliotmcintire Nov 3, 2023
8fb9518
prepInputs -- verbose updates
eliotmcintire Nov 3, 2023
f8276dd
test-postProcessTo --> minor
eliotmcintire Nov 3, 2023
ca8646b
redoc
eliotmcintire Nov 3, 2023
ebeb12f
unit test for multi-file spatRaster backends
eliotmcintire Nov 3, 2023
6b94326
minor
eliotmcintire Nov 3, 2023
b96998c
Merge branch 'lala2' into reproducibleTempCacheDir
Nov 3, 2023
bbdc4eb
bump v2.0.8.9013
Nov 3, 2023
53ddcf7
updating testing infrastructure
Nov 3, 2023
5822675
things caught by R CMD check
Nov 3, 2023
494b369
Revert "preProcess now handles only Google ID; messaging improved"
Nov 3, 2023
f37ad4e
unit test for Google ID
Nov 3, 2023
0027b69
partial commit from Tati re: google drive id
Nov 3, 2023
d55de8f
add Tati's addition as a comment --> it doesn't work for existing tests
Nov 3, 2023
2c8bfbe
R CMD checking
Nov 3, 2023
859900d
need to match files on cloudDownload
eliotmcintire Nov 5, 2023
74fd541
cloudDownload -- need obj = outputToSave
eliotmcintire Nov 5, 2023
9bcf7f7
cloudDownload -- improve messaging
eliotmcintire Nov 5, 2023
7ca9cd6
mac and linux don't work with ctime the same way as Windows
eliotmcintire Nov 6, 2023
52463c4
clean stale comments
eliotmcintire Nov 6, 2023
4 changes: 2 additions & 2 deletions DESCRIPTION
@@ -19,8 +19,8 @@ SystemRequirements: 'unrar' (Linux/macOS) or '7-Zip' (Windows) to work with '.ra
URL:
https://reproducible.predictiveecology.org,
https://github.com/PredictiveEcology/reproducible
Date: 2023-10-18
Version: 2.0.8.9008
Date: 2023-11-05
Version: 2.0.8.9015
Authors@R:
c(person(given = "Eliot J B",
family = "McIntire",
14 changes: 14 additions & 0 deletions NEWS.md
@@ -4,9 +4,23 @@
* new function `isUpdated()` to determine whether a cached object has been updated;
* `makeRelative()` is now exported for use downstream (e.g., `SpaDES.core`);
* new functions `getRelative()` and `normPathRel()` for improved symlink handling (#362);
* messaging is improved for `Cache`, with the function name shown instead of just the `cacheId`
* messaging for `prepInputs`: minor changes
* more edge cases for `Checksums` dealt with, so fewer unneeded downloads
* `wrapSpatRaster` (`wrap` for file-backed `spatRaster` objects) fixes for more edge cases
* `postProcessTo` can now use `sf::gdal_utils` for the case where `from` is a gridded object and `to` is a polygon vector. This appears to be between 2x and 10x faster in tests.
* `postProcessTo` does a pre-crop (with buffer) to make the `projectTo` faster. When both `from` and `to` are vector objects, this pre-crop appears to create slivers in some cases, so this step is now skipped for those cases.
* `Cache` can now deal with unnamed functions, e.g., `Cache((function(x) x)())`; see the sketch just after this list
* `terra` would fail if internet was unavailable, even when internet is not necessary, due to needing to retrieve projection information. Many cases where this happens will now divert to use `sf`.
* `Cache` can now skip calculating `objSize`, which can take a non-trivial amount of time for large, complicated objects; see `reproducibleOptions()`
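
A minimal sketch of the two `Cache` behaviours noted above (the anonymous-function call and the `objSize` option). The option name and the `cachePath` argument come from the package; the toy function, the `rnorm` call, and the temporary cache path are illustrative only.

# Caching a call to an unnamed (anonymous) function, which previously failed
# because there was no function name to digest.
library(reproducible)
tmpCache <- file.path(tempdir(), "sketchCache")  # illustrative path
out1 <- Cache((function(x) x)(1), cachePath = tmpCache)

# Optionally skip the objSize calculation, which can be slow for large objects.
options(reproducible.objSize = FALSE)
out2 <- Cache(rnorm, 1e6, cachePath = tmpCache)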

## Bug fixes
* `Filenames` for some classes returned `""`; it now returns `NULL`, so character vectors are only pointers to files
* `Cache` on a `terra` object that writes its file to disk was failing when the `quick` argument was specified, always recovering the same object; fixed with PR #368
* `useDBI` was incorrectly used if a user had set the option prior to package loading; it now works as expected (see the sketch just after this list)
* several other minor fixes
* `preProcess` deals better with more cases of nested paths in archives.
* more edge cases corrected for `inputPaths`
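
A short sketch of the `useDBI` fix noted above, assuming the option is set before the package is attached; the chosen value (`TRUE`) is illustrative only.

# Setting the backend option before loading the package is now respected.
options(reproducible.useDBI = TRUE)
library(reproducible)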

# reproducible 2.0.8

68 changes: 33 additions & 35 deletions R/DBI.R
@@ -114,8 +114,6 @@ saveToCache <- function(cachePath = getOption("reproducible.cachePath"),

fts <- CacheStoredFile(cachePath, cacheId, obj = obj)

# browser(expr = exists("._saveToCache_2"))

# TRY link first, if there is a linkToCacheId, but some cases will fail; not sure what these cases are
if (!is.null(linkToCacheId)) {
ftL <- CacheStoredFile(cachePath, linkToCacheId)
@@ -125,8 +123,8 @@
if (is(out, "try-error") || !all((out %in% TRUE))) {
linkToCacheId <- NULL
} else {
messageCache(" (A file with identical properties already exists in the Cache: ", basename(ftL), "; ",
"The newly added (", basename(fts), ") is a file.link to that file)",
messageCache(" (A file with identical properties already exists in the Cache: ", basename(ftL), "; ")
messageCache(" The newly added (", basename(fts), ") is a file.link to that file)",
verbose = verbose
)
}
@@ -140,8 +138,6 @@
)
if (!useDBI()) {
dtFile <- saveDBFileSingle(dt = dt, cachePath, cacheId)
# dtFile <- CacheDBFileSingle(cachePath = cachePath, cacheId = cacheId)
# saveFilesInCacheFolder(dt, dtFile, cachePath = cachePath, cacheId = cacheId)
} else {
a <- retry(retries = 250, exponentialDecayBase = 1.01, quote(
DBI::dbAppendTable(conn, CacheDBTableName(cachePath, drv), dt)
@@ -152,12 +148,9 @@
fs <- saveFilesInCacheFolder(cachePath = cachePath, obj, fts, cacheId = cacheId)
}
if (isTRUE(getOption("reproducible.useMemoise"))) {
if (is.null(.pkgEnv[[cachePath]])) {
.pkgEnv[[cachePath]] <- new.env(parent = emptyenv())
}
obj <- .unwrap(obj, cachePath, cacheId, drv, conn) # This takes time, but whether it happens now or later, same
obj2 <- makeMemoisable(obj)
assign(cacheId, obj2, envir = .pkgEnv[[cachePath]])
assign(cacheId, obj2, envir = memoiseEnv(cachePath))
}

fsChar <- as.character(fs)
@@ -175,9 +168,9 @@
# So effectively, it is like 6x buffer to try to avoid false positives.
whichOS <- which(tagKey == "object.size")
if (length(whichOS)) {
fsBig <- (as.numeric(tagValue[whichOS]) * 4) < fs
objSize <- if (identical(tagValue[whichOS], "NA")) NA else as.numeric(tagValue[whichOS])
fsBig <- (objSize * 4) < fs
if (isTRUE(fsBig)) {
# browser(expr = exists("._saveToCache_3"))
messageCache("Object with cacheId ", cacheId, " appears to have a much larger size ",
"on disk than in memory. ",
"This usually means that the object has captured an environment with ",
@@ -201,13 +194,14 @@ saveToCache <- function(cachePath = getOption("reproducible.cachePath"),
#' the `cacheId` being loaded or selected
#' @param .dotsFromCache Optional. Used internally.
#' @param .functionName Optional. Used for messaging when this function is called from `Cache`
#' @param preDigest The list of `preDigest` that comes from `CacheDigest` of an object
#' @details
#' `loadFromCache` is a function to get a single object from the cache, given its `cacheId`.
#' @return
#' `loadFromCache` returns the object from the cache that has the particular `cacheId`.
#'
loadFromCache <- function(cachePath = getOption("reproducible.cachePath"),
cacheId,
cacheId, preDigest,
fullCacheTableForObj = NULL,
format = getOption("reproducible.cacheSaveFormat", "rds"),
.functionName = NULL, .dotsFromCache = NULL,
@@ -224,18 +218,13 @@

isMemoised <- NA
if (isTRUE(getOption("reproducible.useMemoise"))) {
if (is.null(.pkgEnv[[cachePath]])) {
.pkgEnv[[cachePath]] <- new.env(parent = emptyenv())
}
isMemoised <- exists(cacheId, envir = .pkgEnv[[cachePath]])
isMemoised <- exists(cacheId, envir = memoiseEnv(cachePath))
if (isTRUE(isMemoised)) {
obj <- get(cacheId, envir = .pkgEnv[[cachePath]])
obj <- get(cacheId, envir = memoiseEnv(cachePath))
obj <- unmakeMemoisable(obj)
}
}

# fileFormat <- extractFromCache(fullCacheTableForObj, "fileFormat", ifNot = format)

if (!isTRUE(isMemoised)) {
f <- CacheStoredFile(cachePath, cacheId, format)
f <- unique(f) # It is OK if there is a vector of unique cacheIds e.g., loadFromCache(showCache(userTags = "hi")$cacheId)
@@ -256,6 +245,7 @@
cachePath = cachePath, fullCacheTableForObj = fullCacheTableForObj,
cacheId = cacheId,
format = fileExt(sameCacheID),
preDigest = preDigest,
verbose = verbose
)
obj <- .wrap(obj, cachePath = cachePath, drv = drv, conn = conn)
@@ -291,7 +281,7 @@ loadFromCache <- function(cachePath = getOption("reproducible.cachePath"),

if (isTRUE(getOption("reproducible.useMemoise")) && !isTRUE(isMemoised)) {
obj2 <- makeMemoisable(obj)
assign(cacheId, obj2, envir = .pkgEnv[[cachePath]])
assign(cacheId, obj2, envir = memoiseEnv(cachePath))
}

if (verbose > 3) {
@@ -315,7 +305,7 @@ loadFromCache <- function(cachePath = getOption("reproducible.cachePath"),


extractFromCache <- function(sc, elem, ifNot = NULL) {
rowNum <- sc[["tagKey"]] == elem
rowNum <- sc[["tagKey"]] %in% elem
elemExtracted <- if (any(rowNum)) {
sc[["tagValue"]][rowNum]
} else {
@@ -395,7 +385,6 @@ dbConnectAll <- function(drv = getDrv(getOption("reproducible.drv", NULL)),
tagKey = character(), tagValue = character(),
drv = getDrv(getOption("reproducible.drv", NULL)),
conn = getOption("reproducible.conn", NULL)) {
# browser(expr = exists("._addTagsRepo_1"))
if (length(cacheId) > 0) {
if (length(cacheId) > 1) stop(".addTagsRepo can only handle appending 1 tag at a time")
curTime <- as.character(Sys.time())
@@ -499,14 +488,14 @@ dbConnectAll <- function(drv = getDrv(getOption("reproducible.drv", NULL)),
if (add && alreadyThere == 0) {
dt2 <- rbindlist(list(dt2, dt))
} else {
dt2[tagKey == tk & cacheId == cacheId, tagValue := dt$tagValue]
set(dt2, which(dt2$tagKey == tk & dt2$cacheId == cacheId), "tagValue", dt$tagValue)
# dt2[tagKey == tk & cacheId == cacheId, tagValue := dt$tagValue]
}
saveFilesInCacheFolder(dt2, dtFile, cachePath = cachePath, cacheId = cacheId)
}
}
}
.cacheNumDefaultTags <- function() {
# if (useDBI())
7 # else 12
}

@@ -571,11 +560,7 @@ CacheDBFile <- function(cachePath = getOption("reproducible.cachePath"),
# }

if (grepl(type, "SQLite")) {
# if (useDBI()) {
file.path(cachePath, "cache.db")
# } else {
# file.path(cachePath, "backpack.db")
# }
} else {
file.path(cachePath, "cache.txt")
}
@@ -590,11 +575,7 @@
#' `CacheStorageDir` returns the name of the directory where cached objects are
#' stored.
CacheStorageDir <- function(cachePath = getOption("reproducible.cachePath")) {
# if (useDBI()) {
file.path(cachePath, "cacheOutputs")
# } # else {
# file.path(cachePath, "gallery")
# }
}

#' @details
@@ -614,6 +595,7 @@
CacheStoredFile <- function(cachePath = getOption("reproducible.cachePath"), cacheId,
format = NULL, obj = NULL) {
if (is.null(format)) format <- getOption("reproducible.cacheSaveFormat", "rds")
if (missing(cacheId)) cacheId <- NULL
if (any(format %in% "check")) {
format <- formatCheck(cachePath, cacheId, format)
}
@@ -629,7 +611,7 @@ CacheStoredFile <- function(cachePath = getOption("reproducible.cachePath"), cac
"rda"
}
}
filename <- paste(cacheId, csExtension, sep = ".")
filename <- if (is.null(cacheId)) NULL else paste(cacheId, csExtension, sep = ".")
if (length(cacheId) > 1) {
filename <- vapply(filename, nextNumericName, FUN.VALUE = character(1))
for (i in seq(filename[-1]) + 1) {
@@ -958,7 +940,7 @@ convertDBbackendIfIncorrect <- function(cachePath, drv, conn,
newDBI <- suppressMessages(useDBI(!origDBI)) # switch to the other
if (!identical(newDBI, origDBI)) { # if they are the same, then DBI is not installed; no point proceeding
on.exit(suppressMessages(useDBI(origDBI)))
drv <- getDrv(drv)
drv <- getDrv(drv) # This will return the DBI driver, if it is installed, regardless of drv
DBFileWrong <- CacheDBFile(cachePath, drv, conn)
if (file.exists(DBFileWrong)) {
sc <- showCache(cachePath, drv = drv, conn = conn, verbose = -2)
@@ -1004,3 +986,19 @@ CacheDBFiles <- function(cachePath = getOption("reproducible.cachePath")) {
dtFiles <- dir(CacheStorageDir(cachePath), pattern = ext, full.names = TRUE)
dtFiles
}

memoiseEnv <- function(cachePath, envir = .GlobalEnv) {
memPersist <- isTRUE(getOption("reproducible.memoisePersist", NULL))
if (memPersist) {
obj <- paste0(".reproducibleMemoise_", cachePath)
if (!exists(obj, envir = envir))
assign(obj, new.env(parent = emptyenv()), envir = envir)
memEnv <- get(obj, envir = envir, inherits = FALSE)
} else {
if (is.null(.pkgEnv[[cachePath]])) {
.pkgEnv[[cachePath]] <- new.env(parent = emptyenv())
}
memEnv <- .pkgEnv[[cachePath]]
}
memEnv
}
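
A brief sketch of how the new `reproducible.memoisePersist` option used by `memoiseEnv()` above might be exercised. The hidden object name comes from the code above; the cache path and the `rnorm` call are illustrative only.

# With persistence on, memoised copies are kept in a hidden environment in
# .GlobalEnv (named ".reproducibleMemoise_<cachePath>"), so they survive a
# reload of the package; otherwise they live in the package environment.
library(reproducible)
options(reproducible.useMemoise = TRUE, reproducible.memoisePersist = TRUE)
tmpCache <- file.path(tempdir(), "memoiseSketch")  # illustrative path
out <- Cache(rnorm, 10, cachePath = tmpCache)
exists(paste0(".reproducibleMemoise_", tmpCache), envir = .GlobalEnv)  # expected TRUE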
34 changes: 19 additions & 15 deletions R/cache-helpers.R
@@ -132,7 +132,6 @@ setAs(from = "character", to = "Path", function(from) {
#'
copySingleFile <- function(from = NULL, to = NULL, useRobocopy = TRUE,
overwrite = TRUE, delDestination = FALSE,
# copyRasterFile = TRUE, clearRepo = TRUE,
create = TRUE, silent = FALSE) {
if (any(length(from) != 1, length(to) != 1)) stop("from and to must each be length 1")
useFileCopy <- identical(dirname(from), dirname(to))
@@ -355,8 +354,6 @@ copyFile <- Vectorize(copySingleFile, vectorize.args = c("from", "to"))
obj
}

# loadFromLocalRepoMem <- memoise::memoise(loadFromLocalRepo)

#' @keywords internal
.getOtherFnNamesAndTags <- function(scalls) {
if (is.null(scalls)) {
@@ -556,22 +553,29 @@ withoutFinalNumeric <- function(string) {
setClass("PackedSpatExtent")

wrapSpatVector <- function(obj) {
geom1 <- terra::geom(obj)
geom1 <- list(
cols125 = matrix(as.integer(geom1[, c(1, 2, 5)]), ncol = 3),
cols34 = matrix(as.integer(geom1[, c(3, 4)]), ncol = 2)
)
geomtype1 <- terra::geomtype(obj)
dat1 <- terra::values(obj)
crs1 <- terra::crs(obj)
obj <- list(geom1, geomtype1, dat1, crs1)
names(obj) <- spatVectorNamesForCache
obj <- terra::wrap(obj)
if (FALSE) {
geom1 <- terra::geom(obj)
geom1 <- list(
cols125 = matrix(as.integer(geom1[, c(1, 2, 5)]), ncol = 3),
cols34 = matrix(as.integer(geom1[, c(3, 4)]), ncol = 2)
)
geomtype1 <- terra::geomtype(obj)
dat1 <- terra::values(obj)
crs1 <- terra::crs(obj)
obj <- list(geom1, geomtype1, dat1, crs1)
names(obj) <- spatVectorNamesForCache
}
obj
}

unwrapSpatVector <- function(obj) {
obj$x <- cbind(obj$x$cols125[, 1:2, drop = FALSE], obj$x$cols34[, 1:2, drop = FALSE], obj$x$cols125[, 3, drop = FALSE])
do.call(terra::vect, obj)
obj <- terra::unwrap(obj)
if (FALSE) {
obj$x <- cbind(obj$x$cols125[, 1:2, drop = FALSE], obj$x$cols34[, 1:2, drop = FALSE], obj$x$cols125[, 3, drop = FALSE])
do.call(terra::vect, obj)
}
obj
}
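
The hand-rolled geometry/attribute packing above is now bypassed (kept only inside `if (FALSE)`) in favour of `terra`'s own serialisation. A minimal round-trip sketch, assuming `terra` is installed; the example shapefile ships with `terra`.

# wrap() gives a packed, serialisable representation; unwrap() restores it.
library(terra)
v <- vect(system.file("ex/lux.shp", package = "terra"))
pv <- wrap(v)                      # PackedSpatVector, safe to saveRDS()
f <- tempfile(fileext = ".rds")
saveRDS(pv, f)
v2 <- unwrap(readRDS(f))           # back to a working SpatVector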

#' Has a cached object been updated?
9 changes: 3 additions & 6 deletions R/cache-internals.R
@@ -13,13 +13,12 @@
)

hashObjectSize <- unlist(lapply(modifiedDots, function(x) {
objSize <- unname(attr(objSize(x), "objSize"))
if (getOption("reproducible.objSize", TRUE)) unname(attr(objSize(x), "objSize")) else NA
}))

lengths <- unlist(lapply(preDigestUnlist, function(x) length(unlist(x))))
hashDetails <- data.frame(
objectNames = rep(names(preDigestUnlist), lengths),
# objSize = rep(hashObjectSize, lengths),
hashElements = names(unlist(preDigestUnlist)),
hash = unname(unlist(preDigestUnlist)),
stringsAsFactors = FALSE
@@ -32,8 +31,6 @@
strsplit(names(hashObjectSize), split = "\\$"),
function(x) paste0(tail(x, 2), collapse = ".")
))
# hashObjectSizeNames <- unlist(lapply(strsplit(hashObjectSizeNames, split = "\\.y"),
# function(x) paste0(tail(x, 2), collapse = ".")))
hashObjectSizeNames <- gsub("\\.y", replacement = "", hashObjectSizeNames)
hashObjectSizeNames <- unlist(lapply(
strsplit(hashObjectSizeNames, split = "\\."),
@@ -108,7 +105,7 @@
.getFromRepo <- function(FUN, isInRepo, fullCacheTableForObj,
notOlderThan, lastOne, cachePath, fnDetails,
modifiedDots, debugCache, verbose, # sideEffect,
quick, fileFormat = NULL,
quick, # fileFormat = NULL,
algo, preDigest, startCacheTime,
drv = getDrv(getOption("reproducible.drv", NULL)),
conn = getOption("reproducible.conn", NULL), ...) {
@@ -118,7 +115,7 @@
output <- loadFromCache(cachePath, isInRepo[[.cacheTableHashColName()[lastOne]]],
fullCacheTableForObj = fullCacheTableForObj,
# format = fileFormat, loadFun = loadFun,
.functionName = fnDetails$functionName, .dotsFromCache = modifiedDots,
.functionName = fnDetails$functionName, preDigest = preDigest, .dotsFromCache = modifiedDots,
drv = drv, conn = conn,
verbose = verbose
)