Restore GHA on Windows (#6022)
* Use early exit to get auto-print output for 'main' branch

* Restore GHA on Windows

* Remove Appveyor config

* Amend other references to Appveyor

* Try setup-pandoc

* restore devel on windows

* setup-pandoc not needed
MichaelChirico authored Apr 3, 2024
1 parent 9d73cf2 commit d35dceb
Showing 8 changed files with 5 additions and 83 deletions.
1 change: 0 additions & 1 deletion .Rbuildignore
@@ -16,7 +16,6 @@
 ^\.graphics$
 ^\.github$

-^\.appveyor\.yml$
 ^\.gitlab-ci\.yml$

 ^Makefile$
71 changes: 0 additions & 71 deletions .appveyor.yml

This file was deleted.

4 changes: 0 additions & 4 deletions .ci/README.md
@@ -35,10 +35,6 @@ Artifacts:

 TODO document

-### [Appveyor](./../.appveyor.yml)
-
-TODO document
-
 ## CI tools

 ### [`ci.R`](./ci.R)
2 changes: 1 addition & 1 deletion .github/workflows/R-CMD-check.yaml
@@ -25,7 +25,7 @@ jobs:
 # Rdatatable has full-strength GLCI which runs after merge. So we just need a few
 # jobs (mainly test-coverage) to run on every commit in PRs so as to not slow down dev.
 # GHA does run these jobs concurrently but even so reducing the load seems like a good idea.
-# - {os: windows-latest, r: 'release'} # currently using AppVeyor which runs 32bit in 5 min and works
+- {os: windows-latest, r: 'devel'}
 # - {os: macOS-latest, r: 'release'} # test-coverage.yaml uses macOS
 - {os: ubuntu-20.04, r: 'release', rspm: "https://packagemanager.rstudio.com/cran/__linux__/focal/latest"}
 # - {os: ubuntu-20.04, r: 'devel', rspm: "https://packagemanager.rstudio.com/cran/__linux__/focal/latest", http-user-agent: "R/4.1.0 (ubuntu-20.04) R (4.1.0 x86_64-pc-linux-gnu x86_64 linux-gnu) on GitHub Actions" }
1 change: 0 additions & 1 deletion README.md
@@ -4,7 +4,6 @@
 <!-- badges: start -->
 [![CRAN status](https://badges.cranchecks.info/flavor/release/data.table.svg)](https://cran.r-project.org/web/checks/check_results_data.table.html)
 [![R-CMD-check](https://github.com/Rdatatable/data.table/workflows/R-CMD-check/badge.svg)](https://github.com/Rdatatable/data.table/actions)
-[![AppVeyor build status](https://ci.appveyor.com/api/projects/status/kayjdh5qtgymhoxr/branch/master?svg=true)](https://ci.appveyor.com/project/Rdatatable/data-table)
 [![Codecov test coverage](https://codecov.io/github/Rdatatable/data.table/coverage.svg?branch=master)](https://app.codecov.io/github/Rdatatable/data.table?branch=master)
 [![GitLab CI build status](https://gitlab.com/Rdatatable/data.table/badges/master/pipeline.svg)](https://gitlab.com/Rdatatable/data.table/-/pipelines)
 [![downloads](https://cranlogs.r-pkg.org/badges/data.table)](https://www.rdocumentation.org/trends)
2 changes: 1 addition & 1 deletion inst/tests/benchmark.Rraw
@@ -348,7 +348,7 @@ ans <- melt(dt, measure.vars=names(dt), na.rm=TRUE)
 test(1035.21, ans, ans)

 # gc race with altrep in R-devel May 2018, #2866 & #2767, PR#2882
-# This runs with 2 threads in the test suite on CRAN and AppVeyor etc.
+# This runs with 2 threads in the test suite on CRAN and GHA etc.
 # 2 threads are sufficient to fail before the fix.
 N = 20
 DF = data.frame(a=rnorm(N),
5 changes: 2 additions & 3 deletions inst/tests/tests.Rraw
@@ -3051,9 +3051,8 @@ x = sample(1:1000,2100,replace=TRUE) # 2100 > 100 JUMPLINES * 10 NJUMP * 2 spac
 DT = data.table( A=as.character(x), B=1:100)
 DT[115, A:="123456789123456"] # row 115 is outside the 100 rows at 10 points.
 fwrite(DT,f<-tempfile())
-test(1016.1, sapply(suppressWarnings(fread(f,verbose=TRUE)),"class"), c(A="integer64", B="integer"),
+test(1016.1, sapply(fread(f,verbose=TRUE),"class"), c(A="integer64", B="integer"),
   output="Rereading 1 columns.*Column 1.*A.*bumped.*int32.*int64.*<<123456789123456>>")
-# suppressWarnings for 'bit64 is not installed' warning on AppVeyor where we (correctly) don't install Suggests
 test(1016.2, fread(f, colClasses = c(A="numeric"), verbose=TRUE), copy(DT)[,A:=as.numeric(A)], output="Rereading 0 columns")
 DT[90, A:="321456789123456"] # inside the sample
 write.table(DT,f,sep=",",row.names=FALSE,quote=FALSE)
@@ -8354,7 +8353,7 @@ test(1590.07, identical(baseR, INT(1,4,2,3)) || identical(baseR, INT(2,3,1,4)) |
 Sys.setlocale("LC_CTYPE", ctype)
 Sys.setlocale("LC_COLLATE", collate)
 test(1590.08, Sys.getlocale(), oldlocale) # checked restored locale fully back to how it was before this test
-# Now test default locale on all platforms: Windows-1252 on AppVeyor and win-builder, UTF-8 on Linux, and users running test.data.table() in their locale
+# Now test default locale on all platforms: Windows-1252 on GHA and win-builder, UTF-8 on Linux, and users running test.data.table() in their locale
 x1 = "fa\xE7ile"
 Encoding(x1) = "latin1"
 x2 = iconv(x1, "latin1", "UTF-8")
2 changes: 1 addition & 1 deletion src/snprintf.c
@@ -1,7 +1,7 @@
 // For translations (#4402) we need positional specifiers (%n$), a non-C99 POSIX extension.
 // On Linux and Mac, standard snprintf supports positional specifiers.
 // On Windows, we tried many things but just couldn't achieve linking to _sprintf_p. Even
-// if we managed that on AppVeyor we may have fragility in the future on Windows given
+// if we managed that on AppVeyor (now GHA) we may have fragility in the future on Windows given
 // varying Windows versions, compile environments/flags, and dll libraries. This may be
 // why R uses a third party library, trio, on Windows. But R does not expose trio for use
 // by packages.
