---
title: "one_vs_group"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{one_vs_group}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---
```{r, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```
```{r eval=TRUE, echo=FALSE}
library(neuroSCC)
```
# Introduction
This vignette illustrates a minimal example of a **single patient vs. group** (1 vs Group) SCC analysis with the `neuroSCC` package.
Ensure you have executed the steps from the ["Getting Started"](landing_vignette.html) vignette, or follow the conditional steps below to create the required data.
---
# Data Preparation (Conditional)
If you have not run the previous vignette, the chunk below recreates the necessary matrices from the sample data bundled with the package.
```{r}
cat("Creating required matrices from sample data...\n")
databaseControls <- databaseCreator(pattern = "^syntheticControl.*\\.nii\\.gz$", control = TRUE, quiet = TRUE)
databasePathological <- databaseCreator(pattern = "^syntheticPathological1\\.nii\\.gz$", control = FALSE, quiet = TRUE)
matrixControls <- matrixCreator(databaseControls, paramZ = 35, quiet = TRUE)
matrixPathological <- matrixCreator(databasePathological, paramZ = 35, quiet = TRUE)
normalizedMatrix <- meanNormalization(matrixControls)
normalizedPathological <- meanNormalization(matrixPathological)
niftiPath <- system.file("extdata", "syntheticControl1.nii.gz", package = "neuroSCC")
contours <- neuroContour(niftiPath, paramZ = 35, levels = c(0), plotResult = FALSE)
cat("Data prepared successfully.\n")
```
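As an optional sanity check (a minimal sketch, assuming the chunk above ran in the current session), you can confirm that each normalized matrix has one row per subject and that rows are on a comparable scale after mean normalization:
```{r eval = FALSE}
# Optional sanity check: one row per subject, one column per voxel in slice z = 35
dim(normalizedMatrix)
dim(normalizedPathological)

# After mean normalization the row means should be on a comparable scale
summary(rowMeans(normalizedMatrix, na.rm = TRUE))
```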
---
# Single Patient Data and Cloning
Select the single pathological patient and generate synthetic Poisson clones. SCC estimation compares two groups of scans, so the single patient is expanded with clones to provide enough replicates on the patient side:
```{r}
# Select the single pathological patient
SCC_AD <- normalizedPathological[1, , drop = FALSE]
SCC_CN <- normalizedMatrix
# Toy parameters for Poisson clone generation (modifiable by user)
numClones <- 2
factorLambda <- 0.1
# Generate synthetic clones
SCC_AD_clones <- generatePoissonClones(SCC_AD, numClones, factorLambda)
# Combine the patient with generated clones
SCC_AD_expanded <- rbind(SCC_AD, SCC_AD_clones)
SCC_AD_expanded <- meanNormalization(SCC_AD_expanded)
```
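A quick check (again a sketch, assuming the objects above exist) that the expanded patient matrix contains the original scan plus the requested clones and matches the control matrix in width:
```{r eval = FALSE}
# SCC_AD_expanded should hold the original scan plus numClones synthetic rows,
# with the same number of voxels (columns) as the control matrix
dim(SCC_AD_expanded)
stopifnot(
  nrow(SCC_AD_expanded) == 1 + numClones,
  ncol(SCC_AD_expanded) == ncol(SCC_CN)
)
```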
---
# SCC Estimation (Conditional Execution)
This step requires the external `ImageSCC` package, which is not currently on CRAN. The chunk below checks whether it is installed and, if not, prints the installation command:
```{r}
if (requireNamespace("ImageSCC", quietly = TRUE)) {
message("'ImageSCC' package is available.")
} else {
message("This vignette requires the 'ImageSCC' package.")
message("You can install it from GitHub with:")
message(" remotes::install_github('FIRST-Data-Lab/ImageSCC')")
}
```
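If `ImageSCC` (or the `Triangulation` package used in the next chunk) is missing, the calls below mirror the printed instructions; they are not evaluated here and assume the `remotes` package is installed:
```{r eval = FALSE}
# Install the suggested GitHub packages (requires the remotes package)
remotes::install_github("FIRST-Data-Lab/ImageSCC")
remotes::install_github("FIRST-Data-Lab/Triangulation")
```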
SCC estimation is computationally intensive, so the chunk below first tries to load a precomputed result shipped with the package (the `SCCcomp` dataset) and only runs the full estimation when that object is unavailable:
```{r eval = requireNamespace("ImageSCC", quietly = TRUE), echo=TRUE}
# Check for Triangulation package
triangulation_available <- requireNamespace("Triangulation", quietly = TRUE)
if (triangulation_available) {
  message("'Triangulation' package is available.")
} else {
  message("'Triangulation' package not available.")
  message("Install with: remotes::install_github('FIRST-Data-Lab/Triangulation')")
}

# Proceed only if both packages are available
if (triangulation_available) {
  # Try loading precomputed SCC object from package data
  if (!exists("SCCcomp", inherits = FALSE) &&
      "SCCcomp" %in% data(package = "neuroSCC")$results[, "Item"]) {
    message("Loading precomputed SCC object from package data...")
    suppressMessages(data("SCCcomp", package = "neuroSCC"))
  } else if (!exists("SCCcomp", inherits = FALSE)) {
    message("Precomputed object not found. Running SCC estimation...")

    # 1. Prepare contour and triangulation
    Z <- as.matrix(contours[[1]][, c("x", "y")])
    VT <- Triangulation::TriMesh(contours[[1]], n = 15)
    V <- as.matrix(VT[[1]])
    Tr <- as.matrix(VT[[2]])

    # 2. Run SCC estimation
    SCCcomp <- ImageSCC::scc.image(
      Ya = SCC_AD_expanded,
      Yb = SCC_CN,
      Z = Z,
      d.est = 5,
      d.band = 2,
      r = 1,
      V.est.a = V,
      Tr.est.a = Tr,
      V.band.a = V,
      Tr.band.a = Tr,
      penalty = TRUE,
      lambda = 10^seq(-6, 3, 0.5),
      alpha.grid = c(0.10, 0.05, 0.01),
      adjust.sigma = TRUE
    )
  }
}
```
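To get a feel for what the estimation returns, you can inspect the top-level structure of `SCCcomp` (a sketch; the exact components are defined by `ImageSCC::scc.image()`):
```{r eval = FALSE}
# Top-level components of the SCC result returned by ImageSCC::scc.image()
names(SCCcomp)
str(SCCcomp, max.level = 1)
```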
---
# Extracting Significant Points and Evaluating Metrics
Once the SCC results (`SCCcomp`) are available, we can extract the points flagged as significant and compute performance metrics against a known true ROI. The chunk below first falls back to the precomputed object shipped with the package in case the previous chunk was skipped:
```{r eval=TRUE}
# Fall back to the precomputed SCC object if the previous chunk was skipped
if (!exists("SCCcomp", inherits = FALSE)) {
  suppressMessages(data("SCCcomp", package = "neuroSCC"))
}

# Extract significant points
significantPoints <- getPoints(SCCcomp)

# Load example true ROI data from the package
roi_path <- system.file("extdata", "ROIsample_Region2_18.nii.gz", package = "neuroSCC")
trueROI <- processROIs(roi_path, region = "Region2", number = "18", save = FALSE)

# Get the full voxel grid dimensions used as the reference coordinate space
dimensions <- getDimensions(roi_path)

# Calculate performance metrics
metrics <- calculateMetrics(
  detectedPoints = significantPoints$positivePoints,
  truePoints = trueROI,
  totalCoords = dimensions,
  regionName = "SinglePatient_vs_Group"
)

# Display metrics
print(metrics)
```
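As a rough visual complement to the numeric metrics, the sketch below overlays the detected points on the true ROI voxels. It assumes both data frames expose `x` and `y` columns, as `calculateMetrics()` expects, so adjust the column names if your objects differ:
```{r eval = FALSE}
# Overlay SCC-detected points (red) on the true ROI voxels (grey)
# Assumes x and y columns in both data frames
plot(trueROI$x, trueROI$y,
     pch = 15, col = "grey80", asp = 1,
     xlab = "x", ylab = "y", main = "Detected points vs. true ROI")
points(significantPoints$positivePoints$x,
       significantPoints$positivePoints$y,
       pch = 16, col = "red")
legend("topright", legend = c("True ROI", "Detected"),
       col = c("grey80", "red"), pch = c(15, 16), bty = "n")
```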
---
# Conclusion
This vignette provides a reproducible example of a **single patient vs. group SCC analysis** using `neuroSCC`. Because full SCC estimation is computationally expensive, the example prefers the precomputed result shipped with the package and only runs the estimation when that object is unavailable. Users are encouraged to adapt the parameters and apply the workflow to their own datasets.