Introduction

This vignette provides a detailed guide to the fofr_bayes() function in the refundBayes package, which fits Bayesian Function-on-Function Regression (FoFR) models using Stan. FoFR extends both the scalar-on-function regression (SoFR) and function-on-scalar regression (FoSR) frameworks by modeling a functional response as a function of one or more functional predictors, along with optional scalar covariates.

In contrast to SoFR, where the outcome is scalar and the predictors are functional, and to FoSR, where the outcome is functional and the predictors are scalar, FoFR allows the effect of a functional predictor on a functional response to be captured through a bivariate coefficient function $\beta(s, t)$. The model is specified using the same mgcv-style syntax as the other regression functions in the refundBayes package.

The methodology extends the framework described in Jiang, Crainiceanu, and Cui (2025), Tutorial on Bayesian Functional Regression Using Stan, published in Statistics in Medicine.

Install the refundBayes Package

The refundBayes package can be installed from GitHub:

library(remotes)
remotes::install_github("ZirenJiang/refundBayes")

Statistical Model

The FoFR Model

Function-on-Function Regression (FoFR) models the relationship between a functional response and one or more functional predictors, optionally adjusted for scalar covariates. Both the response and the predictors are curves observed over (potentially different) continua.

For subject $i = 1, \ldots, n$, let $Y_i(t)$ be the functional response observed at time points $t_1, \ldots, t_M$ over a response domain $\mathcal{T}$, and let $\{W_i(s_l), s_l \in \mathcal{S}\}$ for $l = 1, \ldots, L$ be a functional predictor observed at $L$ points over a predictor domain $\mathcal{S}$. Let $\mathbf{X}_i = (X_{i1}, \ldots, X_{iP})^t$ be a $P \times 1$ vector of scalar predictors (the first covariate may be an intercept, $X_{i1} = 1$). The FoFR model assumes:

$$Y_i(t) = \sum_{p=1}^P X_{ip}\,\alpha_p(t) + \int_{\mathcal{S}} W_i(s)\,\beta(s, t)\,ds + e_i(t)$$

where:

  • $\alpha_p(t)$ are functional coefficients for the scalar predictors, each describing how the $p$-th scalar predictor affects the response at each point $t \in \mathcal{T}$,
  • $\beta(s, t)$ is the bivariate coefficient function that characterizes how the functional predictor at predictor-domain location $s$ affects the response at response-domain location $t$,
  • $e_i(t)$ is the residual process, which is correlated across $t$.

The integral $\int_{\mathcal{S}} W_i(s)\,\beta(s, t)\,ds$ is approximated by a Riemann sum over the observed predictor-domain grid points. The domains $\mathcal{S}$ and $\mathcal{T}$ are not restricted to $[0, 1]$; they are determined by the actual observation grids in the data.
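The quadrature is easy to illustrate with a standalone sketch (not package code): on an equally spaced grid over $[0, 1]$, the integral is approximated by the grid average of the integrand.

```r
# Standalone sketch of the Riemann-sum quadrature for the functional
# term; fofr_bayes() performs the analogous computation internally.
L <- 100
s <- seq(0, 1, length.out = L)   # predictor-domain grid on [0, 1]
W <- sin(2 * pi * s)             # one subject's predictor curve W_i(s)
beta_s <- sin(2 * pi * s)        # beta(s, t) at one fixed t
riemann <- sum(W * beta_s) / L   # approximates the integral of sin^2(2*pi*s)
riemann                          # ~ 0.5 (the exact integral is 1/2)
```

The same sum appears later in the simulation, where the functional contribution is computed as `(X_func %*% beta_true) / L`.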

When multiple functional predictors are present, the model extends naturally:

$$Y_i(t) = \sum_{p=1}^P X_{ip}\,\alpha_p(t) + \sum_{j=1}^J \int_{\mathcal{S}_j} W_{ij}(s)\,\beta_j(s, t)\,ds + e_i(t)$$

where $\beta_j(s, t)$ is the bivariate coefficient function for the $j$-th functional predictor.

Modeling the Residual Structure

To account for the within-subject correlation in the residuals, the model decomposes $e_i(t)$ using functional principal components, following the same approach as in FoSR (Goldsmith, Zipunnikov, and Schrack, 2015):

$$e_i(t) = \sum_{r=1}^R \xi_{ir}\,\phi_r(t) + \epsilon_i(t)$$

where $\phi_1(t), \ldots, \phi_R(t)$ are the eigenfunctions estimated via FPCA (using refund::fpca.face), $\xi_{ir}$ are the subject-specific FPCA scores with $\xi_{ir} \sim N(0, \lambda_r)$, and $\epsilon_i(t) \sim N(0, \sigma_\epsilon^2)$ is independent measurement error. The eigenvalues $\lambda_r$ and the error variance $\sigma_\epsilon^2$ are estimated from the data.
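The FPCA step can be reproduced directly, assuming the refund package is installed (fofr_bayes() runs this step for you; the toy residual matrix below is purely illustrative):

```r
library(refund)
set.seed(1)
# Toy n x M matrix of residual-like curves on a common grid
E <- matrix(rnorm(200 * 30), nrow = 200, ncol = 30)
# FPCA via fast sandwich smoothing; keep components explaining 95% variance
fpca_fit <- fpca.face(E, pve = 0.95)
dim(fpca_fit$efunctions)   # M x R matrix of eigenfunctions phi_r(t)
fpca_fit$evalues           # estimated eigenvalues lambda_r
```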

Scalar Predictor Coefficients via Penalized Splines

Each scalar predictor coefficient function $\alpha_p(t)$ is represented using $K$ spline basis functions $\psi_1(t), \ldots, \psi_K(t)$ in the response domain:

$$\alpha_p(t) = \sum_{k=1}^K a_{pk}\,\psi_k(t)$$

Smoothness is induced through a quadratic penalty:

$$p(\mathbf{a}_p) \propto \exp\left(-\frac{\mathbf{a}_p^t \mathbf{S}\,\mathbf{a}_p}{2\sigma_p^2}\right)$$

where $\mathbf{S}$ is the penalty matrix derived from the spline basis and $\sigma_p^2$ is the smoothing parameter for the $p$-th scalar predictor, estimated from the data.
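To see what $\mathbf{S}$ looks like for a B-spline basis, it can be extracted with mgcv (an illustrative sketch; fofr_bayes() builds its basis and penalty internally):

```r
library(mgcv)
# Build a K = 10 B-spline basis on a grid and extract its penalty matrix
grid_df <- data.frame(t = seq(0, 1, length.out = 50))
sm <- smoothCon(s(t, bs = "bs", k = 10), data = grid_df)[[1]]
S <- sm$S[[1]]   # K x K penalty matrix; a' S a penalizes roughness of alpha_p(t)
dim(S)           # 10 x 10
```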

Bivariate Coefficient via Tensor Product Basis

The key feature of FoFR is the bivariate coefficient function $\beta(s, t)$, which lives on the product domain $\mathcal{S} \times \mathcal{T}$. This function is represented using a tensor product of two sets of basis functions:

  • Predictor-domain basis: $B_1(s), \ldots, B_Q(s)$, constructed from the s() term in the formula using the mgcv spline basis (e.g., cubic regression splines). These are further decomposed into random- and fixed-effect components via the spectral reparametrisation of mgcv::smooth2random(), yielding $\tilde{B}_1^r(s), \ldots, \tilde{B}_{Q_r}^r(s)$ (random effects) and $\tilde{B}_1^f(s), \ldots, \tilde{B}_{Q_f}^f(s)$ (fixed effects).

  • Response-domain basis: $\psi_1(t), \ldots, \psi_K(t)$, the same spline basis used for the scalar predictor coefficients.

The bivariate coefficient is then:

$$\beta(s, t) = \sum_{k=1}^K \left[\sum_{q=1}^{Q_r} \theta_{qk}^r\,\tilde{B}_q^r(s) + \sum_{q=1}^{Q_f} \theta_{qk}^f\,\tilde{B}_q^f(s)\right] \psi_k(t)$$

where $\boldsymbol{\Theta}^r$ is the $Q_r \times K$ matrix of random-effect coefficients and $\boldsymbol{\Theta}^f$ is the $Q_f \times K$ matrix of fixed-effect coefficients.

With this representation, the integral contribution of the functional predictor becomes:

$$\int_{\mathcal{S}} W_i(s)\,\beta(s, t)\,ds \approx \left[\tilde{\mathbf{X}}_i^r \boldsymbol{\Theta}^r + \tilde{\mathbf{X}}_i^f \boldsymbol{\Theta}^f\right] \boldsymbol{\psi}(t)$$

where $\tilde{\mathbf{X}}_i^r$ and $\tilde{\mathbf{X}}_i^f$ are the $1 \times Q_r$ and $1 \times Q_f$ row vectors of the transformed predictor-domain design matrices for subject $i$ (the same matrices used in SoFR).

In matrix notation for all subjects, the functional predictor contribution to the mean is:

$$\left(\tilde{\mathbf{X}}^r \boldsymbol{\Theta}^r + \tilde{\mathbf{X}}^f \boldsymbol{\Theta}^f\right) \boldsymbol{\Psi}$$

which is an $n \times M$ matrix, where $\boldsymbol{\Psi}$ is the $K \times M$ matrix of response-domain spline basis evaluations.
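A quick dimension check of this matrix form, with toy sizes chosen purely for illustration:

```r
set.seed(2)
n <- 5; Qr <- 8; Qf <- 2; K <- 10; M <- 30       # toy dimensions
Xr      <- matrix(rnorm(n * Qr), n, Qr)          # transformed design, random part
Xf      <- matrix(rnorm(n * Qf), n, Qf)          # transformed design, fixed part
Theta_r <- matrix(rnorm(Qr * K), Qr, K)          # random-effect coefficients
Theta_f <- matrix(rnorm(Qf * K), Qf, K)          # fixed-effect coefficients
Psi     <- matrix(rnorm(K * M), K, M)            # response-domain basis evaluations
contrib <- (Xr %*% Theta_r + Xf %*% Theta_f) %*% Psi
dim(contrib)                                     # 5 30, i.e., n x M
```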

Dual-Direction Smoothness Penalties

Because the bivariate coefficient $\beta(s, t)$ lives on a two-dimensional domain, smoothness must be enforced in both directions:

Predictor-domain smoothness ($s$-direction)

Following the same approach as in SoFR, the random effect coefficients use a variance-component reparametrisation:

$$\boldsymbol{\Theta}^r = \sigma_s \cdot \mathbf{Z}^r, \quad \text{vec}(\mathbf{Z}^r) \sim N(\mathbf{0}, \mathbf{I})$$

where $\sigma_s^2$ is the predictor-domain smoothing parameter. This is the standard non-centered parameterisation that separates the scale ($\sigma_s$) from the direction ($\mathbf{Z}^r$), improving sampling efficiency in Stan.
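In miniature, the reparameterisation looks like this (toy sizes; the actual draws happen inside Stan, not in R):

```r
set.seed(3)
Qr <- 8; K <- 10
sigma_s <- 0.5                           # predictor-domain scale
Z_r     <- matrix(rnorm(Qr * K), Qr, K)  # standard-normal "directions"
Theta_r <- sigma_s * Z_r                 # implied N(0, sigma_s^2) coefficients
sd(as.vector(Theta_r))                   # roughly sigma_s
```

Sampling $\mathbf{Z}^r$ on a fixed standard-normal scale and multiplying by $\sigma_s$ avoids the funnel-shaped posterior geometry that arises when coefficients and their scale are sampled jointly.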

Response-domain smoothness ($t$-direction)

The response-domain penalty matrix $\mathbf{S}$ is applied row-wise to the coefficient matrices. For each row $q$ of $\boldsymbol{\Theta}^r$ and $\boldsymbol{\Theta}^f$:

$$p(\boldsymbol{\Theta}_q^r) \propto \exp\left(-\frac{\boldsymbol{\Theta}_q^r \mathbf{S}\,(\boldsymbol{\Theta}_q^r)^t}{2\sigma_t^2}\right), \qquad p(\boldsymbol{\Theta}_q^f) \propto \exp\left(-\frac{\boldsymbol{\Theta}_q^f \mathbf{S}\,(\boldsymbol{\Theta}_q^f)^t}{2\sigma_t^2}\right)$$

where $\sigma_t^2$ is the response-domain smoothing parameter. This ensures that $\beta(s, t)$ is smooth in the $t$-direction for every fixed $s$.

The two smoothing parameters $\sigma_s^2$ and $\sigma_t^2$ are estimated from the data with weakly informative inverse-Gamma priors.

Full Bayesian Model

The complete Bayesian FoFR model combines the mean structure, residual FPCA decomposition, and all priors:

$$\begin{cases} Y_i(t) = \mu_i(t) + e_i(t), \quad i = 1, \ldots, n \\ \mu_i(t) = \displaystyle\sum_{p=1}^P X_{ip}\,\alpha_p(t) + \int_{\mathcal{S}} W_i(s)\,\beta(s, t)\,ds \\ e_i(t) = \displaystyle\sum_{r=1}^R \xi_{ir}\,\phi_r(t) + \epsilon_i(t) \end{cases}$$

In matrix form for all subjects:

$$\mathbf{Y} = \mathbf{X}\,\mathbf{A}^t \boldsymbol{\Psi} + \left(\tilde{\mathbf{X}}^r \boldsymbol{\Theta}^r + \tilde{\mathbf{X}}^f \boldsymbol{\Theta}^f\right) \boldsymbol{\Psi} + \boldsymbol{\Xi}\,\boldsymbol{\Phi} + \mathbf{E}$$

where $\mathbf{Y}$ is the $n \times M$ matrix of functional responses, $\mathbf{X}$ is the $n \times P$ scalar design matrix, $\mathbf{A}$ is the $K \times P$ matrix of scalar predictor spline coefficients, $\boldsymbol{\Xi}$ is the $n \times R$ matrix of FPCA scores, and $\boldsymbol{\Phi}$ is the $R \times M$ matrix of eigenfunctions.

Prior Specification

The full prior specification is:

| Parameter | Prior | Description |
|---|---|---|
| $\mathbf{a}_p$ (scalar predictor spline coefficients) | $p(\mathbf{a}_p) \propto \exp\left(-\frac{\mathbf{a}_p^t \mathbf{S}\,\mathbf{a}_p}{2\sigma_p^2}\right)$ | Penalized spline prior for smoothness of $\alpha_p(t)$ |
| $\sigma_p^2$ (scalar smoothing parameter) | $\sigma_p^2 \sim \text{Inv-Gamma}(0.001, 0.001)$ | Weakly informative prior on smoothing |
| $\text{vec}(\mathbf{Z}^r)$ (standardized random effects) | $\text{vec}(\mathbf{Z}^r) \sim N(\mathbf{0}, \mathbf{I})$ | Non-centered parameterisation for the $s$-direction |
| $\sigma_s^2$ (predictor-domain smoothing) | $\sigma_s^2 \sim \text{Inv-Gamma}(0.0005, 0.0005)$ | Prior on predictor-domain smoothing |
| $\boldsymbol{\Theta}_q^r, \boldsymbol{\Theta}_q^f$ (row-wise) | $p(\boldsymbol{\Theta}_q) \propto \exp\left(-\frac{\boldsymbol{\Theta}_q \mathbf{S}\,\boldsymbol{\Theta}_q^t}{2\sigma_t^2}\right)$ | Penalized spline prior for response-domain smoothness |
| $\sigma_t^2$ (response-domain smoothing) | $\sigma_t^2 \sim \text{Inv-Gamma}(0.001, 0.001)$ | Prior on response-domain smoothing |
| $\xi_{ir}$ (FPCA scores) | $\xi_{ir} \sim N(0, \lambda_r)$ | FPCA score distribution |
| $\lambda_r$ (FPCA eigenvalues) | $\lambda_r^2 \sim \text{Inv-Gamma}(0.001, 0.001)$ | Weakly informative prior on eigenvalues |
| $\sigma_\epsilon^2$ (residual variance) | $\sigma_\epsilon^2 \sim \text{Inv-Gamma}(0.001, 0.001)$ | Weakly informative prior on noise variance |

Likelihood

The likelihood for the functional response is Gaussian:

$$\log p(\mathbf{Y} \mid \boldsymbol{\mu}, \sigma_\epsilon^2) = -\frac{nM}{2}\log\sigma_\epsilon^2 - \frac{1}{2\sigma_\epsilon^2}\sum_{i=1}^n \sum_{m=1}^M \{Y_i(t_m) - \mu_i(t_m)\}^2$$

where the mean function $\mu_i(t_m)$ includes contributions from scalar predictors, functional predictors, and FPCA scores.

Relationship to SoFR and FoSR

The FoFR model nests both the SoFR and FoSR models as special cases:

| Model | Response | Predictors | Coefficient | Implemented in |
|---|---|---|---|---|
| SoFR | Scalar $Y_i$ | Functional $W_i(s)$ | Univariate $\beta(s)$ | sofr_bayes() |
| FoSR | Functional $Y_i(t)$ | Scalar $X_{ip}$ | Univariate $\alpha_p(t)$ | fosr_bayes() |
| FoFR | Functional $Y_i(t)$ | Functional $W_i(s)$ + scalar $X_{ip}$ | Bivariate $\beta(s, t)$ + univariate $\alpha_p(t)$ | fofr_bayes() |

The fofr_bayes() function inherits:

  • From FoSR: the response-domain spline basis $\boldsymbol{\Psi}$, the FPCA residual structure, and the scalar predictor handling.
  • From SoFR: the predictor-domain spectral reparametrisation (random/fixed effect decomposition via mgcv::smooth2random()), the basis extraction, and the functional coefficient reconstruction.

The fofr_bayes() Function

Usage

fofr_bayes(
  formula,
  data,
  joint_FPCA = NULL,
  runStan = TRUE,
  niter = 3000,
  nwarmup = 1000,
  nchain = 3,
  ncores = 1,
  spline_type = "bs",
  spline_df = 10
)

Arguments

| Argument | Description |
|---|---|
| formula | Functional regression formula, using the same syntax as mgcv::gam. The left-hand side is the functional response (an $n \times M$ matrix in data). The right-hand side includes scalar predictors as standard terms and functional predictors via s(..., by = ...) terms. At least one functional predictor must be present; otherwise use fosr_bayes(). |
| data | A data frame containing all variables used in the model. The functional response and functional predictors should be stored as $n \times M$ and $n \times L$ matrices, respectively. |
| joint_FPCA | A logical (TRUE/FALSE) vector of the same length as the number of functional predictors, indicating whether to jointly model FPCA for each functional predictor. Default is NULL, which sets all entries to FALSE. |
| runStan | Logical. Whether to run the Stan program. If FALSE, the function only generates the Stan code and data without sampling, which is useful for inspecting or modifying the generated Stan code. Default is TRUE. |
| niter | Total number of Bayesian posterior sampling iterations, including warmup. Default is 3000. |
| nwarmup | Number of warmup (burn-in) iterations; these samples are discarded and not used for inference. Default is 1000. |
| nchain | Number of Markov chains for posterior sampling; multiple chains help assess convergence. Default is 3. |
| ncores | Number of CPU cores to use when executing the chains in parallel. Default is 1. |
| spline_type | Type of spline basis used for the response-domain component. Default is "bs" (B-splines); other types supported by mgcv may also be used. |
| spline_df | Number of degrees of freedom (basis functions) for the response-domain spline basis. Default is 10. |

Return Value

The function returns a list of class "refundBayes" containing the following elements:

| Element | Description |
|---|---|
| stanfit | The Stan fit object (class stanfit). Can be used for convergence diagnostics, traceplots, and additional summaries via the rstan package. |
| spline_basis | Basis functions used to reconstruct the functional coefficients from the posterior samples. |
| stancode | A character string containing the generated Stan model code. |
| standata | A list containing the data passed to the Stan model. |
| scalar_func_coef | A 3-d array ($Q \times P \times M$) of posterior samples for the scalar predictor coefficient functions $\alpha_p(t)$, where $Q$ is the number of posterior samples, $P$ the number of scalar predictors, and $M$ the number of response-domain time points. NULL if there are no scalar predictors. |
| bivar_func_coef | A list of 3-d arrays, one per functional predictor, each of dimension $Q \times L \times M$: posterior samples of the bivariate coefficient function $\beta(s, t)$ evaluated on the predictor-domain grid ($L$ points) and response-domain grid ($M$ points). |
| func_coef | Same as scalar_func_coef; included for compatibility with the plot.refundBayes() method. |
| family | The model family: "fofr". |

Formula Syntax

The formula combines the FoSR syntax (functional response on the left-hand side) with the SoFR syntax (functional predictors via s() terms on the right-hand side):

Y_mat ~ X1 + X2 + s(sindex, by = X_func, bs = "cr", k = 10)

where:

  • Y_mat: the name of the functional response variable in data. This should be an $n \times M$ matrix, where each row contains the functional observations for one subject across $M$ response-domain time points.
  • X1, X2: scalar predictor(s), included using standard formula syntax.
  • s(sindex, by = X_func, bs = "cr", k = 10): the functional predictor term:
    • sindex: an $n \times L$ matrix of predictor-domain grid points. Each row contains the same $L$ observation points (replicated across subjects).
    • X_func: an $n \times L$ matrix of functional predictor values. The $i$-th row contains the $L$ observed values for subject $i$.
    • bs: the type of spline basis for the predictor domain (e.g., "cr" for cubic regression splines).
    • k: the number of basis functions in the predictor domain.

The response-domain spline basis is controlled separately via the spline_type and spline_df arguments to fofr_bayes(). This design separates the two basis specifications: the predictor-domain basis is specified in the formula (as in SoFR), while the response-domain basis is specified via function arguments (as in FoSR).

Multiple functional predictors can be included by adding additional s() terms.
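For example, with two functional predictors the formula simply gains a second s() term. The names W1, W2, sindex1, and sindex2 below are hypothetical, not objects created in this vignette:

```r
# Hypothetical two-predictor formula; each functional predictor gets
# its own index matrix, basis type, and basis dimension.
f2 <- Y_mat ~ age +
  s(sindex1, by = W1, bs = "cr", k = 10) +
  s(sindex2, by = W2, bs = "cr", k = 10)
```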

Example: Bayesian FoFR with Simulated Data

We demonstrate the fofr_bayes() function using a simulation study with a known bivariate coefficient function $\beta(s, t)$ and a known scalar predictor coefficient function $\alpha(t)$.

Simulate Data

library(refundBayes)

set.seed(42)

# --- Dimensions ---
n  <- 200   # number of subjects
L  <- 30    # number of predictor-domain grid points
M  <- 30    # number of response-domain grid points

sindex <- seq(0, 1, length.out = L)   # predictor domain grid
tindex <- seq(0, 1, length.out = M)   # response domain grid

# --- Functional predictor X(s): smooth random curves ---
X_func <- matrix(0, nrow = n, ncol = L)
for (i in 1:n) {
  X_func[i, ] <- rnorm(1) * sin(2 * pi * sindex) +
                 rnorm(1) * cos(2 * pi * sindex) +
                 rnorm(1) * sin(4 * pi * sindex) +
                 rnorm(1, sd = 0.3)
}

# --- Scalar predictor ---
age <- rnorm(n)

# --- True coefficient functions ---
# Bivariate coefficient: beta(s, t) = sin(2*pi*s) * cos(2*pi*t)
beta_true <- outer(sin(2 * pi * sindex), cos(2 * pi * tindex))

# Scalar coefficient function: alpha(t) = 0.5 * sin(pi*t)
alpha_true <- 0.5 * sin(pi * tindex)

# --- Generate functional response ---
# Y_i(t) = age_i * alpha(t) + integral X_i(s) beta(s,t) ds + epsilon_i(t)
signal_scalar <- outer(age, alpha_true)                     # n x M
signal_func   <- (X_func %*% beta_true) / L                 # n x M (Riemann sum)
epsilon       <- matrix(rnorm(n * M, sd = 0.3), nrow = n)   # n x M

Y_mat <- signal_scalar + signal_func + epsilon

# --- Organize data ---
dat <- data.frame(age = age)
dat$Y_mat  <- Y_mat
dat$X_func <- X_func
dat$sindex <- matrix(rep(sindex, n), nrow = n, byrow = TRUE)

The simulated dataset dat contains:

  • Y_mat: an $n \times M$ matrix of functional response values,
  • age: a scalar predictor,
  • X_func: an $n \times L$ matrix of functional predictor values (smooth random curves),
  • sindex: an $n \times L$ matrix of predictor-domain grid points (identical rows).

The true data-generating model is:

$$Y_i(t) = \text{age}_i \cdot 0.5\sin(\pi t) + \frac{1}{L}\sum_{l=1}^L X_i(s_l)\,\sin(2\pi s_l)\cos(2\pi t) + \epsilon_i(t), \quad \epsilon_i(t) \sim N(0, 0.3^2)$$

Fit the Bayesian FoFR Model

fit_fofr <- fofr_bayes(
  formula     = Y_mat ~ age + s(sindex, by = X_func, bs = "cr", k = 10),
  data        = dat,
  spline_type = "bs",
  spline_df   = 10,
  niter       = 2000,
  nwarmup     = 1000,
  nchain      = 3,
  ncores      = 3
)

In this call:

  • The formula specifies the functional response Y_mat (an $n \times M$ matrix) with one scalar predictor age and one functional predictor X_func.
  • The predictor-domain spline basis uses cubic regression splines (bs = "cr") with k = 10 basis functions.
  • The response-domain spline basis uses B-splines (spline_type = "bs") with spline_df = 10 degrees of freedom.
  • The eigenfunctions for the residual structure are estimated automatically via FPCA using refund::fpca.face.
  • The sampler runs 3 chains in parallel, each with 2000 total iterations (1000 warmup + 1000 posterior samples).

A Note on Computation

FoFR models are the most computationally demanding among the models in refundBayes because the Stan program estimates bivariate coefficient matrices (with $Q_r \times K + Q_f \times K$ parameters per functional predictor) in addition to the scalar predictor coefficients and FPCA scores. For exploratory analyses, consider using fewer basis functions (e.g., k = 5, spline_df = 5) and a single chain. For final inference, use the full setup with multiple chains and convergence diagnostics.

Visualisation

Bivariate Coefficient $\hat{\beta}(s, t)$

The estimated bivariate coefficient $\hat{\beta}(s, t)$ is stored as a 3-d array in bivar_func_coef. The posterior mean surface and its comparison with the truth can be visualised using heatmaps:

# Posterior mean of the bivariate coefficient
beta_est  <- apply(fit_fofr$bivar_func_coef[[1]], c(2, 3), mean)

# Pointwise 95% credible interval bounds
beta_lower <- apply(fit_fofr$bivar_func_coef[[1]], c(2, 3),
                    function(x) quantile(x, 0.025))
beta_upper <- apply(fit_fofr$bivar_func_coef[[1]], c(2, 3),
                    function(x) quantile(x, 0.975))

# Side-by-side heatmaps: true vs estimated vs difference
par(mfrow = c(1, 3), mar = c(4, 4, 2, 1))
image(sindex, tindex, beta_true,
      xlab = "s (predictor domain)", ylab = "t (response domain)",
      main = expression("True " * beta(s, t)),
      col = hcl.colors(64, "Blue-Red 3"))
image(sindex, tindex, beta_est,
      xlab = "s (predictor domain)", ylab = "t (response domain)",
      main = expression("Estimated " * hat(beta)(s, t)),
      col = hcl.colors(64, "Blue-Red 3"))
image(sindex, tindex, beta_est - beta_true,
      xlab = "s (predictor domain)", ylab = "t (response domain)",
      main = "Difference (Est - True)",
      col = hcl.colors(64, "Blue-Red 3"))

For heatmaps with a colour legend, use fields::image.plot(); for interactive 3-d surface visualisations, use plotly::plot_ly() with type "surface".
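A minimal base-R alternative is persp(). The sketch below is self-contained and draws the true surface from the simulation; the posterior mean beta_est can be substituted in exactly the same way:

```r
# 3-d perspective view of a beta(s, t) surface (base R, no extra packages)
sindex <- seq(0, 1, length.out = 30)
tindex <- seq(0, 1, length.out = 30)
beta_true <- outer(sin(2 * pi * sindex), cos(2 * pi * tindex))
persp(sindex, tindex, beta_true,
      theta = 40, phi = 25, expand = 0.7,
      xlab = "s", ylab = "t", zlab = "beta(s, t)",
      main = "True beta(s, t) surface")
```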

Scalar Coefficient Function $\hat{\alpha}(t)$

The estimated scalar predictor coefficient function can be plotted with pointwise credible intervals:

alpha_est   <- apply(fit_fofr$scalar_func_coef[, 1, ], 2, mean)
alpha_lower <- apply(fit_fofr$scalar_func_coef[, 1, ], 2,
                     function(x) quantile(x, 0.025))
alpha_upper <- apply(fit_fofr$scalar_func_coef[, 1, ], 2,
                     function(x) quantile(x, 0.975))

par(mfrow = c(1, 1))
plot(tindex, alpha_true, type = "l", lwd = 2, col = "black",
     ylim = range(c(alpha_lower, alpha_upper)),
     xlab = "t (response domain)", ylab = expression(alpha(t)),
     main = "Scalar coefficient function: age")
lines(tindex, alpha_est, col = "blue", lwd = 2)
polygon(c(tindex, rev(tindex)),
        c(alpha_lower, rev(alpha_upper)),
        col = rgb(0, 0, 1, 0.2), border = NA)
legend("topright",
       legend = c("Truth", "Posterior mean", "95% CI"),
       col = c("black", "blue", rgb(0, 0, 1, 0.2)),
       lwd = c(2, 2, 10), bty = "n")

Slices of the Bivariate Coefficient

To examine β(s,t)\beta(s, t) at fixed values of ss or tt, extract slices from the posterior:

# Fix s at the midpoint of the predictor domain and plot beta(s_mid, t)
s_mid_idx <- which.min(abs(sindex - 0.5))

beta_slice_est   <- apply(fit_fofr$bivar_func_coef[[1]][, s_mid_idx, ], 2, mean)
beta_slice_lower <- apply(fit_fofr$bivar_func_coef[[1]][, s_mid_idx, ], 2,
                          function(x) quantile(x, 0.025))
beta_slice_upper <- apply(fit_fofr$bivar_func_coef[[1]][, s_mid_idx, ], 2,
                          function(x) quantile(x, 0.975))
beta_slice_true  <- beta_true[s_mid_idx, ]

plot(tindex, beta_slice_true, type = "l", lwd = 2, col = "black",
     ylim = range(c(beta_slice_lower, beta_slice_upper)),
     xlab = "t (response domain)",
     ylab = expression(beta(s[mid], t)),
     main = paste0("Slice at s = ", round(sindex[s_mid_idx], 2)))
lines(tindex, beta_slice_est, col = "red", lwd = 2)
polygon(c(tindex, rev(tindex)),
        c(beta_slice_lower, rev(beta_slice_upper)),
        col = rgb(1, 0, 0, 0.2), border = NA)
legend("topright",
       legend = c("Truth", "Posterior mean", "95% CI"),
       col = c("black", "red", rgb(1, 0, 0, 0.2)),
       lwd = c(2, 2, 10), bty = "n")

Numerical Summary

# RMSE of the bivariate coefficient surface
cat("RMSE of beta(s,t):", sqrt(mean((beta_est - beta_true)^2)), "\n")

# RMSE of the scalar coefficient function
cat("RMSE of alpha(t): ", sqrt(mean((alpha_est - alpha_true)^2)), "\n")

Inspecting the Generated Stan Code

Setting runStan = FALSE allows you to inspect or modify the Stan code before running the model:

# Generate Stan code without running the sampler
fofr_code <- fofr_bayes(
  formula     = Y_mat ~ age + s(sindex, by = X_func, bs = "cr", k = 10),
  data        = dat,
  spline_type = "bs",
  spline_df   = 10,
  runStan     = FALSE
)

# Print the generated Stan code
cat(fofr_code$stancode)

The generated Stan code includes all five standard blocks (data, transformed data, parameters, transformed parameters, model). The parameters block declares matrix-valued parameters for the bivariate coefficients, and the model block includes both ss-direction and tt-direction smoothness priors.

Practical Recommendations

  • Number of predictor-domain basis functions (k): Controls the flexibility of $\beta(s, t)$ in the $s$-direction. Start with k = 10 for exploration. In practice, 10–20 basis functions are typically sufficient, but this depends on the complexity of the true $\beta(s, t)$.

  • Number of response-domain basis functions (spline_df): Controls the flexibility in the $t$-direction. The default spline_df = 10 is often adequate for moderately smooth coefficient functions.

  • Spline types (bs and spline_type): Use "cr" (cubic regression splines) for general functional data. Use "cc" (cyclic cubic regression splines) when the functional data are periodic. The predictor-domain basis (bs) and response-domain basis (spline_type) may use different types.

  • Sample size and grid resolution: FoFR requires estimating a surface $\beta(s, t)$, which demands more data than SoFR or FoSR. As a rough guide, ensure that $n$ exceeds k × spline_df.

  • Number of iterations and chains: FoFR models have more parameters than SoFR or FoSR. A recommended starting point is niter = 3000, nwarmup = 1000, nchain = 3. Increase iterations if convergence diagnostics indicate issues.

  • Convergence diagnostics: After fitting, examine traceplots and $\hat{R}$ statistics using the rstan package:

    rstan::traceplot(fit_fofr$stanfit, pars = c("sigma_eps", "sigmabr_1", "sigma_t_1"))
    print(fit_fofr$stanfit, pars = c("sigma_eps", "sigmabr_1", "sigma_t_1"))

    Warnings about low bulk ESS, low tail ESS, or $\hat{R} > 1.01$ indicate that more iterations or chains may be needed.

  • Common grid assumption: The current implementation assumes that both the functional response and functional predictors are observed on common grids across all subjects. Subject-specific observation grids are not yet supported.

  • Multiple functional predictors: Multiple functional predictors can be included by adding additional s() terms in the formula. Each functional predictor receives its own pair of smoothing parameters ($\sigma_{s,j}^2$, $\sigma_{t,j}^2$).

References

  • Jiang, Z., Crainiceanu, C., and Cui, E. (2025). Tutorial on Bayesian Functional Regression Using Stan. Statistics in Medicine, 44(20–22), e70265.
  • Crainiceanu, C. M., Goldsmith, J., Leroux, A., and Cui, E. (2024). Functional Data Analysis with R. CRC Press.
  • Goldsmith, J., Zipunnikov, V., and Schrack, J. (2015). Generalized Multilevel Function-on-Scalar Regression and Principal Component Analysis. Biometrics, 71(2), 344–353.
  • Ramsay, J. O. and Silverman, B. W. (2005). Functional Data Analysis, 2nd Edition. Springer.
  • Ivanescu, A. E., Staicu, A.-M., Scheipl, F., and Greven, S. (2015). Penalized Function-on-Function Regression. Computational Statistics, 30(2), 539–568.
  • Scheipl, F., Staicu, A.-M., and Greven, S. (2015). Functional Additive Mixed Models. Journal of Computational and Graphical Statistics, 24(2), 477–501.
  • Carpenter, B., Gelman, A., Hoffman, M. D., et al. (2017). Stan: A Probabilistic Programming Language. Journal of Statistical Software, 76(1), 1–32.