This document provides a reference card summarizing how to: 1. Build linear latent variable models (LLVMs) in R, including adding and removing variables, associations, and constraints. 2. Inspect LLVMs, such as examining parameters, extracting variable names, and listing pathways between variables. 3. Perform statistical inference with LLVMs, like estimating parameters, conducting likelihood ratio tests, and calculating effects.


lava REFERENCE CARD

Linear Latent Variable Models in R

MODEL BUILDING

• Initialize model (empty or multivariate regression model): m <- lvm(); m <- lvm(c(y1,y2)~x+z)
• Initialize from a list of regression models: m <- lvm(list(y~x,y~z,...))
• Add extra regression associations (slopes): regression(m) <- c(y1,y3)~u
• Add correlation between residual terms: covariance(m) <- y1~y2+y3
• Remove associations between variables: cancel(m) <- ~y1+y2
• Add variables: addvar(m) <- ~y1+y2
• Remove variables: kill(m) <- ~y1+y2
• Code as latent (reverse with arg. clear=TRUE): latent(m) <- ~y1+y2
• Binary variables; library(lava.tobit): binary(m) <- ~y1+y2

EQUALITY CONSTRAINTS

• Intercepts
  ◦ Constrain intercepts to be identical: intercept(m) <- c(y1,y2)~f(a)
  ◦ Simultaneously fix several intercepts: intercept(m,~y1+y2) <- list("a",2)
• Variance/covariance parameters
  ◦ Fix the variance term and the covariance between residuals to v1 resp. 1: covariance(m) <- y1~f(y1,v1)+f(y2,1)
  ◦ Fix multiple variance parameters: covariance(m,~y1+y2) <- list("a",2)
  ◦ Simultaneously fix several covariance parameters: covariance(m,c(y1,y2)~y2+y3) <- list(2,"a","b",1)
• Slope/regression parameters
  ◦ y1 = x + a·z + ..., y2 = 2·x + b·z + ...: regression(m,c(y1,y2)~x+z) <- list(1,"a",2,"b")
  ◦ y1 = x + ..., y2 = a·z + ...: regression(m,c(y1,y2)~x+z) <- list(1,"a")
  ◦ yi = a·x + ...: regression(m) <- c(y1,y2,y3)~f(x,a)
• Fix parameters defined by index (see coef): parfix(m,c(3,4,12)) <- list(1,"a",2)
• Label all free parameters (see multigroup): m <- baptize(m)
• Remove linear constraints by fixing to NA (applies also to the intercept and covariance methods): regression(m) <- c(y1,y2)~f(x,NA)
• Bracket notation. Set the intercept and variance of the residual of y to 0 and "v", those of x to "a" and 1, and define E(y|x) = b·x: regression(m) <- y[0:v]~f(x[a,1],b)

SIMULATION

• Simulate 100 observations from model m: sim(m,100,...)
• Simulate with the slope parameter of x on y set to -2 and the intercept of y set to 1 (see coef): sim(m,100,p=c("y"=1,"y<-x"=-2),...)
• Define a conditional distribution: distribution(m,~y1+y2) <- function(n,mu,var,...) ...
  ◦ Predefined distributions: binomial.lvm, uniform.lvm, normal.lvm, poisson.lvm, weibull.lvm(shape,scale,...), ...
• Define the functional form of a predictor on a response: functional(m,y~x) <- funct (funct denotes a user-supplied R function)

NON-LINEAR CONSTRAINTS

• Non-linear parameter constraints: constrain(m,psi~beta+gamma) <- funct
• Non-linear regression (covariate x): constrain(m,psi~beta+x) <- funct
• Add extra parameters to the model: parameter(m) <- ~beta+gamma
• Add predictor/exogenous variable to the model: exogenous(m) <- ~x1+x2
• Random slopes (x name of covariate): regression(m) <- y1~f(eta,x)
• Print non-linear constraints: constrain(m)

MODEL INSPECTION

• Examine parameter constraints: intercept, covariance, regression, constrain
• Extract variable names: exogenous, endogenous, latent, manifest, vars
• Submodel (see also measurement): subset(m,~y1+y2+eta+x)
• List parameter names: coef(m,mean=TRUE,labels=FALSE,...)
• Parents and children of nodes (union): parents(m,~y1+y2); children(m,~x1+x2)
• Extract (directed) pathways between variables: path(m,y~x)

PLOTTING

• Plot method (lvm and lvmfit): plot(m,labels=TRUE,...)
• Change appearance of nodes: nodecolor(m,~y1+x,labcol=c("red","blue"),border,lwd=2,...) <- c("blue","red")
• Change label and appearance of edges: edgelabels(m,y~x+z,col,...) <- expression(rho)
• Change labels of nodes (e.g. math expressions): labels(m) <- c(eta=expression(eta))
• Extract graphNEL object (library(Rgraphviz)): Graph(m)

STATISTICAL INFERENCE

• Estimate parameters (default MLE): e <- estimate(m,data,estimator,...)
• Estimate a multigroup model (default MLE): estimate(list(m,...),list(data,...),...)
• Estimate under a MAR assumption: estimate(m,data,missing=TRUE,...)
• Likelihood ratio test vs. the saturated model: compare(e)
• Likelihood ratio tests: compare(e1,e2,e3,...)
• Model indices based on score tests (or Wald tests): modelsearch(e,...)
• Identify empirically equivalent models: equivalence(e,y~x,k=1,...)
• Calculate indirect and total effects of x on y: effects(e,y~x)
• Non-linear constraints and approximate standard errors: constraints(e)
• Mixtures of LLVMs; library(lava.mixture): mixture(list(m1,m2),data)
• Extract various likelihood summaries: coef, score, information, logLik, AIC, gof
• Clustered correlated data: estimate(m,data,cluster="id",...)
• Robust standard errors: coef(e,type="robust")
• Test for linearity; library(gof): cumres(e,...)
• Non-parametric bootstrap: bootstrap(e,R=100,...)
• Likelihood-based confidence limits: confint(e,idx,profile=TRUE,...)
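As a quick illustration, the model-building, equality-constraint and simulation entries above can be chained together. The following is a minimal sketch (the lava package is assumed to be loaded; the variables y1-y3, eta, x and the labels nu and v are purely illustrative): it builds a one-factor model, imposes equality constraints, simulates data with the default parameter values, and fits the model by maximum likelihood (see the Statistical Inference entries above).

> library(lava)
> m <- lvm()                              # initialize an empty model
> regression(m) <- c(y1,y2,y3) ~ eta      # three indicators measuring eta
> regression(m) <- eta ~ x                # structural regression of eta on x
> latent(m) <- ~eta                       # declare eta as latent
> intercept(m) <- c(y2,y3) ~ f(nu)        # constrain two intercepts to be identical
> covariance(m,~y2+y3) <- list("v","v")   # identical residual variances via a shared label
> d <- sim(m,200)                         # simulate 200 observations
> e <- estimate(m,d)                      # maximum likelihood estimation
> summary(e)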

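Similarly, a short sketch of the model-inspection and plotting entries, applied to the illustrative model m from the sketch above:

> vars(m)                # all variables in the model
> latent(m)              # latent variables
> exogenous(m)           # exogenous (predictor) variables
> coef(m,mean=TRUE)      # parameter names, including mean parameters
> path(m,y1~x)           # directed pathways from x to y1
> plot(m,labels=TRUE)    # draw the path diagram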
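And a sketch of a typical post-estimation workflow using the inference entries, assuming e is the fitted object from the first sketch:

> compare(e)             # likelihood ratio test against the saturated model
> gof(e)                 # goodness-of-fit summaries
> confint(e)             # confidence limits (profile=TRUE gives likelihood-based limits)
> effects(e,y1~x)        # indirect and total effects of x on y1
> bootstrap(e,R=100)     # non-parametric bootstrap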

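Finally, the bracket notation from the equality-constraints section compresses several statements into a single line. In this sketch (names illustrative), the intercept and residual variance of y are set to 0 and the label "v", the intercept and variance of x to the label "a" and 1, and the slope of x on y to the label "b":

> m2 <- lvm()
> regression(m2) <- y[0:v] ~ f(x[a,1],b)
> coef(m2,labels=TRUE)   # inspect the resulting parameter names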


EXAMPLES

STRUCTURAL EQUATION MODEL

MIMIC model

> m <- lvm(c(y1,y2,y3)~u)
> regression(m) <- u ~ x1+x2
> latent(m) <- ~ u
> d <- sim(m,100)
> e <- estimate(m,d)
> plot(e)

(plot(e) draws the path diagram of the fitted MIMIC model with the estimated coefficients on the edges.)

RANDOM REGRESSION

Random slopes allowing for unbalanced designs

> m <- lvm(c(y1,y2,y3)~f(eta,1))
> regression(m,c(y1,y2,y3)~u) <- list("x1","x2","x3")
> intercept(m,~y1+y2+y3) <- list("mu")
> covariance(m,~y1+y2+y3) <- list("v","v","v")
> latent(m) <- ~u+eta
> estimate(m,data,missing=TRUE)

GRAPHICS I

> m <- lvm(list(y ~ b+v+w, c(b,w) ~ x+z, u ~ b))
> latent(m) <- ~b
> plot(m)

GRAPHICS II

> m <- lvm(y~x+z+u)
> labels(m) <- c(y=expression(psi), z=expression(zeta))
> nodecolor(m,~y+z+x,border=c("white","white","black"),
+   labcol="white", lwd=c(0,0,5)) <- c("orange","indianred","lightgreen")
> edgelabels(m,y~z+x, cex=c(2,3), col=c("orange","black"),
+   labcol="darkblue",lwd=c(3,1)) <- expression(phi,rho)
> plot(m,layoutType="circo")

(The resulting plot shows the nodes relabelled ψ and ζ and the edges labelled φ and ρ.)

INSTRUMENTAL VARIABLE

IV estimator (not available with e.g. non-recursive structures)

> estimate(m,data,estimator="IV")

SIMULATION

Weibull with exponentially distributed censoring

> m <- lvm(y~x1+x2+x3)
> distribution(m,~y) <- weibull.lvm(shape=0.5,cens=rexp)
> distribution(m,~x3) <- binomial.lvm()
> d <- sim(m,100)

MULTIVARIATE PROBIT

> m <- lvm(c(y1,y2)~f(x,b)+f(z,b))
> binary(m) <- ~y1+y2
> covariance(m) <- y1~y2
> estimate(m,data,control=list(trace=1))

MULTIGROUP ANALYSIS

log L(θ|d) = Σᵢ log Lᵢ(θ|dᵢ)

> estimate(list(m1,m2,m3),list(d1,d2,d3))

INDIRECT EFFECTS. TOBIT/PROBIT MODEL

E(y | x, z) = a(x + z)

> m <- lvm(list(y~z+x,z~x))
> d <- transform(sim(m,100),z=factor(z>0),y=Surv(ifelse(y<1,y,1),y<1))
> e <- estimate(m,d)
> effects(e,y~x)

NON-LINEAR REGRESSION

Bivariate non-linear regression with random intercept, estimated via Fisher scoring

> m0 <- lvm(c(y1,y2)~f(x,0)+f(eta,1))
> latent(m0) <- ~eta
> covariance(m0) <- c(y1,y2) ~ f(0.01)
> covariance(m0) <- c(eta) ~ f(0.01)
> d <- sim(m0,50)[,manifest(m0)]
> d <- transform(d,
+   y1=y1+pnorm(2*x),
+   y2=y2+pnorm(2*x))
> m <- m0
> parameter(m) <- ~ nu+alpha+xi
> intercept(m) <- c(y1,y2) ~ f(mu)
> covariance(m) <- c(y1,y2) ~ f(v)
> covariance(m) <- eta ~ f(zeta)
> constrain(m, mu ~ x + alpha + nu + xi) <- function(x) pnorm(x[1]*x[2]+x[4]) + x[3]
> e <- estimate(m,d,control=list(trace=1,method="NR",gamma=0.99))
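A further sketch combining the clustered-data and robust-standard-error entries from the inference section of the card; the model m, the data set d and the cluster variable id are illustrative assumptions:

> e <- estimate(m,d,cluster="id")   # account for clustered, correlated observations
> coef(e,type="robust")             # robust standard errors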

© 2011 Klaus Kähler Holst
