## Regression Details

1.4.1 Weighting Regression. Few things are as confusing to applied researchers as the role of sample weights. Even now, 20 years post-Ph.D., we read the section of …

## Getting a Little Jumpy: Regression Discontinuity Designs

But when you start exercising those rules, all sorts of processes start to happen and you start to find out all sorts of stuff about people... It's just a way …

## Two-Sample IV and Split-Sample IV*

GLS estimates of Γ in (4.3.1) are consistent because E… The 2SLS minimand can be thought of as GLS applied to equation (4.3.1), after multiplying by √N to keep the …

## The Bias of Robust Standard Errors*

matrix with ith row m_i', and I_N is the N × N identity matrix. Then ê_i = m_i'e, and E(ê_i²) = E(m_i'ee'm_i) = m_i'Ψm_i. To simplify further, write …

## Peer Effects

A vast literature in social science is concerned with peer effects. Loosely speaking, this means the causal effect of group characteristics on individual outcomes. Sometimes regression is used in an …

## Limited Dependent Variables and Marginal Effects

Many empirical studies involve variables that take on only a limited number of values. An example is the Angrist and Evans (1998) investigation of the effect of childbearing on female …

## Sharp RD

Sharp RD is used when treatment status is a deterministic and discontinuous function of a covariate, x_i. Suppose, for example, that D_i = 1 if x_i ≥ x_0 and 0 if … (6.1.1)
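The deterministic assignment rule above can be sketched numerically. A minimal simulation (the cutoff, trends, and the size of the jump are all made-up numbers, not from the text): treatment switches on exactly at the cutoff, and the discontinuity is estimated by comparing linear fits from each side.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical running variable and cutoff; every number here is invented.
x0 = 0.0
x = rng.uniform(-1, 1, 5000)
d = (x >= x0).astype(float)  # sharp RD: treatment is a deterministic function of x

# Outcome: a linear trend in x plus a true jump of 2.0 at the cutoff.
y = 1.0 + 0.5 * x + 2.0 * d + rng.normal(0, 0.1, x.size)

# Estimate the discontinuity: fit a linear trend on each side of the
# cutoff and compare the two fits' predictions at x0.
left = np.polyfit(x[x < x0], y[x < x0], 1)
right = np.polyfit(x[x >= x0], y[x >= x0], 1)
jump = np.polyval(right, x0) - np.polyval(left, x0)
print(round(jump, 2))  # close to the true jump of 2.0
```

Because treatment is a deterministic function of x_i here, the jump in the outcome at x_0 is the causal effect at the cutoff, provided the trends are modeled correctly on each side.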

## IV with Heterogeneous Potential Outcomes

The discussion of IV up to this point postulates a constant causal effect. In the case of a dummy variable like veteran status, this means Y_1i − Y_0i = ρ for all …

## Clustering and Serial Correlation in Panels

8.2.1 Clustering and the Moulton Factor Bias problems aside, heteroskedasticity rarely leads to dramatic changes in inference. In large samples where bias is not likely to be a problem, we …

## Limited Dependent Variables Reprise

In Section 3.4.2, we discussed the consequences of limited dependent variables for regression models. When the dependent variable is binary or non-negative, say, employment status or hours worked, the CEF …

## Why Is Regression Called Regression and What Does Regression-to-the-Mean Mean?

The term regression originates with Francis Galton’s (1886) study of height. Galton, who worked with samples of roughly-normally-distributed data on parents and children, noted that the CEF of a child’s …
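Galton's finding can be reproduced with a short simulation (the mean, spread, and heritability coefficient below are illustrative, not Galton's estimates): with imperfectly correlated, roughly normal heights, the slope of the CEF of child height given parent height is below one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Galton-style simulation; all numbers are invented.
mu = 68.0                                   # population mean height (inches)
parent = rng.normal(mu, 2.0, 20000)
child = mu + 0.5 * (parent - mu) + rng.normal(0, 1.7, parent.size)

# The slope of the CEF of child height given parent height is below 1:
# children of tall parents are tall, but closer to the mean.
slope = np.cov(parent, child)[0, 1] / np.var(parent, ddof=1)
print(round(slope, 2))  # close to 0.5
```

The slope below one is a statistical property of imperfect correlation, not a causal force pulling heights toward the middle.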

## Fuzzy RD is IV

Fuzzy RD exploits discontinuities in the probability or expected value of treatment conditional on a covariate. The result is a research design where the discontinuity becomes an instrumental variable for …

## Local Average Treatment Effects

In an IV framework, the engine that drives causal inference is the instrument, z_i, but the variable of interest is still D_i. This feature of the IV setup leads us …

## Serial Correlation in Panels and Difference-in-Difference Models

Serial correlation - the tendency for one observation to be correlated with those that have gone before - used to be Somebody Else’s Problem, specifically, the unfortunate souls who make …

## The Bias of 2SLS*

It is a fortunate fact that the OLS estimator is not only consistent, it is also unbiased. This means that in a sample of any size, the estimated OLS coefficient …

## Appendix: Derivation of the average derivative formula

Begin with the regression of Y_i on S_i: Cov(Y_i, S_i)/V(S_i) = E[h(S_i)(S_i − E[S_i])] / E[S_i(S_i − E[S_i])]. Let h_−∞ = lim_{t→−∞} h(t). By the fundamental …
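A hedged sketch of where this derivation heads, assuming the standard weighting-function form of the average derivative (the definition of $\mu_t$ below is my reconstruction, not a quote from the text):

```latex
\frac{\mathrm{Cov}(Y_i, S_i)}{V(S_i)}
  = \frac{E\!\left[h(S_i)\,(S_i - E[S_i])\right]}{E\!\left[S_i\,(S_i - E[S_i])\right]}
  = \frac{\int h'(t)\,\mu_t\,dt}{\int \mu_t\,dt},
\qquad
\mu_t = E\!\left[\,S_i - E[S_i] \mid S_i \ge t\,\right] P(S_i \ge t),
```

using $h(S_i) = h_{-\infty} + \int_{-\infty}^{S_i} h'(t)\,dt$ and reversing the order of integration, so the regression slope is a weighted average of the derivative $h'(t)$.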

## Quantile Regression

Here’s a prayer for you. Got a pencil? . . . ‘Protect me from knowing what I don’t need to know. Protect me from even knowing that there are things …

## The Compliant Subpopulation

The LATE framework partitions any population with an instrument into a set of three instrument-dependent subgroups, defined by the manner in which members of the population react to the instrument: …

## Fewer than 42 clusters

Bias from few clusters is a risk in both the Moulton and the serial correlation contexts because in both cases inference is cluster-based. With few clusters, we tend to underestimate …

## Regression and Causality

Section 3.1.2 shows how regression gives the best (MMSE) linear approximation to the CEF. This understanding, however, does not help us with the deeper question of when regression has a …

## Appendix

Derivation of Equation (4.6.8). Rewrite equation (4.6.7) as follows: Y_ij = ρ* + κ_0 t_i + (π_0 + π_1 x_j)S_j + v_ij, where t_i = S_i − S̄_j. Since t_i …

## Instrumental Variables in Action: Sometimes You Get What You Need

Anything that happens, happens. Anything that, in happening, causes something else to happen, causes something else to happen. Anything that, in happening, causes itself to happen again, happens again. It …

## The Quantile Regression Model

The starting point for quantile regression is the conditional quantile function (CQF). Suppose we are interested in the distribution of a continuously-distributed random variable, Y_i, with a well-behaved density (no …
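The defining property behind quantile regression is that minimizing expected "check" loss recovers a quantile. A minimal numpy sketch (the outcome distribution and grid are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(10.0, 2.0, 100000)   # invented outcome distribution
tau = 0.5                            # quantile of interest (the median)

def check_loss(q, y, tau):
    """Average Koenker-Bassett 'check' loss rho_tau(y - q)."""
    u = y - q
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

# Minimizing expected check loss over candidate values q recovers the
# tau-quantile of y; here a grid search stands in for the minimization.
grid = np.linspace(5, 15, 2001)
losses = [check_loss(q, y, tau) for q in grid]
q_hat = grid[int(np.argmin(losses))]
print(round(q_hat, 1))  # close to 10.0, the median of the simulated y
```

Quantile regression replaces the scalar candidate q with a linear index in covariates, minimizing the same loss.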

## IV in Randomized Trials

The language of the LATE framework is based on an analogy between IV and randomized trials. But some instruments really come from randomized trials. If the instrument is a randomly …

## Appendix: Derivation of the simple Moulton factor

Write … and … where ι_g is a column vector of n_g ones and G is the number of groups. Note that … Let τ_g = 1 + (n_g − 1)ρ, so …
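The factor τ_g = 1 + (n_g − 1)ρ visible in this fragment is easy to evaluate. A quick sketch (the intraclass correlation and group size are invented numbers): even modest within-group correlation inflates sampling variances substantially when groups are large.

```python
import math

# Simple Moulton factor for equal-sized groups; rho and n_g are invented.
rho = 0.1   # intraclass correlation of the regression errors
n_g = 50    # observations per group
tau_g = 1 + (n_g - 1) * rho          # variance inflation for a group-level regressor
moulton_factor = math.sqrt(tau_g)
print(round(moulton_factor, 2))      # 2.43: conventional SEs understated ~2.4x
```

Standard errors scale with the square root of the variance, so conventional standard errors should be multiplied by √τ_g, here about 2.4.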

## The Omitted Variables Bias Formula

The omitted variables bias (OVB) formula describes the relationship between regression estimates in models with different sets of control variables. This important formula is often motivated by the notion that …
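The OVB formula is an exact in-sample identity of OLS, which a short simulation can verify (the variable names and coefficients are invented for illustration): the short-regression coefficient equals the long-regression coefficient plus the effect of the omitted variable times the slope from regressing the omitted variable on the included one.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000

# Illustrative data: schooling s, an omitted "ability" variable a, wages y.
a = rng.normal(0, 1, n)
s = 0.7 * a + rng.normal(0, 1, n)
y = 1.0 * s + 0.5 * a + rng.normal(0, 1, n)

def slope(x, w):
    """Bivariate OLS slope of w on x."""
    return np.cov(x, w)[0, 1] / np.var(x, ddof=1)

# Long regression: y on a constant, s, and a.
X_long = np.column_stack([np.ones(n), s, a])
b_long = np.linalg.lstsq(X_long, y, rcond=None)[0]  # [const, coef on s, coef on a]

# Short regression of y on s alone, plus the auxiliary regression of a on s.
beta_short = slope(s, y)
delta = slope(s, a)

# OVB: short = long + (effect of omitted) x (regression of omitted on included).
gap = beta_short - (b_long[1] + b_long[2] * delta)
print(abs(gap) < 1e-8)  # True: the formula holds exactly in sample
```

Because the decomposition is algebraic rather than asymptotic, it holds in every sample, not just in expectation.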

## Parallel Worlds: Fixed Effects, Differences-in-differences, and Panel Data

The first thing to realize about parallel universes... is that they are not parallel. Douglas Adams, Mostly Harmless (1995) The key to causal inference in chapter 3 is control for …

## IV and causality

We like to tell the IV story in two iterations, first in a restricted model with constant effects, then in a framework with unrestricted heterogeneous potential outcomes, in which case …

## Censored Quantile Regression

Quantile regression allows us to look at features of the conditional distribution of Y_i when part of the distribution is hidden. Suppose you have data of the form Y_i,obs …

## Counting and Characterizing Compliers

We’ve seen that, except in special cases, each instrumental variable identifies a unique causal parameter, one specific to the subpopulation of compliers for that instrument. Different valid instruments for the …
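The size of the complier group is identified by the first stage. A minimal simulation (the population shares are invented): under monotonicity, the difference in treatment rates between instrument-on and instrument-off groups equals the complier share.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100000

# Invented population: 30% always-takers, 40% compliers, 30% never-takers.
types = rng.choice(["always", "complier", "never"], size=n, p=[0.3, 0.4, 0.3])
z = rng.integers(0, 2, n)   # randomly assigned binary instrument

# Take-up by type: always-takers take treatment regardless of z,
# compliers only when z = 1, never-takers never.
d = (types == "always") | ((types == "complier") & (z == 1))

# Under monotonicity, the first stage identifies the complier share:
complier_share = d[z == 1].mean() - d[z == 0].mean()
print(round(complier_share, 2))  # close to 0.40
```

Always-takers and never-takers cancel out of the difference, which is why only compliers contribute to the first stage.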

## Bad Control

We’ve made the point that control for covariates can make the CIA more plausible. But more control is not always better. Some variables are bad controls and should not be …

## Individual Fixed Effects

One of the oldest questions in Labor Economics is the connection between union membership and wages. Do workers whose wages are set by collective bargaining earn more because of this, …

## Two-Stage Least Squares

The reduced-form equation, (4.1.4b), can be derived by substituting the first stage equation, (4.1.4a), into the causal relation of interest, (4.1.6), which is also called a “structural equation” in simultaneous …
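The substitution described here implies that the reduced-form effect equals the first-stage effect times the structural coefficient, so their ratio recovers the causal effect. A hedged numerical sketch (all coefficients are invented; the setup assumes constant effects and a valid instrument):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100000

# Illustrative constant-effects setup; every coefficient is made up.
z = rng.integers(0, 2, n).astype(float)        # binary instrument
u = rng.normal(0, 1, n)                        # unobserved confounder
d = 0.5 * z + 0.8 * u + rng.normal(0, 1, n)    # first-stage equation
y = 2.0 * d + 1.0 * u + rng.normal(0, 1, n)    # structural equation; true effect = 2

def slope(x, w):
    return np.cov(x, w)[0, 1] / np.var(x, ddof=1)

ols = slope(d, y)                    # biased upward: d is correlated with u
first_stage = slope(z, d)            # effect of z on d
reduced_form = slope(z, y)           # effect of z on y
iv = reduced_form / first_stage      # indirect least squares / Wald estimate
print(round(ols, 1), round(iv, 1))   # OLS overshoots 2; IV is close to 2
```

With a single instrument and a single endogenous regressor, this ratio of reduced form to first stage coincides with the 2SLS estimate.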

## The Quantile Regression Approximation Property*

The CQF of log wages given schooling is unlikely to be exactly linear, so the assumptions of the original quantile regression model fail to hold in this example. Luckily, quantile …

## Generalizing LATE

The LATE theorem applies to a stripped-down causal model where a single dummy instrument is used to estimate the impact of a dummy treatment with no covariates. We can generalize …
