
Regression & Log linear models

Regression models can be used to approximate the given data. In simple linear regression, the data are modeled to fit a straight line:

 y = wx + b

where 

y - response variable

x - predictor variable

w, b - regression coefficients

The coefficients specify the slope of the line (w) and the y-intercept (b). The method of least squares can be used to solve for the coefficients.
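As a minimal sketch of the least-squares fit (assuming NumPy is available; the x and y arrays below are hypothetical sample values used only for illustration):

import numpy as np

# Hypothetical predictor and response values
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

# Closed-form least-squares estimates:
# w = cov(x, y) / var(x), b = mean(y) - w * mean(x)
x_mean, y_mean = x.mean(), y.mean()
w = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
b = y_mean - w * x_mean

print(f"y = {w:.3f}x + {b:.3f}")  # the fitted straight line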

Log-linear models approximate discrete multidimensional probability distributions. They can be used to estimate the probability of each point in a multidimensional space for a set of discretized attributes, based on a smaller subset of dimensional combinations. This allows a higher-dimensional data space to be constructed from lower-dimensional spaces.
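As an illustration of the idea, here is a minimal sketch of the simplest log-linear model, which treats the attributes as independent so that each joint cell probability is estimated from the product of one-dimensional marginals (the counts below are hypothetical):

import numpy as np

# Hypothetical 2-D contingency table of counts for two discretized attributes
counts = np.array([[30.0, 10.0],
                   [20.0, 40.0]])
joint = counts / counts.sum()   # observed joint distribution

# Lower-dimensional (1-D) marginals
p_a = joint.sum(axis=1)         # marginal over attribute A
p_b = joint.sum(axis=0)         # marginal over attribute B

# Independence model: reconstruct the 2-D joint distribution
# from the 1-D marginals via an outer product
estimate = np.outer(p_a, p_b)

print(estimate)                 # estimated probability of each cell

Richer log-linear models use higher-order marginals (pairs or triples of attributes) in the same spirit, but this independence model already shows how a higher-dimensional distribution can be built from lower-dimensional ones.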

Both regression and log-linear models can be used on sparse as well as skewed data. Regression, however, can become computationally intensive when applied to high-dimensional data.
