
Function - One to One & Onto

One to One

A function f from A to B is called one-to-one (or 1-1) if whenever f(a) = f(b) then a = b. No element of B is the image of more than one element of A.

In a one-to-one function, given any y there is only one x that can be paired with the given y.

f is one-to-one (injective) if distinct elements of A are mapped to distinct elements of B. In other words, no element of B is mapped to by two or more elements of A.
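For finite sets, this definition can be checked directly: a function is one-to-one exactly when no two different inputs share the same image. A minimal Python sketch, assuming the function is represented as a dict from A to B (the names here are illustrative, not from the post):

```python
def is_one_to_one(f):
    """f is a dict mapping each element of A to an element of B.
    One-to-one means no element of B appears as the image of two keys."""
    images = list(f.values())
    return len(images) == len(set(images))

# f maps {1, 2, 3} into {'a', 'b', 'c', 'd'}; every image is distinct, so f is 1-1.
f = {1: 'a', 2: 'b', 3: 'c'}
print(is_one_to_one(f))   # True

# g sends 1 and 2 to the same image 'a', so g is not 1-1.
g = {1: 'a', 2: 'a', 3: 'c'}
print(is_one_to_one(g))   # False
```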

Onto

A function f from A to B is called onto if for all b in B there is an a in A such that f(a) = b. All elements of B are used.

f is onto (surjective) if every element of B is mapped to by some element of A. In other words, nothing in B is left out.
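The onto condition can be checked the same way on finite sets: every element of the codomain B must appear as an image. A small sketch, again using a hypothetical dict representation for illustration:

```python
def is_onto(f, B):
    """f is a dict from A to B; onto means every element of B is some f(a)."""
    return set(f.values()) == set(B)

B = {'a', 'b', 'c'}
f = {1: 'a', 2: 'b', 3: 'c', 4: 'a'}   # all of B is used -> onto
g = {1: 'a', 2: 'b', 3: 'b'}           # 'c' is never an image -> not onto
print(is_onto(f, B))   # True
print(is_onto(g, B))   # False
```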

f is one-to-one onto (bijective) if it is both one-to-one and onto. In this case the map f is also called a one-to-one correspondence.

Notice that “f is one-to-one” is asserting uniqueness, while “f is onto” is asserting existence.
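Combining the two conditions gives a test for a one-to-one correspondence: each element of A is paired with exactly one element of B, and every element of B is used. A self-contained sketch (helper name and example data are assumptions):

```python
def is_bijection(f, B):
    """One-to-one correspondence: no repeated images, and every element of B is hit."""
    images = list(f.values())
    return len(images) == len(set(images)) and set(images) == set(B)

h = {1: 'c', 2: 'a', 3: 'b'}
print(is_bijection(h, {'a', 'b', 'c'}))   # True: h pairs A and B exactly
```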

Let A and B be two finite sets such that there is a function f: A -> B. Then the following theorems hold:

If f is one-to-one, then |A| ≤ |B|.

If f is onto, then |A| ≥ |B|.

If f is both one-to-one and onto then |A| = |B|.
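These follow by counting images: a one-to-one f produces |A| distinct elements of B, so |A| ≤ |B|; an onto f covers all of B with at most |A| images, so |A| ≥ |B|; a bijection forces both, hence |A| = |B|. A quick numerical illustration with hypothetical sets, just to make the counting concrete:

```python
# f: A -> B is one-to-one; its 3 distinct images force |A| <= |B|.
A = {1, 2, 3}
B = {'a', 'b', 'c', 'd'}
f = {1: 'a', 2: 'b', 3: 'd'}
assert len(set(f.values())) == len(A) <= len(B)

# g: A -> C is onto; its images cover C, so |C| <= |A|.
C = {'x', 'y'}
g = {1: 'x', 2: 'y', 3: 'x'}
assert set(g.values()) == C and len(C) <= len(A)
```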
