Friday, December 1, 2017

Machine Learning

Machine Learning Notes
(It is a work in progress.)

    The goal of machine learning is to generalize from known data to unseen cases.

Classic Problem
    Normal Programming: "Hello world"
    Machine Learning:  MNIST

    Problems --> Tools --> Metrics  (apply to all problems?)
    Data to generalize --> Use different algorithms --> Monitor performance of algorithms and adjust

Key Words
        Discrete output
        Continuous numeric output
     Gradient descent, Backpropagation, Cost function,
           Any loss consisting of the negative log-likelihood between the empirical distribution
           defined by the training set and the probability distribution defined by the model. For example,
           mean squared error is the cross-entropy between the empirical distribution and a Gaussian model.
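As a sketch of that last claim: for a model that predicts a Gaussian mean \(\hat{y}(x)\) with fixed variance \(\sigma^2\), the per-example negative log-likelihood is

```latex
% Negative log-likelihood of a Gaussian with mean \hat{y}(x) and fixed variance \sigma^2:
-\log p(y \mid x) \;=\; \frac{\bigl(y - \hat{y}(x)\bigr)^2}{2\sigma^2} \;+\; \frac{1}{2}\log\bigl(2\pi\sigma^2\bigr)
% The second term is constant in \hat{y}, so minimizing this over the
% training set is the same as minimizing mean squared error.
```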

     Activation function
           Step function
                discrete 0, 1
           Sigmoid function
           Tanh function

           Rectified Linear function (ReLU)
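As an illustrative sketch (the class and method names below are my own, not from these notes), the four activation functions above in plain Java:

```java
public class Activations {
    // Step: discrete 0/1 output, used by the classic perceptron
    static double step(double x)    { return x >= 0 ? 1.0 : 0.0; }
    // Sigmoid: smooth squashing to (0, 1)
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
    // Tanh: smooth squashing to (-1, 1), zero-centered
    static double tanh(double x)    { return Math.tanh(x); }
    // ReLU: identity for positive inputs, zero otherwise
    static double relu(double x)    { return Math.max(0.0, x); }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0)); // 0.5
        System.out.println(relu(-3.0));   // 0.0
    }
}
```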

     Training data set
           Train parameter
     Validation data set
           Train Hyperparameter
     Test data set
     Bias, Variance
         Linked to capacity, underfitting, overfitting

     Closed-form solution

     Weight, Bias, Learning rate
           For example, the learning rate is a hyperparameter that controls the step size of each gradient-descent update
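To make the interplay of weight, bias, and learning rate concrete, here is a small sketch (class name and toy data are mine) of batch gradient descent fitting a line:

```java
public class GradientDescentDemo {
    // Fit y ≈ w*x + b by batch gradient descent on mean squared error.
    static double[] fit(double[] x, double[] y, double lr, int epochs) {
        double w = 0, b = 0;                     // weight and bias, initialized to 0
        for (int e = 0; e < epochs; e++) {
            double dw = 0, db = 0;
            for (int i = 0; i < x.length; i++) {
                double err = (w * x[i] + b) - y[i]; // prediction error
                dw += err * x[i];                   // gradient w.r.t. the weight
                db += err;                          // gradient w.r.t. the bias
            }
            w -= lr * dw / x.length;             // learning rate scales each step
            b -= lr * db / x.length;
        }
        return new double[]{w, b};
    }

    public static void main(String[] args) {
        double[] x = {0, 1, 2, 3, 4};
        double[] y = {1, 3, 5, 7, 9};            // generated from y = 2x + 1
        double[] wb = fit(x, y, 0.05, 5000);
        System.out.printf("w=%.2f b=%.2f%n", wb[0], wb[1]); // approaches w=2, b=1
    }
}
```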

    Kernel trick
    Maximum likelihood estimation
             Point estimate of variables

     Bayesian estimation
             Full distribution of variables

     Regularization
         Modification to an ML algorithm, intended to reduce generalization error, not training error
         Example: weight decay for linear regression
         Goal: a small gap between training error and test error
     Supervised Learning
            features + labels
             Nonprobabilistic SL
                   K-Nearest Neighbor
                   Decision Tree
     Unsupervised Learning
            features without labels
     Reinforcement Learning
            An agent learns from rewards received while interacting with an environment
     Preprocessing
          Modify or filter data before feeding it to learning algorithms
          Feature selection
          Feature extraction
          Dimension reduction (PCA, manifold learning)
          Kernel approximation

    Cross-validation schemes
         Stratified K-fold
         Leave-one-out (small amount of data)
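A minimal sketch of plain (unstratified) K-fold index splitting in Java — the class and method names are mine, for illustration only:

```java
import java.util.*;

public class KFoldDemo {
    // Split indices 0..n-1 into k folds; each fold serves once as the validation set.
    static List<int[][]> kFold(int n, int k, long seed) {
        Integer[] idx = new Integer[n];
        for (int i = 0; i < n; i++) idx[i] = i;
        Collections.shuffle(Arrays.asList(idx), new Random(seed)); // randomize order
        List<int[][]> splits = new ArrayList<>();
        for (int f = 0; f < k; f++) {
            List<Integer> train = new ArrayList<>(), val = new ArrayList<>();
            for (int i = 0; i < n; i++)
                (i % k == f ? val : train).add(idx[i]); // every k-th index goes to fold f
            splits.add(new int[][]{
                train.stream().mapToInt(Integer::intValue).toArray(),
                val.stream().mapToInt(Integer::intValue).toArray()});
        }
        return splits;
    }

    public static void main(String[] args) {
        // 10 samples, 5 folds: each validation fold holds 2 indices
        for (int[][] s : kFold(10, 5, 42L))
            System.out.println(Arrays.toString(s[1]));
    }
}
```

Stratified K-fold additionally keeps the class proportions the same in every fold; leave-one-out is the special case k = n.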

Math behind ML


    Linear Regression
        Find optimal weights by solving normal equations
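Concretely, the normal equations come from setting the gradient of the squared error to zero:

```latex
% Least squares: minimize \|Xw - y\|^2 over the weights w.
% Setting the gradient to zero gives the normal equations:
X^\top X \, \hat{w} = X^\top y
\quad\Longrightarrow\quad
\hat{w} = \bigl(X^\top X\bigr)^{-1} X^\top y
% (assuming X^\top X is invertible)
```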

    Logistic Regression
          No closed-form solution. Maximize the log-likelihood (equivalently, minimize the negative log-likelihood) with gradient descent.

    Neural Network
    RNN (Recurrent Neural Network)
    CNN (Convolutional Neural Network)

    Decision Tree

    Identification Tree

    Naive Bayes
           Assumes features are conditionally independent of each other given the class
           Conditional probability model
           Highly scalable; requires only a small amount of training data
           Linear time complexity
           Often outperformed by more sophisticated algorithms, e.g. SVMs
    Support Vector Machines

    Random Forest

Test Methodologies
   Leave-one-out (LOO)
       for a small amount of data

   Data split (80/20)

Tools
    TensorFlow, Scikit-learn
    Spark MLlib, Spark ML, Weka

Use cases
    Linear Regression
          House size---> House price in a community
    Naive Bayes
          Document classification: separate legitimate emails from spam emails
          For example, based on key words: cheap, free
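A toy sketch of that spam example: multinomial Naive Bayes with add-one (Laplace) smoothing over word counts. The class name and the tiny training set are made up for illustration:

```java
import java.util.*;

public class SpamNB {
    Map<String, Integer> spamCounts = new HashMap<>(), hamCounts = new HashMap<>();
    int spamTotal = 0, hamTotal = 0, spamDocs = 0, hamDocs = 0;
    Set<String> vocab = new HashSet<>();

    void train(String doc, boolean spam) {
        for (String w : doc.toLowerCase().split("\\s+")) {
            vocab.add(w);
            if (spam) { spamCounts.merge(w, 1, Integer::sum); spamTotal++; }
            else      { hamCounts.merge(w, 1, Integer::sum);  hamTotal++; }
        }
        if (spam) spamDocs++; else hamDocs++;
    }

    // Log P(class) + sum of log P(word | class), with add-one smoothing.
    double logProb(String doc, boolean spam) {
        Map<String, Integer> counts = spam ? spamCounts : hamCounts;
        int total = spam ? spamTotal : hamTotal;
        double lp = Math.log((spam ? spamDocs : hamDocs) / (double) (spamDocs + hamDocs));
        for (String w : doc.toLowerCase().split("\\s+"))
            lp += Math.log((counts.getOrDefault(w, 0) + 1.0) / (total + vocab.size()));
        return lp;
    }

    boolean isSpam(String doc) { return logProb(doc, true) > logProb(doc, false); }

    public static void main(String[] args) {
        SpamNB nb = new SpamNB();
        nb.train("cheap pills free offer", true);
        nb.train("free money cheap loans", true);
        nb.train("meeting agenda for monday", false);
        nb.train("lunch plans this friday", false);
        System.out.println(nb.isSpam("free cheap offer"));     // true
        System.out.println(nb.isSpam("monday meeting lunch")); // false
    }
}
```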

    When to use which algorithm(s)?

Famous Applications
       AlphaGo vs Lee Sedol

        Netflix movie recommendations

     No ML algorithm is universally better than any other (the "no free lunch" theorem).
     Understand the data distribution, and pick the proper algorithm(s).

    Machine learning series from Luis Serrano  (best explanations)

    (AWS machine learning service)

    (Spark MLlib example)


Tuesday, January 31, 2017

String valueOf() pitfalls

What will the console output of this program be?

public class TestStringValueOf {

    public static void main(String[] args) {
        char a = 'a';
        String str1 = String.valueOf(a);
        String str2 = String.valueOf(a);
        System.out.println("char comparison:" + (str1 == str2));

        double d = 12.3d;
        String str3 = String.valueOf(d);
        String str4 = String.valueOf(d);
        System.out.println("double comparison:" + (str3 == str4));

        boolean b = false;
        String str5 = String.valueOf(b);
        String str6 = String.valueOf(b);
        System.out.println("boolean comparison:" + (str5 == str6));

        Object o = null;
        String str7 = String.valueOf(o);
        String str8 = String.valueOf(o);
        System.out.println("Object null comparison:" + (str7 == str8));

        Object notNull = new Object();
        String str9 = String.valueOf(notNull);
        String str10 = String.valueOf(notNull);
        System.out.println("Object Not null comparison:" + (str9 == str10));
    }
}

See the end of this article for the output.

Overall, string comparison should use equals() rather than ==, no matter how the String objects were created. The results differ because String.valueOf(boolean) returns the interned literals "true"/"false", and String.valueOf(Object) returns the literal "null" for a null argument, so repeated calls yield the same object; the char, double, and non-null Object overloads build a new String on every call.

-------console output----------

char comparison:false
double comparison:false
boolean comparison:true
Object null comparison:true
Object Not null comparison:false