Showing posts from August, 2018
Week 7 (13/8/18 - 17/8/18)
DAY 1

Today we first started with ISB video lecture no. 3, which was about "Bayesian Learning". In this lecture we learnt about different distributions (Bernoulli, categorical, and continuous probability densities). Next we learnt about joint probability distributions and marginalisation, which was explained using the concepts of generative and discriminative models. After the break, Vikram sir gave us an overview of "Feature Engineering". Feature engineering is the process of using domain knowledge of the data to create features that make machine learning algorithms work. If feature engineering is done correctly, it increases the predictive power of machine learning algorithms by creating features from raw data that facilitate the machine learning process. After that, sir discussed the problems we were facing in EDA.

DAY 2

Today in the morning, we first had a ses
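As a small sketch of the marginalisation idea from the lecture (the joint table here is a toy example of my own, not from the course), summing a joint distribution over one variable recovers the marginal of the other:

```python
import numpy as np

# Hypothetical joint distribution P(Weather, Activity) as a 2x2 table.
# Rows: weather = [sunny, rainy]; columns: activity = [walk, read].
joint = np.array([[0.40, 0.10],
                  [0.05, 0.45]])

# Marginalisation: sum out one variable to get the other's distribution.
p_weather = joint.sum(axis=1)   # P(Weather): sum over activities
p_activity = joint.sum(axis=0)  # P(Activity): sum over weather

print(p_weather)   # [0.5 0.5]
print(p_activity)  # [0.45 0.55]

# Conditioning via the product rule: P(Activity | Weather = sunny)
p_act_given_sunny = joint[0] / p_weather[0]
print(p_act_given_sunny)  # [0.8 0.2]
```

The same axis-wise summing generalises to joint tables over more variables, which is how a generative model's joint distribution yields the marginals a discriminative model works with.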
Week 6 (6/8/2018 - 10/8/2018)
ISB Videos

Continuing our ISB course on Unsupervised Learning, we completed 2 videos this week whose topics are as follows:

Introduction to Bayesian Learning

Bayes' Theorem: Bayes' theorem describes how the conditional probability of an event or a hypothesis can be computed using evidence and prior knowledge. Bayes' theorem is given by:

P(θ | X) = P(X | θ) P(θ) / P(X)

P(θ) - Prior probability is the probability of the hypothesis θ being true before applying Bayes' theorem. The prior represents the beliefs we have gained through past experience, which refers to either common sense or the outcome of Bayes' theorem for some past observations.

P(X | θ) - Likelihood is the conditional probability of the evidence given the hypothesis.

P(X) - Evidence term denotes the probability of the evidence or data.

Types of distributions: Binomial distribution, Bernoulli distribution, Mutin
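To make the theorem concrete, here is a toy numerical application (the numbers are my own illustrative assumptions, not from the video): a disease test where the hypothesis θ is "has the disease" and the evidence X is "tested positive".

```python
# Toy application of Bayes' theorem: P(theta | X) = P(X | theta) * P(theta) / P(X)
# Assumed numbers: 1% disease prevalence, a test with 99% sensitivity
# and a 5% false-positive rate.
prior = 0.01                 # P(theta): prior probability of disease
likelihood = 0.99            # P(X | theta): P(positive | disease)
false_positive_rate = 0.05   # P(positive | no disease)

# Evidence P(X): marginalise over both hypotheses (disease / no disease)
evidence = likelihood * prior + false_positive_rate * (1 - prior)

posterior = likelihood * prior / evidence
print(round(posterior, 3))  # 0.167
```

Even with a sensitive test, a positive result only raises the probability of disease to about 17%, because the prior is so low; this is the sense in which the posterior blends evidence with prior knowledge.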
Week 5 (30/7/18 - 3/8/18 )
Hands-on ML Algorithms

During the session with Vikram sir, we did hands-on work with various machine learning algorithms, namely linear regression and logistic regression. We implemented logistic regression on a dataset provided by sir. The steps are as follows:

1. Perform EDA on the dataset.
2. Extract important features from the dataset using feature engineering techniques.
3. Model building.

EDA: Exploratory Data Analysis helps to remove noise and handle missing values in the dataset, and gives a better insight into the data.

Feature Engineering: Feature engineering is the process of using domain knowledge of the data to create features that make machine learning algorithms work. If feature engineering is done correctly, it increases the predictive power of machine learning algorithms by creating features from raw data that facilitate the machine learning process.

Model Building: Python's scikit-learn library has various built-in ML algorithms which are efficient enough. We l
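The three steps above can be sketched end-to-end with scikit-learn. This is a minimal illustration on a synthetic dataset of my own (not the dataset sir provided), with a simple missing-value check standing in for EDA and feature scaling standing in for feature engineering:

```python
# Sketch of the EDA -> feature engineering -> model building flow
# using scikit-learn's LogisticRegression on synthetic data.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.normal(40, 10, 500),
    "income": rng.normal(50_000, 15_000, 500),
})
# Synthetic binary target with a linear decision boundary
df["bought"] = (df["age"] + df["income"] / 1000 > 90).astype(int)

# Step 1, EDA: confirm there are no missing values before modelling
assert df.isna().sum().sum() == 0

# Step 2, feature engineering: standardise features so both contribute
X_train, X_test, y_train, y_test = train_test_split(
    df[["age", "income"]], df["bought"], test_size=0.2, random_state=42)
scaler = StandardScaler().fit(X_train)

# Step 3, model building: fit and evaluate a logistic regression
model = LogisticRegression().fit(scaler.transform(X_train), y_train)
acc = accuracy_score(y_test, model.predict(scaler.transform(X_test)))
print(f"test accuracy: {acc:.2f}")
```

Fitting the scaler on the training split only, then applying it to the test split, avoids leaking test-set statistics into training.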