Posts

Showing posts from July, 2023

NMLS - 1. Clustering - Kmeans - Unsupervised Learning - Introductory Machine Learning

In 2022 I started the new Machine Learning Specialization by Andrew Ng and completed its first two courses that year. I'll share the Jupyter notebooks from this specialization soon. Now, in 2023, when I am finally free from my M1, I am going to complete the specialization. Here is the first assignment of course 3 (the last course), on clustering. Procedure:
-   Distances between each data point and all of the cluster centroids were computed
-   The cluster centroid closest to each data point was assigned to that data point
-   The average positions of all of the data points in a cluster were computed
-   The corresponding centroids were moved to these new average positions
-   The procedure was repeated for a specific number of iterations (see the sketch after this list)

Relevant github link
Relevant video explanation (in Urdu)
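For reference, here is a minimal NumPy sketch of the loop described above; the names (X for the data, K for the number of clusters, run_kmeans, etc.) are placeholders of my own and may differ from the actual assignment notebook:

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # Distance from every data point to every centroid; assign the nearest one
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

def compute_centroids(X, idx, K):
    # Move each centroid to the average position of the points assigned to it
    return np.array([X[idx == k].mean(axis=0) for k in range(K)])

def run_kmeans(X, initial_centroids, max_iters=10):
    # Repeat assignment + update for a fixed number of iterations
    centroids = initial_centroids.copy()
    for _ in range(max_iters):
        idx = find_closest_centroids(X, centroids)
        centroids = compute_centroids(X, idx, centroids.shape[0])
    return centroids, idx

# Example usage on random 2-D data with K = 3
X = np.random.rand(300, 2)
initial = X[np.random.choice(len(X), 3, replace=False)]
centroids, idx = run_kmeans(X, initial, max_iters=10)
```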

CONVOLUTIONAL NETWORK from Scratch - Binary Classification

I used the world-renowned dogs-vs-cats dataset from Kaggle to train a convolutional neural network. My procedure was as follows:
-   I downloaded the dataset from Kaggle
-   Created directories and transferred the relevant data. I created an 'original_dataset_dir' and transferred all of the data (data of all classes) to this directory. I then created a base directory, 'base_dir', with the folders 'train', 'validation' and 'test' inside it, and the folders 'cats' and 'dogs' inside each of those. Then I transferred the data according to the following ratio: (Train, Validation, Test) = (50, 25, 25)
-   Initiated a small convolutional network. In the conv base, I created 4 successive 'Conv2D' layers, each followed by a 'MaxPooling2D' layer, without any padding. In the classifier base, I added a 'Flatten' layer, followed by two Dense layers (see the sketch after this list)
-   Compiled the mod...
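As a rough Keras sketch of the kind of model described above; the filter counts, the 150x150 input size, and the compile settings are my own assumptions, not necessarily the exact values from the notebook:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Conv base: four Conv2D layers, each followed by MaxPooling2D, no padding ('valid')
model = keras.Sequential([
    keras.Input(shape=(150, 150, 3)),          # assumed input size
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    # Classifier base: Flatten followed by two Dense layers
    layers.Flatten(),
    layers.Dense(512, activation='relu'),
    layers.Dense(1, activation='sigmoid'),     # binary output: cat vs dog
])

# One reasonable way to compile a binary classifier like this
model.compile(loss='binary_crossentropy',
              optimizer=keras.optimizers.RMSprop(learning_rate=1e-4),
              metrics=['accuracy'])
```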