
Kaggle is the world’s largest data science community, with powerful tools and resources to help you achieve your data science goals — Kaggle.com is one of the most popular websites amongst data scientists and machine learning engineers. Google Colab, for its part, is a great place for practising machine learning (it offers a free GPU, which you should explore using), and Kaggle is one of the best places for fetching a dataset. While building a deep learning model, the first task is to import the dataset, and this task sometimes proves to be very hectic. In this post we’ll see how to do all of this smoothly:

- Use a pre-trained Keras model (VGG16) to classify cat and dog images, achieving ~97% accuracy on the test dataset.
- Download a dataset from Kaggle directly into Google Colab.
- Link Google Colab to Drive, to save model weights directly into a Drive folder.
- Create a simple binary classifier — a small convolutional neural network in TensorFlow Keras — to tell malaria-infected cell images from uninfected ones.
- Use the ImageDataGenerator class from Keras for easily dealing with training and testing data.

If you are new to Kaggle, I would recommend using the “search” feature to look up some of the standard datasets out there, such as the Iris Species, Pima Indians Diabetes, Adult Census Income, Auto MPG, and Breast Cancer Wisconsin datasets; to start easily, just browse Datasets | Kaggle.

Downloading from Kaggle takes just a few steps. You need two things: your API token (key) — you can get this from Edit Profile -> Account -> API -> Create new API token — and the download command for the dataset, which you can easily get from the Kaggle page of the dataset you want: just click on Copy API command and paste it into a Colab cell. The token file kaggle.json goes into the .kaggle directory, and you should lock it down with chmod 600 so that it is not visible to other users on the system.
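Putting those steps together, a typical Colab cell sequence looks like the sketch below — the copy/chmod paths and the malaria-dataset download command are the ones used later in this post; the rest is the standard Kaggle CLI setup:

!pip install -q kaggle

# place your API token where the CLI expects it, and lock it down
# so it is not visible to other users on the system
!mkdir -p ~/.kaggle
!cp /content/.kaggle/kaggle.json ~/.kaggle/kaggle.json
!chmod 600 ~/.kaggle/kaggle.json

# list a certain category of datasets...
!kaggle datasets list -s sentiment

# ...then download one with the command copied from its Kaggle page
!kaggle datasets download -d iarunava/cell-images-for-detecting-malaria -p /content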
Let’s start with the cats-vs-dogs classifier. Complex models require huge amounts of data and very capable hardware (read: lots of memory and GPU!) to train on. A highly effective alternative is to leverage someone else’s work for your own benefit ;) — while this approach might be frowned upon in other fields, it is most welcome in deep learning. A pre-trained network is usually trained on a huge dataset for large-scale image-classification tasks, and the features it learns carry over to new problems; this portability of learnt features is the key to the success of pre-trained networks. Keras ships with most of the popular pre-trained models, which you can readily import into your project — look at the keras.applications package for the complete list. There are several pre-trained models available to experiment with, and two approaches to reusing them: feature extraction and fine-tuning. We will discuss both these approaches in this article.

Kaggle hosted a very popular contest in late 2013 to classify cat & dog images into the appropriate class. The datasets are available on the Kaggle website (you will need to create a Kaggle account, if you don’t have one). The train.zip archive consists of 25,000 images of different sizes — 12,500 of each class — which is a huge dataset to train a model on. (On Kaggle itself, the held-out set is called “test” because it is the dataset used by Kaggle to score each submission and make sure the model isn’t overfitted.) I trained my model on a much smaller dataset, consisting of randomly selected images: 5,000 training images, 1,000 eval images and 500 test images, each of cats and dogs. I wrote a separate Python script to create this smaller dataset from the 25,000 images in the train.zip archive, and have included the resulting images in the cats_vs_dogs_images_small.zip file on my Github repository. Unzip this to a convenient folder on your disk — say /tmp — to re-create the folder structure that the data generators expect.
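The original script is not reproduced here, but the idea is simple. Below is a minimal sketch under the assumptions stated above (5,000/1,000/500 images per class; files named cat.*.jpg / dog.*.jpg as in train.zip); the SRC and DEST paths are placeholders:

import os, random, shutil

SRC = '/tmp/train'                 # hypothetical: folder with the unzipped train.zip images
DEST = '/tmp/cats_vs_dogs_small'   # hypothetical output root
SPLITS = {'training': 5000, 'eval': 1000, 'test': 500}   # images per class, as above

random.seed(42)
for cls in ('cat', 'dog'):
    # train.zip names its files cat.0.jpg ... dog.12499.jpg
    files = [f for f in os.listdir(SRC) if f.startswith(cls)]
    random.shuffle(files)
    start = 0
    for split, count in SPLITS.items():
        out_dir = os.path.join(DEST, split, cls + 's')   # e.g. training/cats
        os.makedirs(out_dir, exist_ok=True)
        for fname in files[start:start + count]:
            shutil.copy(os.path.join(SRC, fname), os.path.join(out_dir, fname))
        start += count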
The VGG16 model is a widely used convnet architecture for ImageNet, proposed by K. Simonyan and A. Zisserman from the University of Oxford. In general, convnets used for image classification comprise two parts: they start with a series of Conv2D + MaxPooling2D layers (the convolutional base), which ends with a Flatten layer feeding the densely connected classifier. The VGG16 architecture is simple enough to understand, even for a novice deep learning practitioner. The following code is used to create the VGG16 model:

from tensorflow.keras.applications import VGG16

vgg_base = VGG16(weights='imagenet',        # use weights for ImageNet
                 include_top=False,         # convolutional base only
                 input_shape=(150, 150, 3))
print(vgg_base.summary())

It produces the following output — note that this is the structure of just the convolutional base (as we set include_top=False):

Model: "vgg16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 150, 150, 3)]     0
block1_conv1 (Conv2D)        (None, 150, 150, 64)      1792
block1_conv2 (Conv2D)        (None, 150, 150, 64)      36928
block1_pool (MaxPooling2D)   (None, 75, 75, 64)        0
block2_conv1 (Conv2D)        (None, 75, 75, 128)       73856
block2_conv2 (Conv2D)        (None, 75, 75, 128)       147584
block2_pool (MaxPooling2D)   (None, 37, 37, 128)       0
block3_conv1 (Conv2D)        (None, 37, 37, 256)       295168
block3_conv2 (Conv2D)        (None, 37, 37, 256)       590080
block3_conv3 (Conv2D)        (None, 37, 37, 256)       590080
block3_pool (MaxPooling2D)   (None, 18, 18, 256)       0
block4_conv1 (Conv2D)        (None, 18, 18, 512)       1180160
block4_conv2 (Conv2D)        (None, 18, 18, 512)       2359808
block4_conv3 (Conv2D)        (None, 18, 18, 512)       2359808
block4_pool (MaxPooling2D)   (None, 9, 9, 512)         0
block5_conv1 (Conv2D)        (None, 9, 9, 512)         2359808
block5_conv2 (Conv2D)        (None, 9, 9, 512)         2359808
block5_conv3 (Conv2D)        (None, 9, 9, 512)         2359808
block5_pool (MaxPooling2D)   (None, 4, 4, 512)         0
=================================================================
Total params: 14,714,688
Trainable params: 14,714,688
Non-trainable params: 0
_________________________________________________________________
None

For feature extraction, we freeze this convolutional base and train only a custom prediction layer on top. How do we freeze the convolutional base? Any layer can be frozen by setting its trainable parameter to False; with the base frozen, its weights are not updated during training — only the weights of the custom prediction layer get updated. After freezing the convolutional base, compile the model as you would any other Keras model; I am using the Adam optimizer and the binary crossentropy loss.
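Here is our model, with a custom prediction layer, as a sketch: the frozen vgg_base comes from the code above, while the Flatten + Dense(256) head and its layer sizes are my assumptions (the post names the Adam optimizer and binary crossentropy loss, not the exact head):

from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

def build_model():
    vgg_base = VGG16(weights='imagenet', include_top=False,
                     input_shape=(150, 150, 3))
    vgg_base.trainable = False            # freeze the convolutional base

    model = models.Sequential([
        vgg_base,                         # frozen feature extractor
        layers.Flatten(),                 # (4, 4, 512) -> 8192
        layers.Dense(256, activation='relu'),
        layers.Dense(1, activation='sigmoid'),   # cat vs dog
    ])
    model.compile(optimizer='adam',
                  loss='binary_crossentropy',
                  metrics=['acc'])
    return model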
We will train our model using image augmentation, via Keras’ ImageDataGenerator class (you can find the ImageDataGenerator documentation here). The generators “flow” images from the directories created above; note that no image augmentation is applied to the eval & test data generators. Here is the outline of that code:

# folder where I unzipped all my images...
# training images unzipped under this folder
# cross-validation images unzipped under this folder
# NOTE: no image aug for eval & test datagenerators

# Step-2: create generators, which 'flow' from directories
# create the generators pointing to folders created above
eval_generator = eval_datagen.flow_from_directory(...)
test_generator = test_datagen.flow_from_directory(...)

# train model on generator with batch size = 32
# Step-4: evaluate model's performance on train/eval/test datasets

We will train the model for 150 epochs, using a batch size of 32 — see the sketch of the training call below.
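Here is what that training call might look like with the modern model.fit API (the article’s era used fit_generator, which behaves the same way); the steps-per-epoch arithmetic simply divides the 10,000 training and 2,000 eval images by the batch size of 32:

model = build_model()
history = model.fit(
    train_generator,
    steps_per_epoch=10000 // 32,     # 10,000 training images / batch size 32
    epochs=150,
    validation_data=eval_generator,
    validation_steps=2000 // 32)     # 2,000 eval images / batch size 32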
Evaluating the frozen-base model (Step-4 above) produces output like this:

312/312 [......] - 96s 309ms/step - loss: 0.2598 - acc: 0.8940

In conclusion, we get a 92–93% accuracy on the test dataset with feature extraction, which is not bad — but fine-tuning can do better. We said previously that the convolutional base is frozen during training, such that its weights are not updated (its summary shows Trainable params: 0). Fine-tuning unfreezes just the top few layers of the base; the technique is called fine-tuning because it slightly adjusts the more abstract representations of the model being reused. Let’s unfreeze the top 3 layers — everything from block5_conv2 onward. Notice that the Trainable params then become 4,719,616, which is the sum of the Param # column for the block5_conv2, block5_conv3 and block5_pool layers: these layers will now be trained along with our prediction layer. I have included the un-freezing of the top layers in a second model-building function, and we train this model using exactly the same code as before — the only change is that we instantiate the model as model = build_model2(). A sketch of the un-freezing logic follows below.
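build_model2 itself is not reproduced here; the standard way to unfreeze everything from block5_conv2 upward looks like the sketch below. The very low learning rate is my assumption — fine-tuning conventionally nudges the unfrozen weights gently rather than re-training them from scratch:

vgg_base.trainable = True
set_trainable = False
for layer in vgg_base.layers:
    if layer.name == 'block5_conv2':
        set_trainable = True          # unfreeze from this layer onwards
    layer.trainable = set_trainable

# re-compile so the change takes effect; a very small learning rate
# only nudges the unfrozen weights instead of wrecking them
from tensorflow.keras.optimizers import Adam
model.compile(optimizer=Adam(learning_rate=1e-5),
              loss='binary_crossentropy',
              metrics=['acc'])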
Here are the results from training the fine-tuned model:

312/312 [.....] - 96s 307ms/step - loss: 0.0361 - acc: 0.9868

With this I was able to achieve ~97% accuracy on the test dataset. You can apply these techniques to any image classification problem — in fact, transfer learning should be the first thing you attempt. Why not also try it on the entire Kaggle dataset? We may be able to get higher accuracy if we create a bigger training dataset — say 15,000 images (7,500 each of cat & dog) instead of 10,000 images. Use my script and create a bigger dataset for training.

Now for the second project: classifying malaria cell images with a small CNN trained from scratch. The dataset is at https://www.kaggle.com/iarunava/cell-images-for-detecting-malaria; with the Kaggle CLI set up as at the top of this post, downloading it into Colab takes two commands:

!cp /content/.kaggle/kaggle.json ~/.kaggle/kaggle.json
!kaggle datasets download -d iarunava/cell-images-for-detecting-malaria -p /content

This dataset was not separated into training, validation and test sets, so we’ll have to split the data through code ourselves. The Keras data generator will take each sub-folder inside training and testing as a single class, and we only need two classes — infected and uninfected — so we’ll create a training folder and a testing folder, each containing an infected folder (with infected, or parasitized, cell images) and an uninfected folder (with uninfected cell images). We need this structure because we’ll be using an image data generator to flow our data from the directories into the model during training. After creating the directories, we extract the zip-file contents into the specific folders for training and testing, and define some global variables for the paths:

# Directory with our training uninfected pictures
# Directory with our training infected or parasitized pictures
train_infected_names = os.listdir(TRAINING_INFECTED_DIR)
train_uninfected_names = os.listdir(TRAINING_UNINFECTED_DIR)
test_infected_names = os.listdir(TESTING_INFECTED_DIR)

We’ll use these later to traverse through our infected and uninfected cell images and to list down each directory’s contents. Let’s explore the data through visualization to understand it better: iterating over the images in the training infected and uninfected directories, we display 8 infected and 8 uninfected cell images each time:

# Set up matplotlib fig, and size it to fit 4x4 pics
next_uninfected_pix = [os.path.join(TRAINING_UNINFECTED_DIR, fname)
                       for fname in train_uninfected_names[pic_index-8:pic_index]]

Next, we link Colab to Google Drive, so that model weights can be checkpointed straight into a Drive folder during training. Run the authorization cell, follow the prompt to authorize the connection to your Drive, and set up a checkpoint path:

!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
from oauth2client.client import GoogleCredentials
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}

checkpoint_path = "drive/app/malaria_detection/checkpoints/training.ckpt"
checkpoint_dir = os.path.dirname(checkpoint_path)

Let’s import tensorflow first, and then define the model architecture. We’ll use the sigmoid activation function on the output layer, since we only need 0/1 as outputs to classify into the two classes of infected and uninfected.
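As a concrete illustration, here is a small CNN consistent with what is described — 80x80 RGB inputs, a few Conv2D + MaxPooling2D blocks ending in Flatten, and a single sigmoid output unit. The filter counts and dense-layer size are my assumptions, not the post’s exact architecture:

import tensorflow as tf
from tensorflow.keras.optimizers import RMSprop

model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu',
                           input_shape=(80, 80, 3)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation='relu'),
    # single unit + sigmoid: output near 0 for one class, near 1 for the other
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer=RMSprop(learning_rate=0.001),
              loss='binary_crossentropy',
              metrics=['acc'])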
Before we run the model, we have to compile it with an optimizer, a loss function and metrics (from tensorflow.keras.optimizers import RMSprop):

- Loss function: a mathematical way of measuring how wrong your predictions are. The loss function is the guide to the terrain, telling the optimizer when it’s moving in the right or wrong direction. During the training process, we tweak and change the parameters (weights) of our model to try and minimize that loss function and make our predictions as correct as possible — the lower the loss, the better the predictions. You can think of a hiker trying to get down a mountain with a blindfold on: small steps guided by the slope eventually lead her downwards.
- RMSprop: there are different kinds of optimizer algorithms; we use RMSprop here.
- lr: the learning rate of the optimizer — in simple terms, it defines how much the parameters should be tweaked in each cycle. Too high a learning rate makes each jump so large that we may skip over the minimum; at the same time, keeping the learning rate too low adds computational cost and time.
- epochs: the number of times we want the model to look at the entire dataset.

A callback is a set of functions to be applied at given stages of the training procedure; you can use callbacks to get a view on internal states and statistics of the model during training. We want to save our model weights after every few epochs directly into the Drive folder, so we use Keras’ checkpoint callback, pointed at the checkpoint directory defined above and saving weights every 3 epochs. Let’s break down the parameters for the checkpoint callback function a bit — see the sketch below.
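A sketch of that callback — save_weights_only and verbose are my assumptions, and period=3 mirrors the “every 3 epochs” behaviour described above (newer Keras versions spell this save_freq instead):

from tensorflow.keras.callbacks import ModelCheckpoint

checkpoint_callback = ModelCheckpoint(
    filepath=checkpoint_path,    # "drive/app/malaria_detection/checkpoints/training.ckpt"
    save_weights_only=True,      # save just the weights, not the whole model
    verbose=1,
    period=3)                    # write a checkpoint every 3 epochs

# later, pass it to training: model.fit(..., callbacks=[checkpoint_callback])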
Now we are left with one main thing: training. We set up the training data generator with Keras’ ImageDataGenerator class:

training_directory = '/content/cell-images-for-detecting-malaria/cell_images/training/'
train_datagen = ImageDataGenerator(rescale=1/255)

- rescale: all images will be rescaled by 1./255;
- training_directory: this is the source directory for training images;
- target_size: all images will be resized to 80x80;
- batch_size: the number of images in one batch of the optimizer-loss cycle;
- train_generator: our training images will flow through this to the model.

Since we have 13780 * 2 = 27560 training images and a batch size of 256, we’ll need 27560 / 256 ≈ 108 steps per epoch. Note that we did not use any validation set here. Once training is done, we can plot how the model learned and improved over time: accuracy and loss can be retrieved from history.history, which has the acc and loss keys — see the plotting sketch below.
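A plotting sketch built from the fragments in the original listing (epochs = range(len(acc))):

import matplotlib.pyplot as plt

acc = history.history['acc']       # accuracy per epoch
loss = history.history['loss']     # loss per epoch
epochs = range(len(acc))           # get number of epochs

plt.plot(epochs, acc, label='training accuracy')
plt.plot(epochs, loss, label='training loss')
plt.xlabel('epoch')
plt.legend()
plt.show()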
Finally, let’s check accuracy on the test dataset. We can use a similar generator class object to flow images from the testing directory through the model:

testing_directory = '/content/cell-images-for-detecting-malaria/cell_images/testing/'
test_datagen = ImageDataGenerator(rescale=1./255)
test_generator = test_datagen.flow_from_directory(...)

- testing_directory: this is the source directory for testing images;
- batch_size: the number of images in one batch — we’ll flow images one at a time here;
- shuffle: we can choose to shuffle images in the directory; we keep it as False (the default is True), so that predictions stay aligned with the filenames;
- test_generator: flows the images from the test directory through the model for prediction.

Because we used an image data generator, we correspondingly use the evaluate_generator function to get loss and accuracy on our test samples:

loss, acc = model.evaluate_generator(test_generator, steps=nb_samples)

Printing our loss and accuracy results, we can see that our accuracy on the test dataset is around 95 percent, which is pretty good.
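If you also want per-image predictions rather than just aggregate metrics, the same generator can feed model.predict. A sketch, assuming the modern model.predict / model.evaluate API in place of the deprecated *_generator variants:

# rewind the generator so outputs line up with test_generator.filenames
test_generator.reset()
nb_samples = len(test_generator.filenames)

# per-image probabilities (batch_size=1, shuffle=False)
probs = model.predict(test_generator, steps=nb_samples)
preds = (probs > 0.5).astype(int).ravel()   # class indices follow alphabetical folder order

test_generator.reset()
loss, acc = model.evaluate(test_generator, steps=nb_samples)
print('test loss: %.4f, test accuracy: %.4f' % (loss, acc))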
In conclusion, I trained both of these models on Google Colab — its free GPU runtime gives you decent RAM as well as GPU memory, and you shouldn’t even think of training models like these without a GPU. The complete malaria notebook is at https://github.com/Aqsa-K/Malaria-Detection/blob/master/Malaria_Detection.ipynb. Kaggle, founded by Anthony Goldbloom in 2010 as a platform for analytics competitions and predictive modelling, has grown into an amazing community where aspiring data scientists and machine learning practitioners come together to solve data science-related problems in a competition setting; although it is not yet as popular as GitHub, it is an up-and-coming social educational platform — and one of the best places to fetch a dataset. For now, we’ll end it here. I welcome any comments and suggestions — hope you enjoyed this article!
