COS402 Program P6 results for ocr49, sorted by error rate


Author | Description | Date submitted | % Error rate
tenrburrito | AdaBoost algorithm with Decision Trees as the weak learning algorithm | Tue Jan 14 16:38:21 2014 | 1.4
skarp | AdaBoost with decision trees as the weak learner (chosen to minimize the entropy, where each tree is restricted to a maximum depth of 5) and 200 rounds of boosting. | Tue Jan 14 12:38:06 2014 | 1.6
Tiny Wings | AdaBoost with decision tree algorithm as weak learner (maximum depth of decision trees = 5, chi-square pruning significance level = 0.01, # of AdaBoost rounds = 200) | Mon Jan 6 05:07:34 2014 | 1.7
ebp and Wafflepocalypse | Adaboost on random forests of 30 trees, sampling 0.65 of the weighted training data with replacement for each hypothesis, 150 rounds of boosting. | Sat Jan 11 02:46:53 2014 | 1.7
dericc, sigatapu | 200-iteration AdaBoost with Decision Trees | Mon Jan 13 16:45:48 2014 | 1.7
DH | AdaBoost with decision trees (max depth = 5), 200 iterations | Sun Jan 5 20:26:45 2014 | 1.8
SkyNet | 1000-iteration AdaBoost with Decision Stump | Thu Jan 9 21:22:55 2014 | 1.8
Kiwis | AdaBoost with decision trees. Number of iterations: 150. Max depth for tree: 4. | Sun Jan 12 15:46:24 2014 | 1.8
skarp | AdaBoost with decision trees as the weak learner (chosen to minimize the entropy, where each tree is restricted to a maximum depth of 4) and 200 rounds of boosting. | Tue Jan 14 12:29:29 2014 | 1.8
George and Katie | Random Forests implemented using vanilla Decision Trees and customizable depth, tree size, and bootstrap size. | Thu Jan 9 20:27:29 2014 | 1.9
Tauriel | RandomForest w/ DecisionTrees | Sat Jan 4 16:30:35 2014 | 2.2
Aaron Doll | This is an implementation of the random forest algorithm | Fri Jan 10 14:50:20 2014 | 2.2
Tauriel | RandomForest w/ DecisionTrees | Sun Jan 12 15:40:12 2014 | 2.2
Tauriel | RandomForest w/ DecisionTrees pruned at significance level 0.95 | Sun Jan 12 15:51:47 2014 | 2.2
Aaron Doll | This is an implementation of random forests with m = 1, 400 trees | Fri Jan 10 22:18:09 2014 | 2.3
S1 | Random forests with decision trees (500 trees). | Sat Jan 11 20:12:48 2014 | 2.3
sm | AdaBoost, using pruned Decision Trees as the weak learner. | Mon Jan 13 15:35:01 2014 | 2.3
Tauriel | RandomForest w/ DecisionTrees | Sun Jan 12 15:23:34 2014 | 2.4
Fanny | A voted perceptron algorithm (epoch = 10) | Sun Jan 12 22:58:47 2014 | 2.4
sm | AdaBoost, using pruned Decision Trees as the weak learner. | Thu Jan 9 19:13:02 2014 | 2.5
Fanny | A voted perceptron algorithm (epoch = 30) | Wed Jan 8 13:14:08 2014 | 2.6
Kiwis | AdaBoost with decision trees. Number of iterations: 100. Max depth for tree: 4. | Sun Jan 12 16:18:27 2014 | 2.6
S1 | An implementation of AdaBoost, with pruned decision trees as the weak learner and 500 rounds of boosting. | Mon Jan 13 16:46:26 2014 | 2.6
Glenn | Backpropagation performed on a neural network with 1 hidden layer for 50 iterations. The learning rate was set to 1.0 and the layers (from input to output) contain [197 51 1] units, including a bias unit for each non-output layer. | Tue Jan 14 12:55:42 2014 | 2.7
anon5 | An implementation of the AdaBoost algorithm using decision trees with pruning as the learner, with 200 rounds of boosting | Fri Jan 3 20:00:03 2014 | 2.8
anon5 | An implementation of the AdaBoost algorithm using vanilla decision trees as the learner, with 200 rounds of boosting | Fri Jan 3 20:15:31 2014 | 2.8
CookieMonster | This is a nearest-neighbor classifier which takes a majority vote from the k nearest points in feature space using Euclidean distance. | Thu Jan 9 22:57:24 2014 | 2.8
Andrew Werner | AdaBoost using vanilla decision trees as the weak learner | Sat Jan 4 15:01:34 2014 | 2.9
Janie Gu | AdaBoost algorithm with decision trees as the weak learner (with a random subset of training examples selected each round by resampling). | Mon Jan 6 15:26:52 2014 | 2.9
Glenn | Backpropagation performed on a neural network with 1 hidden layer for 100 iterations. The learning rate was set to 0.1 and the layers (from input to output) contain [197 51 1] units, including a bias unit for each non-output layer. | Tue Jan 14 13:06:55 2014 | 2.9
AFC | An implementation of K nearest neighbors with empirically optimized K values. | Thu Jan 2 20:59:59 2014 | 3.0
Wafflepocalypse | A random forest classifier with 1001 trees. | Thu Jan 9 04:53:45 2014 | 3.0
haoyu | AdaBoost with Single Layer Neural Network | Thu Jan 2 17:46:27 2014 | 3.2
Sunnanna | Nearest neighbors algorithm with k = 7 | Mon Jan 13 14:49:40 2014 | 3.2
bchou | Nearest 7-neighbors | Fri Jan 3 20:02:09 2014 | 3.3
CookieMonster | This is a bagging algorithm which uses a nearest neighbor algorithm as its weak classifier. | Thu Jan 9 23:13:29 2014 | 3.3
R.A.B. | k-NN with K = 21 and votes weighted by the inverse of distance | Tue Jan 14 00:03:23 2014 | 3.4
finn&jake | Knn, K = 5, Euclidean distance for numeric and standardized distance for discrete variables; majority vote for nearest neighbors. | Tue Jan 14 13:22:50 2014 | 3.4
Andrew Werner | AdaBoost using vanilla decision trees as the weak learner | Thu Jan 2 22:55:08 2014 | 3.5
Jordan | Adaboost to create a new feature space, then KNN | Sun Jan 5 21:17:16 2014 | 3.5
Rocky | Nearest neighbors with weighted vote (weight is inversely proportional to distance), Manhattan distance for normalized attributes; linear scan of all examples to find the K nearest neighbors (not good for very large training sets) | Thu Jan 9 10:11:26 2014 | 3.6
mdrjr | This is an implementation of k nearest neighbors. I've played around with both k and the distance function. | Thu Jan 9 20:04:54 2014 | 3.7
R.A.B. | K nearest neighbors with k = 20 | Thu Jan 9 22:26:05 2014 | 3.7
The Whitman Whale | Nearest neighbor classification with 17 neighbors and Manhattan distance | Thu Jan 9 22:48:28 2014 | 3.7
lilt | An implementation of 10-nearest neighbors | Thu Jan 9 00:23:16 2014 | 3.8
CookieMonster | This is a bagging algorithm which uses a nearest neighbor algorithm as its weak classifier. | Mon Jan 6 20:47:22 2014 | 3.9
David Hammer | Bagging (using binary decision trees) | Sun Jan 5 12:29:43 2014 | 4.3
Andra Constantinescu and Bar Shabtai | Random Forest with number of trees, N and M optimized for each dataset! | Tue Jan 14 07:12:38 2014 | 4.3
NY | Random forest with 500 iterations | Wed Jan 8 16:53:41 2014 | 4.4
finn&jake | Knn, K = 40, Euclidean distance for numeric and standardized distance for discrete variables; majority vote for nearest neighbors. | Thu Jan 9 14:13:25 2014 | 4.4
Ravi Tandon | This is an implementation of the bootstrap aggregation (bagging) algorithm. | Mon Dec 23 18:40:45 2013 | 4.5
hb | KNN with L2 distance, k empirically set after cross-validation | Tue Jan 14 14:50:47 2014 | 4.5
AFC | An (initial) implementation of K nearest neighbors with K = sqrt(number of training samples). | Thu Jan 2 16:52:21 2014 | 4.7
hb | KNN with L2 distance, k empirically set after cross-validation | Thu Jan 9 22:52:35 2014 | 4.7
ASapp | Nearest Neighbor Algorithm with k = 5. Normalizes using the mean and standard deviation of each attribute. | Sat Jan 11 01:33:38 2014 | 4.7
K.L. | Single layer neural net, 100 training rounds, learning rate = 0.01. | Tue Jan 7 15:36:37 2014 | 4.8
bfang | Single layer neural network, 80 epochs, alpha = 0.01 | Sun Jan 12 13:49:39 2014 | 4.8
Boar Ciphers | Implements a single-layer neural network with 100 epochs and a 0.001 learning rate | Tue Jan 14 10:53:40 2014 | 4.8
Rocky | Bagging algorithm with a single layer neural network as the weak learner | Tue Jan 14 15:01:01 2014 | 4.8
JS | Single-layer Neural Network, 100 epochs, Learning Rate = 0.001 | Tue Jan 14 15:33:47 2014 | 4.8
anon | AdaBoost (using shallow binary decision trees as the weak learner) | Sun Jan 5 12:23:54 2014 | 4.9
Gewang | Predicts the classification label based on the k nearest neighbors | Mon Jan 6 23:16:30 2014 | 4.9
asdf | A single perceptron (using a logistic threshold) with a learning rate of 0.001 and 100 epochs of training. | Thu Jan 9 23:08:08 2014 | 4.9
bfang | Single layer neural network, 80 epochs, alpha = 0.01 | Fri Jan 10 12:43:10 2014 | 4.9
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha and epochs optimized for each dataset. | Tue Jan 14 02:32:27 2014 | 4.9
hb | AdaBoost, KNN as weak learner, k chosen empirically | Tue Jan 14 16:02:40 2014 | 4.9
EC | Neural Net | Tue Jan 7 18:40:44 2014 | 5.0
Sunnanna | Single-layer feedforward neural net using a logistic function | Mon Jan 13 16:18:07 2014 | 5.0
Gewang | Predicts the classification label based on the k nearest neighbors | Fri Jan 3 14:07:55 2014 | 5.1
bclam | Single-layer Neural Network - 2000 epochs, Learning rate of 0.01 | Tue Jan 14 07:12:01 2014 | 5.1
haoyu | Single layer neural network | Mon Dec 30 15:37:54 2013 | 5.2
JS | Single-layer Neural Network, 200 epochs, Learning Rate = 0.01 | Thu Jan 9 15:36:09 2014 | 5.2
Dr. Roberto | AdaBoost with 50 rounds of a Single Layer Neural Net run for 100 epochs with a varying learning rate of around 0.01 | Thu Jan 9 16:37:27 2014 | 5.2
Katie and George | An implementation of the (voted) perceptron algorithm run for 25 epochs. | Thu Jan 9 20:05:50 2014 | 5.2
CC | AdaBoost with Neural Networks as the learner. Uses the percentage of the data with the highest weights to make the hypothesis on a given round | Tue Jan 14 11:13:03 2014 | 5.2
Katie and George | An implementation of the (voted) perceptron algorithm run for 100 epochs. | Mon Jan 6 21:04:33 2014 | 5.3
bclam | Single-layer Neural Network - 10000 epochs, Learning rate of 0.05 | Tue Jan 14 07:20:25 2014 | 5.3
Stephen McDonald | A K-nearest neighbours algorithm that predicts a test example by taking a majority vote of the k nearest neighbours, as measured by Manhattan distance (k is set to 1 for this trial). Additionally, this algorithm first converts the attribute types to numeric and normalizes each attribute to have zero mean and unit variance. | Wed Jan 8 18:29:05 2014 | 5.4
Valya | Implements a single layer neural net, much like the algorithm used in W6, with alpha = 0.01, running for 1000 epochs. | Mon Jan 13 22:06:10 2014 | 5.4
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 300 rounds of boosting. | Tue Jan 14 12:07:17 2014 | 5.4
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 200 rounds of boosting. | Sat Dec 28 16:13:58 2013 | 5.5
Gewang | Predicts the classification label based on the k nearest neighbors | Thu Jan 2 14:44:12 2014 | 5.5
CTTT | Adaboost + Decision Stumps (200 rounds). | Mon Jan 6 20:44:55 2014 | 5.5
Mickey Mouse | An implementation of AdaBoost with Decision Stump as the weak learner and 200 rounds of boosting | Tue Jan 7 15:44:27 2014 | 5.5
bclam | Single-layer Neural Network - 1000 epochs, Learning rate of 0.05 | Wed Jan 8 22:16:33 2014 | 5.5
lolz | Adaboost with decision stumps as the weak learner (k = 200) | Thu Jan 9 00:39:23 2014 | 5.5
SAJE | ADABOOST | Thu Jan 9 15:32:14 2014 | 5.5
Chuck and Larry | AdaBooooooost!!! using binary decision stumps with 200 rounds of boosting | Thu Jan 9 18:04:16 2014 | 5.5
Charliezsc | Adaboost with Decision Stumps | Thu Jan 9 18:21:10 2014 | 5.5
Wumi | AdaBoost with Decision Stump and 200 boosts | Thu Jan 9 20:52:31 2014 | 5.5
Mr. Blobby | AdaBoost (200 rounds) with decision stumps | Thu Jan 9 23:32:26 2014 | 5.5
jabreezy | Adaboost with decision stumps (boosted 200 rounds) | Thu Jan 9 23:49:49 2014 | 5.5
hi | An implementation of AdaBoost, woo hoo | Thu Jan 9 23:57:10 2014 | 5.5
Marius | AdaBoost using decision stumps as the weak-learning algorithm. It is run for 200 iterations. | Sun Jan 12 14:09:03 2014 | 5.5
Hello! | AdaBoost with Decision Stumps | Sun Jan 12 23:21:29 2014 | 5.5
George and Katie | Random Forests implemented using vanilla Decision Trees and customizable depth, tree size, and bootstrap size. | Wed Jan 8 23:01:13 2014 | 5.6
Dr. Roberto | Single layer Neural Net run for 100 epochs with a learning rate of 0.01 | Thu Jan 9 13:39:04 2014 | 5.6
me | Adaboost using decision stumps and 400 rounds of boosting. | Thu Jan 9 23:18:51 2014 | 5.6
Macrame | Adaboost, decision stumps, 250 rounds | Tue Jan 14 01:55:44 2014 | 5.6
Andra Constantinescu and Bar Shabtai | AdaBoost on a single layer neural network. The neural classifier takes binary input and loops through all training examples to update the weights of each attribute. Number of boosting rounds optimized for the dataset (here 2) | Tue Jan 14 16:23:34 2014 | 5.6
dlackey | This is an implementation of AdaBoost that uses 175 rounds of boosting. The weak learning algorithm used is a decision stump that directly minimizes the weighted training error. | Wed Jan 8 14:10:54 2014 | 5.7
S1 | An implementation of AdaBoost, with decision stumps as the weak learner and 500 rounds of boosting. | Thu Jan 9 15:22:39 2014 | 5.7
SAJE | ADABOOST | Thu Jan 9 15:35:47 2014 | 5.7
Chuck and Larry | Perceptron neural network | Thu Jan 9 18:00:04 2014 | 5.7
Rocky | AdaBoost algorithm with decision stumps as the weak learner | Thu Jan 9 20:49:05 2014 | 5.7
Fanny | An ensemble learning algorithm that consists of AdaBoost using decision stumps as the weak learner. | Sun Jan 12 04:25:12 2014 | 5.7
Supahaka | AdaBoost with Decision Stumps and 500 rounds of boosting. | Tue Jan 14 01:05:08 2014 | 5.7
bchou | AdaBoost on Binary Decision Stumps, 150 rounds of boosting | Fri Jan 3 07:09:15 2014 | 5.8
Mickey Mouse | An implementation of the Voted Perceptron algorithm with 200 epochs | Wed Jan 8 15:52:53 2014 | 5.8
Caligula | An implementation of AdaBoost with decision stumps and 800 rounds of boosting. | Wed Jan 8 19:46:23 2014 | 5.8
Anna Ren (aren) and Sunny Xu (ziyangxu) / Sunnanna | Adaboost using 150 rounds of boosting and decision stumps as a weak learner | Thu Jan 9 22:39:29 2014 | 5.8
Glenn | Backpropagation performed on a neural network with 1 hidden layer for 3000 iterations. The learning rate was set to 0.1 and the layers (from input to output) contain [197 4 1] units, including a bias unit for each non-output layer. | Thu Jan 9 23:20:33 2014 | 5.8
Mr. Blobby | AdaBoost (150 rounds) with decision stumps | Fri Jan 10 20:22:29 2014 | 5.8
Green Gmoney Choi | This is an implementation of the AdaBoost algorithm with decision stumps. | Sun Jan 12 15:10:36 2014 | 5.8
Rocky | AdaBoost algorithm with decision stumps as the weak learner, T = 150 | Mon Jan 13 21:17:55 2014 | 5.8
R.A.B. | Adaboost on decision stumps, 150 rounds | Tue Jan 14 03:15:28 2014 | 5.8
CC | AdaBoost with Decision Stump for 150 rounds. | Tue Jan 14 10:51:51 2014 | 5.8
Jordan | The Adaboost Algorithm with 2000 Decision Stumps | Sun Jan 5 15:21:07 2014 | 5.9
weezy | Implements AdaBoost using decision stumps as a weak learner, running for 1000 rounds of boosting. | Wed Jan 8 23:17:50 2014 | 5.9
SkyNet | 1000-iteration AdaBoost with Decision Stump | Thu Jan 9 00:20:25 2014 | 5.9
CC | AdaBoost with Decision Stump for 1000 rounds. | Thu Jan 9 14:40:01 2014 | 5.9
Dr. Roberto | AdaBoost with 100 rounds of a Single Layer Neural Net run for 10 epochs with a varying learning rate of around 0.01 | Thu Jan 9 14:51:08 2014 | 5.9
ytterbium | AdaBoost with decision stumps (1000 rounds) | Thu Jan 9 20:00:48 2014 | 5.9
Cam Porter | A version of the AdaBoost learning algorithm that uses decision stumps as a weak learning base. | Thu Jan 9 22:47:38 2014 | 5.9
dmmckenn_pthorpe | Implements Adaboost with 1,000 rounds of boosting and decision stumps as the weak learner. | Thu Jan 9 23:32:58 2014 | 5.9
Mr. Blobby | AdaBoost (1000 rounds) with decision stumps | Fri Jan 10 20:11:35 2014 | 5.9
CC | AdaBoost with Neural Networks as the learner. Uses the percentage of the data with the highest weights to make the hypothesis on a given round | Mon Jan 13 16:07:10 2014 | 5.9
Fanny | An ensemble learning algorithm that consists of AdaBoost using decision stumps as the weak learner. | Mon Jan 13 21:50:32 2014 | 5.9
R.A.B. | Adaboost on decision stumps, 1000 rounds | Tue Jan 14 02:04:14 2014 | 5.9
CC | AdaBoost with Neural Networks as the learner. Uses the percentage of the data with the highest weights to make the hypothesis on a given round | Tue Jan 14 11:41:07 2014 | 5.9
Dr. Steve Brule (For Your Health) | Neural Network trained for 100 epochs. | Tue Jan 14 15:37:19 2014 | 5.9
bfang | Boosting with decision stumps and early stopping | Sun Dec 29 23:30:13 2013 | 6.0
anon5 | An implementation of the AdaBoost algorithm using decision stumps as the learner, with 200 rounds of boosting | Thu Jan 2 15:28:13 2014 | 6.0
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 250 | Thu Jan 9 23:27:29 2014 | 6.0
0108 | Adaboost with decision stump as weak learner | Sun Jan 12 14:22:07 2014 | 6.0
Jake Barnes | Single layer artificial neural network with 125 rounds of training. Learning rate is 0.1 | Wed Jan 8 16:47:50 2014 | 6.1
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 150 | Sat Jan 11 13:56:33 2014 | 6.1
Wu-Tang Dynasty | AdaBoost using random sampling and Nearest Neighbors | Mon Jan 13 22:30:59 2014 | 6.1
qshen | An implementation of AdaBoost whose weak learner chooses the decision stump that minimizes the weighted training error; iterated 500 times. | Mon Dec 30 13:58:28 2013 | 6.2
bfang | Bagging with decision trees | Wed Jan 1 11:30:09 2014 | 6.3
ebp | Adaboost with decision stumps minimizing smoothed weighted training error, 100 rounds of boosting. | Mon Jan 6 21:02:21 2014 | 6.3
NY | Bagged Decision Trees with 500 trees | Wed Jan 8 16:58:46 2014 | 6.3
Dr. Steve Brule (For Your Health) | Neural Network. | Thu Jan 9 17:28:18 2014 | 6.3
Kiwis | AdaBoost with decision stumps and 110 iterations. | Fri Jan 10 07:39:58 2014 | 6.3
Gewang | Predicts the classification label based on the k nearest neighbors | Thu Jan 2 13:38:33 2014 | 6.4
bfang | Boosting with decision stumps (100 rounds) | Fri Jan 3 21:10:37 2014 | 6.4
CC | AdaBoost with Decision Stump for 5000 rounds. | Thu Jan 9 14:49:58 2014 | 6.4
Dr. Steve Brule (For Your Health) | Neural Network. | Sun Jan 12 19:58:48 2014 | 6.4
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Mon Jan 13 21:22:25 2014 | 6.4
Fanny | An ensemble learning algorithm that consists of AdaBoost using decision stumps as the weak learner. | Sat Jan 4 06:55:07 2014 | 6.5
Jameh | A learning algorithm using Adaboost along with decision stumps to determine a classifier to use on future test cases. Takes a BinaryDataSet and a number of rounds of boosting. | Tue Jan 7 22:52:25 2014 | 6.5
BPM | I implemented AdaBoost with binary decision stumps and 100 rounds of boosting. | Wed Jan 8 22:42:32 2014 | 6.5
SAJE | ADABOOST | Thu Jan 9 15:27:21 2014 | 6.5
SAJE | ADABOOST | Thu Jan 9 15:43:16 2014 | 6.5
SAJE | ADABOOST 2 | Thu Jan 9 15:48:58 2014 | 6.5
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 100 rounds of boosting. | Mon Jan 13 20:42:40 2014 | 6.5
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 100 rounds of boosting. | Tue Jan 14 12:12:07 2014 | 6.5
hb | AdaBoost, basic decision stumps | Tue Jan 14 15:14:42 2014 | 6.5
hb | AdaBoost, KNN as weak learner, k chosen empirically | Tue Jan 14 16:33:16 2014 | 6.5
George and Katie | A simple implementation of decision trees as per R&N. | Wed Jan 8 21:45:23 2014 | 6.6
Mike Honcho | Adaboost implementation | Thu Jan 9 21:47:00 2014 | 6.6
Mr. Blobby | AdaBoost (200 rounds) with decision trees (depth limit of 5) | Sun Jan 12 06:13:26 2014 | 6.6
Tauriel | AdaBoost w/ DecisionTree | Sun Jan 12 14:11:25 2014 | 6.7
Epic Harbors | Adaboost with decision stumps as the weak learner and 250 rounds of boosting | Thu Jan 9 15:00:12 2014 | 6.8
B&Y | We use the voted-perceptron algorithm. It runs repeatedly over the training set until it finds a prediction vector which is correct on all examples. We keep track of the survival times for each new prediction vector; these weights help us make a final binary prediction using a weighted majority vote. | Thu Jan 9 23:14:49 2014 | 6.8
Aaron Doll | Decision tree with reduced error pruning. | Thu Dec 26 16:45:49 2013 | 6.9
Tiny Wings | Decision tree with chi-square pruning (pruning significance level = 0.01) | Mon Jan 6 05:05:36 2014 | 6.9
CC | AdaBoost with Decision Stump for 50 rounds. | Mon Jan 13 21:31:50 2014 | 7.0
Supahaka | AdaBoost with Decision Stumps and 100000 rounds of boosting. | Mon Jan 13 00:38:32 2014 | 7.1
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha = 0.1. Very nice, I like! | Thu Jan 9 21:00:47 2014 | 7.2
LK | Bagging with AdaBoost that uses decision stumps | Tue Jan 14 12:19:21 2014 | 7.2
LK | Bagging with AdaBoost that uses decision stumps | Mon Jan 6 08:37:39 2014 | 7.3
Hello! | AdaBoost with Naive Bayes | Tue Jan 7 15:09:21 2014 | 7.3
Jgs | An implementation of AdaBoost with vanilla Decision Trees. Rounds of boosting = 250 | Sun Jan 12 20:27:00 2014 | 7.3
Hello! | AdaBoost with Naive Bayes (200) | Tue Jan 14 14:19:36 2014 | 7.3
vluu | An attempt at AdaBoost with Naive Bayes | Thu Dec 26 21:36:21 2013 | 7.5
K.L. | AdaBoost run with 1000 iterations. | Tue Jan 7 15:33:41 2014 | 7.5
LK | AdaBoost using decision stumps | Fri Jan 3 04:41:17 2014 | 7.6
0108 | Adaboost with decision stump as weak learner | Sun Jan 12 13:35:33 2014 | 7.6
Kiwis | AdaBoost with decision stumps and 150 iterations. | Thu Jan 9 07:10:52 2014 | 7.7
Mike Honcho 500 | Adaboost implementation | Tue Jan 14 12:28:11 2014 | 7.8
sabard | A decision tree learning algorithm with chi-squared pruning (5%). | Tue Jan 14 15:32:46 2014 | 7.8
sabard | A decision tree learning algorithm with chi-squared pruning. | Tue Jan 14 05:27:30 2014 | 7.9
Jgs | An implementation of AdaBoost with vanilla Decision Trees. Rounds of boosting = 150 | Sun Jan 12 22:08:45 2014 | 8.0
Gewang | Predicts the classification label based on the k nearest neighbors | Thu Jan 2 12:51:48 2014 | 8.2
Khoa | An algorithm that classifies. | Tue Jan 14 14:16:06 2014 | 8.2
haoyu | Random Forest with Decision Tree | Fri Dec 27 00:48:21 2013 | 8.5
NY | Purifies the training set for the decision tree (an alternative to pruning) | Wed Jan 8 17:03:23 2014 | 8.5
PandaBear | Adaboost on decision stumps, 1000 rounds | Thu Jan 9 18:11:53 2014 | 8.5
Andra Constantinescu and Bar Shabtai | AdaBoost with a decision stump as the weak learner. Number of iterations of AdaBoost optimized per example. | Tue Jan 14 02:55:45 2014 | 8.5
corgi2.0 | AdaBoost 150 w/ basic decision stumps | Sat Jan 11 23:14:44 2014 | 8.8
Nihar the God | Uses Adaboost with decision stumps as weak learners and 150 rounds of boosting | Tue Jan 14 15:07:58 2014 | 8.8
Lil Thug | A simple decision tree algorithm with chi-squared pruning. | Wed Jan 8 15:12:38 2014 | 8.9
Shaheed Chagani | AdaBoost | Mon Jan 13 21:28:03 2014 | 9.0
L.M. | K-nearest | Thu Jan 9 03:40:10 2014 | 9.2
Nihar the God | Uses Adaboost with decision stumps as weak learners and 200 rounds of boosting | Tue Jan 14 15:12:20 2014 | 9.5
anon5 | An implementation of a decision-tree-learning algorithm with pruning | Fri Jan 3 23:11:54 2014 | 10.0
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 20 | Sat Jan 11 13:50:02 2014 | 10.1
anon5 | An implementation of vanilla decision-tree learning | Fri Jan 3 23:15:26 2014 | 10.3
Bob Dondero | Adaboost (200 rounds) with a decision tree (max depth 5, chi-squared pruning at 1%) as the weak learner. | Thu Jan 9 20:54:23 2014 | 10.3
NY | Decision Tree | Sun Jan 12 15:34:38 2014 | 10.3
Mike Honcho 100 | Adaboost implementation | Tue Jan 14 12:25:16 2014 | 10.3
Shaheed Chagani | AdaBoost | Mon Jan 13 20:41:05 2014 | 10.4
Ameera and David | Decision Tree Learning algorithm implementation. | Sun Jan 12 22:04:52 2014 | 10.5
CAPSLOCK | A mostly vanilla decision tree. Uses some cool data structures, though. | Thu Jan 9 22:57:33 2014 | 10.8
Shaheed Chagani | Naive Bayes Classifier | Wed Dec 18 07:43:08 2013 | 11.3
weezy | Implements a k-Nearest Neighbor algorithm with k = 15. | Thu Jan 9 20:02:27 2014 | 11.5
Aaron Doll | This is an implementation of the random forest algorithm | Thu Jan 9 18:17:46 2014 | 11.8
Jake Barnes | Multiple-layer artificial neural network (5 hidden nodes) with 125 rounds of training. Learning rate is 0.1 | Mon Jan 13 17:03:22 2014 | 12.1
Bob Dondero | A decision tree learning algorithm using information gain and chi-squared pruning. | Thu Jan 9 20:39:27 2014 | 12.7
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 100 rounds of boosting. | Sun Jan 12 11:32:16 2014 | 12.9
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 150 rounds of boosting. | Sun Jan 12 20:29:33 2014 | 12.9
Bob Dondero | Adaboost (200 rounds) with a decision tree (max depth 5, chi-squared pruning at 1%) as the weak learner | Fri Jan 10 23:04:11 2014 | 13.0
weezy | Implements a k-Nearest Neighbor algorithm with k = 27. | Sat Jan 11 00:38:42 2014 | 13.1
Mike Honcho | Adaboost implementation | Tue Jan 14 12:23:10 2014 | 13.1
CTTT | Decision Tree Algorithm with Chi-Squared Pre-Pruning | Mon Jan 6 20:48:45 2014 | 13.3
dmmckenn_pthorpe | Implements Naive Bayes using discretization as opposed to continuous values. | Thu Jan 9 21:21:27 2014 | 13.5
akdote | Naive Bayes Algorithm | Fri Jan 10 00:38:21 2014 | 13.5
bcfour | A Naive Bayes approach to classification. | Sat Jan 4 22:45:40 2014 | 13.6
Solving From Classifier | The Naive Bayes algorithm executes the maximum-likelihood parameter learning problem and uses the learned parameters (obtained from observed attribute values) to find the maximum-likelihood naive Bayes hypothesis. | Tue Jan 7 13:55:18 2014 | 13.6
Mickey Mouse | An implementation of Naive Bayes | Tue Jan 7 15:50:35 2014 | 13.6
bcfour, jkwok | Naive Bayes with standard Laplacian correction | Thu Jan 9 13:03:06 2014 | 13.6
Hello! | Naive Bayes | Thu Jan 9 23:05:21 2014 | 13.6
Solving From Classifier | The Naive Bayes algorithm using a binary representation as opposed to a discrete representation. | Sat Jan 11 21:42:29 2014 | 13.6
Solving From Classifier | The Naive Bayes algorithm using a binary representation at times and a discrete representation at other times. | Sat Jan 11 21:58:17 2014 | 13.6
Sunnanna | Naive Bayes algorithm using a maximum-likelihood estimator | Mon Jan 13 18:54:30 2014 | 13.6
akdote | Naive Bayes Algorithm | Thu Jan 9 21:08:52 2014 | 13.7
akdote | Naive Bayes Algorithm | Thu Jan 9 23:45:31 2014 | 14.3
akdote | Naive Bayes Algorithm | Thu Jan 9 23:35:46 2014 | 14.7
God | Implements the naive Bayes algorithm. | Tue Jan 14 00:19:12 2014 | 15.3
Igor Zabukovec | SVM | Thu Jan 9 15:15:12 2014 | 15.5
Igor Zabukovec | SVM | Tue Jan 14 11:10:59 2014 | 15.5
T.C. | Multi-layered Neural Net, 200 iterations, 0.1 learning rate | Thu Jan 9 03:05:39 2014 | 15.8
0108 | Adaboost with decision stump as weak learner | Wed Jan 8 18:34:30 2014 | 16.1
vvspr | An implementation of the Naive Bayes Algorithm | Tue Dec 31 13:37:34 2013 | 16.9
CTTT | A decision stump weak learner. | Mon Jan 6 22:58:43 2014 | 17.2
Mickey Mouse | An implementation of the Decision Stump Algorithm | Tue Jan 7 15:47:47 2014 | 17.2
Squirtle | An implementation of the vanilla decision stumps classifier | Thu Jan 9 14:27:10 2014 | 17.2
Charliezsc | Decision Stump without boosting | Thu Jan 9 19:19:26 2014 | 17.2
Charliezsc | Bagging with Decision Stumps (200 weak learners and half bootstrap samples) | Thu Jan 9 20:08:14 2014 | 17.2
Charliezsc | Bagging with Decision Stumps (200 weak learners and 2 percent bootstrap samples) | Thu Jan 9 20:49:51 2014 | 17.2
dmmckenn_pthorpe | Basic Stumps Implementation | Thu Jan 9 21:12:17 2014 | 17.2
0108 | Bagging with decision stump as weak learner | Sun Jan 12 14:19:36 2014 | 17.2
PandaBear | Adaboost on decision stumps, 500 rounds | Mon Jan 13 22:01:40 2014 | 17.2
PandaBear | Adaboost on decision stumps, 1000 rounds | Tue Jan 14 11:09:57 2014 | 17.2
God | Implements basic decision stumps and chooses the one which performs best | Tue Jan 14 14:35:40 2014 | 17.2
Learner | Implementation of Adaboost with decision stumps as the weak learner. | Tue Jan 14 15:17:02 2014 | 17.2
Learner | Implementation of Adaboost with decision stumps as the weak learner. 100 rounds of boosting. | Tue Jan 14 15:33:13 2014 | 17.2
sabard | Decision stump weak learning algorithm to be used with AdaBoost | Tue Jan 14 15:39:42 2014 | 17.2
Shaheed Chagani | Naive Bayes Classifier | Sun Jan 12 22:51:00 2014 | 19.2
Shaheed Chagani | Naive Bayes Classifier | Mon Jan 13 14:06:47 2014 | 19.8
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Mon Jan 13 07:30:13 2014 | 20.2
Linda Zhong | Basic decision tree algorithm implementation, no pruning. | Sun Jan 12 16:29:37 2014 | 20.5
Catherine Wu and Yan Wu | AdaBoost using random sampling and Decision Trees | Wed Jan 8 22:51:13 2014 | 20.7
K.L. | Decision Stumps | Tue Jan 7 15:29:40 2014 | 23.5
Mike Honcho 10 | Adaboost implementation | Tue Jan 14 12:26:40 2014 | 25.3
T.C. | Multi-layered Neural Net, 100 iterations, 0.1 learning rate | Wed Jan 8 04:38:21 2014 | 25.8
tenrburrito | AdaBoost algorithm with Decision Trees as the weak learning algorithm | Thu Jan 9 15:24:10 2014 | 28.9
Joshua A. Zimmer | A working (hopefully) attempt at a learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Mon Jan 20 16:17:10 2014 | 33.1
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Sun Jan 12 22:48:11 2014 | 34.0
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Sat Jan 11 18:34:43 2014 | 35.3
corgi3.0 | Basic decision tree | Tue Jan 14 02:39:38 2014 | 40.4
corgi4.0 | Decision tree with chi-squared pruning | Tue Jan 14 04:16:18 2014 | 40.4
corgi4.0 | Decision tree with chi-squared pruning | Tue Jan 14 12:52:06 2014 | 40.4
corgi5.0 | Decision tree with chi-squared pruning | Tue Jan 14 14:26:38 2014 | 40.4
corgi3.0 | Decision tree, discrete attribute splitting | Tue Jan 14 12:55:01 2014 | 40.5
Joshua A. Zimmer | A learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Tue Jan 14 16:00:25 2014 | 43.0
Valya | Implements neural nets, much like the algorithm used in W6, with alpha = 0.1 | Thu Jan 9 14:20:29 2014 | 48.8
Linda Zhong | Basic decision tree algorithm implementation, no pruning. | Fri Jan 10 00:47:46 2014 | 49.0
Valya | Implements a single layer neural net, much like the algorithm used in W6, with alpha = 0.1 | Sun Jan 12 20:44:34 2014 | 49.0
null | null | Thu Jan 2 21:29:50 2014 | 49.7
sabard | A decision tree learning algorithm. | Thu Jan 9 23:04:47 2014 | 49.7
ASapp | Nearest Neighbor Algorithm with k = 5. Normalizes using the mean and standard deviation of each attribute. | Thu Jan 9 23:48:40 2014 | 49.7
Khoa | An algorithm based on decision stumps | Thu Jan 9 23:57:24 2014 | 49.7
Guessing | Minimally outputs a result by applying a random function. | Sat Jan 11 15:26:17 2014 | 49.7
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha and epochs optimized for each dataset. | Tue Jan 14 02:29:11 2014 | 49.7
Estranged Egomaniac | AdaBoost with decision stumps. 250 rounds of boosting. | Sun Jan 12 13:54:21 2014 | 50.2
lilt | A decision stump implementation | Mon Jan 13 15:03:39 2014 | 50.3
Joshua A. Zimmer | A learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Fri Jan 10 04:11:05 2014 | 56.4
corgi | Basic decision stump | Thu Jan 9 23:35:40 2014 | 82.8
Solving From Classifier | The Naive Bayes algorithm executes the maximum-likelihood parameter learning problem and uses the learned parameters (obtained from observed attribute values) to find the maximum-likelihood naive Bayes hypothesis. | Tue Jan 7 13:49:21 2014 | 86.4
Marius | AdaBoost using decision stumps as the weak-learning algorithm. It is run for 200 iterations. | Thu Jan 9 20:13:44 2014 | 94.5

Table generated: Mon Jan 20 16:17:18 2014
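
The low-error entries above cluster heavily around AdaBoost with decision stumps or depth-limited decision trees as the weak learner. For reference, a minimal sketch of the single most common setup in the table, AdaBoost with decision stumps over binary features, might look like the following. This is an illustrative sketch only, not any submitter's code: the function names and the assumed data layout (0/1 feature vectors, labels in {-1, +1}) are assumptions made for this example.

```python
import math

def best_stump(X, y, w):
    """Pick the (feature, polarity) stump minimizing weighted training error."""
    n_features = len(X[0])
    best = (0, 1, float("inf"))  # (feature index, polarity, weighted error)
    for j in range(n_features):
        for polarity in (1, -1):
            # This stump predicts `polarity` when feature j is 1, else -polarity.
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (polarity if xi[j] else -polarity) != yi)
            if err < best[2]:
                best = (j, polarity, err)
    return best

def adaboost(X, y, rounds=200):
    """Return a list of (stump, alpha) pairs after `rounds` of boosting."""
    n = len(X)
    w = [1.0 / n] * n                    # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        j, polarity, err = best_stump(X, y, w)
        err = max(err, 1e-10)            # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append(((j, polarity), alpha))
        # Reweight: misclassified examples gain weight, correct ones lose it.
        for i in range(n):
            pred = polarity if X[i][j] else -polarity
            w[i] *= math.exp(-alpha * y[i] * pred)
        total = sum(w)
        w = [wi / total for wi in w]     # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    """Classify x by the weighted majority vote of the stumps."""
    s = sum(alpha * (polarity if x[j] else -polarity)
            for (j, polarity), alpha in ensemble)
    return 1 if s >= 0 else -1
```

The strongest entries in the table follow the same loop but swap the stump for a richer weak learner, typically a decision tree grown to a maximum depth of 4 or 5 against the weighted examples; only `best_stump` changes.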